
[BUG] Cannot generate structured LLM inference output with exllamav2 version >0.20 #709

Closed
3 tasks done
debasish-mihup opened this issue Jan 2, 2025 · 2 comments
Labels
bug Something isn't working

Comments

@debasish-mihup

OS

Windows

GPU Library

CUDA 12.x

Python version

3.11

Pytorch version

2.5.1

Model

llama3.1

Describe the bug

I get the following error:

AttributeError: 'ExLlamaV2TokenEnforcerFilter' object has no attribute 'background_drop'

raised by the call below. (The filter comes from lm-format-enforcer's exllamav2 integration; the error only shows up on exllamav2 versions above 0.20, so presumably the filter no longer matches the generator's current filter interface.)

outputs = self.generator.generate(
    prompt=prompts,
    filters=filters,
    filter_prefer_eos=True,
    max_new_tokens=1024,
    add_bos=add_bos,
    stop_conditions=get_phi4_stop_conditions(self.tokenizer),  # get_llama3_stop_conditions(self.tokenizer)
    completion_only=True,
    encode_special_tokens=encode_special_tokens,
)
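For context, filters is built with lm-format-enforcer roughly as follows (a sketch; the schema here is a placeholder, and the integration's helper names should be checked against the installed lm-format-enforcer version):

from lmformatenforcer import JsonSchemaParser
from lmformatenforcer.integrations.exllamav2 import (
    ExLlamaV2TokenEnforcerFilter,
    build_token_enforcer_tokenizer_data,
)

# Placeholder schema; the real one comes from our application.
schema = {"type": "object", "properties": {"answer": {"type": "string"}}}

tokenizer_data = build_token_enforcer_tokenizer_data(self.tokenizer)
filters = [ExLlamaV2TokenEnforcerFilter(JsonSchemaParser(schema), tokenizer_data)]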

As a bonus question:

I am trying to run inference with the phi4 model, but I could not find any example for it. Could you check the three functions below for correctness? (I have also sketched an alternative template after them, based on what the phi-4 model card seems to document.)

def format_phi4_prompt(system_prompt, user_prompt):
    return (
        f"<|system|>{system_prompt}<|endoftext|>\n"
        f"<|user|>{user_prompt}<|endoftext|>\n"
        f"<|assistant|>"
    )


def phi4_encoding_options():
    # (add_bos, add_eos, encode_special_tokens), as passed to generator.generate()
    return False, False, True


def get_phi4_stop_conditions(tokenizer):
    return [tokenizer.single_id("<|endoftext|>")]
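For comparison, the phi-4 model card appears to document a chat template built on <|im_start|>/<|im_sep|>/<|im_end|> markers rather than the phi-3-style <|system|>/<|user|> tags above. Should the helpers instead look something like this sketch? (The token names are my reading of the model card and would need verifying against the checkpoint's tokenizer_config.json.)

def format_phi4_prompt_im(system_prompt, user_prompt):
    # Marker tokens taken from the phi-4 model card; verify against
    # the tokenizer config of the exact checkpoint in use.
    return (
        f"<|im_start|>system<|im_sep|>{system_prompt}<|im_end|>"
        f"<|im_start|>user<|im_sep|>{user_prompt}<|im_end|>"
        f"<|im_start|>assistant<|im_sep|>"
    )


def get_phi4_stop_conditions_im(tokenizer):
    # Stop on the end-of-turn marker rather than <|endoftext|>.
    return [tokenizer.single_id("<|im_end|>")]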

Reproduction steps

N/A

Expected behavior

N/A

Logs

N/A

Additional context

N/A

Acknowledgements

  • I have looked for similar issues before submitting this one.
  • I understand that the developers have lives and my issue will be answered when possible.
  • I understand the developers of this program are human, and I will ask my questions politely.
@debasish-mihup added the bug (Something isn't working) label on Jan 2, 2025
@turboderp (Member)
See #696.

@debasish-mihup (Author)

> See #696.

@turboderp Can you look into the format_phi4_prompt function?
