
README references models that seem to no longer work #64

Open
gkapfham opened this issue Nov 22, 2024 · 0 comments
The README.md file currently references models like the following:

```text
openrouter/meta-llama/llama-3.1-8b-instruct:free
openrouter/google/gemma-2-9b-it:free
```
However, when I recently tried one of these models, I got this error message:

```text
Exception Type: BadRequestError
Error Message: Error code: 400 - {'error': {'message': "litellm.UnsupportedParamsError: openrouter does not support
parameters: {'stream': False}, for model=google/gemma-2-9b-it:free. To drop these, set `litellm.drop_params=True` or for
proxy:\n\n`litellm_settings:\n drop_params: true`\n\nReceived Model Group=openrouter/google/gemma-2-9b-it:free\nAvailable
Model Group Fallbacks=None", 'type': 'None', 'param': None, 'code': '400'}}

If your issue persists, ensure the model you entered is correct, such as:
- anthropic/claude-3-haiku-20240307
- anthropic/claude-3-opus-20240229
- groq/llama3-8b-8192
- openrouter/meta-llama/llama-3.1-8b-instruct:free

Please visit https://docs.litellm.ai/docs/providers for more valid LiteLLM models

For server connectivity issues, please visit https://docs.litellm.ai/docs/simple_proxy for a valid LiteLLM proxy.
```
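
For reference, the workaround that the error message suggests looks roughly like the following at the LiteLLM API level. This is only a minimal sketch, assuming a direct `litellm` call with an `OPENROUTER_API_KEY` in the environment; it is not necessarily how ExecExam invokes LiteLLM internally:

```python
import litellm

# Hedged sketch of the workaround suggested in the error: drop request
# parameters (here `stream`) that the provider does not accept.
litellm.drop_params = True

# Assumes OPENROUTER_API_KEY is set in the environment.
response = litellm.completion(
    model="openrouter/google/gemma-2-9b-it:free",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```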

Since the other models are working correctly, I think this means that suffixes like `:free` may no longer work for OpenRouter. Perhaps we should remove these from the README?

Also, I think the README should do a better job of explaining that the tool supports all of the models that LiteLLM supports when it points to a valid LiteLLM proxy.

Also, the README needs to make it clearer that the tool does not provide a default LiteLLM proxy. People running and using ExecExam need to set up their own LiteLLM proxy if they want to use this approach; otherwise, the person using this feature needs to specify their own API key.
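
To make the proxy path concrete, a request routed through a self-hosted LiteLLM proxy might look roughly like this. This is a minimal sketch: the proxy URL, model alias, and key below are placeholders for whatever the user configured, and it is not meant to reflect ExecExam's actual internals:

```python
import litellm

# Hedged sketch: sending a completion through a user-managed LiteLLM proxy
# that exposes an OpenAI-compatible endpoint. The "openai/" prefix tells
# LiteLLM to treat the target as an OpenAI-compatible server; the alias,
# URL, and key below are placeholders.
response = litellm.completion(
    model="openai/gemma-free",            # alias defined in the proxy's model_list
    api_base="http://localhost:4000",     # the user's own LiteLLM proxy
    api_key="sk-placeholder-proxy-key",   # key configured on that proxy
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```

If the person instead supplies a provider API key directly (as in the earlier sketch), no proxy is needed at all; the README should spell out both options.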

gkapfham added the bug and documentation labels on Nov 22, 2024