The README.md file currently references OpenRouter models with the :free suffix (for example, openrouter/google/gemma-2-9b-it:free). However, when I recently tried one of these models, I got this error message:
Exception Type: BadRequestError
Error Message: Error code: 400 - {'error': {'message': "litellm.UnsupportedParamsError: openrouter does not support parameters: {'stream': False}, for model=google/gemma-2-9b-it:free. To drop these, set `litellm.drop_params=True` or for proxy:\n\n`litellm_settings:\n drop_params: true`\n\nReceived Model Group=openrouter/google/gemma-2-9b-it:free\nAvailable Model Group Fallbacks=None", 'type': 'None', 'param': None, 'code': '400'}}
If your issue persists, ensure the model you entered is correct, such as:
- anthropic/claude-3-haiku-20240307
- anthropic/claude-3-opus-20240229
- groq/llama3-8b-8192
- openrouter/meta-llama/llama-3.1-8b-instruct:free
Please visit https://docs.litellm.ai/docs/providers for more valid LiteLLM models
For server connectivity issues, please visit https://docs.litellm.ai/docs/simple_proxy for a valid LiteLLM proxy.
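For reference, the error message above already points at a workaround: telling LiteLLM to drop request parameters that a provider rejects. Here is a minimal sketch of what that would look like when calling LiteLLM directly; this is not necessarily how ExecExam invokes LiteLLM, and it assumes an OPENROUTER_API_KEY environment variable is set:

```python
# Sketch of the workaround suggested in the error message: have LiteLLM
# silently drop parameters the provider rejects (here, `stream`) instead of
# raising UnsupportedParamsError.
import litellm

litellm.drop_params = True  # drop provider-unsupported params before sending the request

response = litellm.completion(
    model="openrouter/google/gemma-2-9b-it:free",  # the model from the error output above
    messages=[{"role": "user", "content": "Hello"}],
    stream=False,  # the parameter OpenRouter rejected in the error above
)
print(response.choices[0].message.content)
```

I am not sure dropping parameters is the right fix for ExecExam itself, so removing the :free examples may still be the simpler change.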
Since the other models are working correctly, I think this means that add-ons like :free may no longer work for OpenRouter. Perhaps we should remove these from the README?
Also, I think the README should do a better job of explaining that the tool supports every model that LiteLLM supports, as long as it points to a valid LiteLLM proxy.
Also, the README needs to make it clearer that the tool does not provide a default LiteLLM proxy. People running and using ExecExam need to set up their own LiteLLM proxy if they want to use this approach; otherwise, the person using this feature needs to supply their own API key (see the sketch below).
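To make that concrete, here is a minimal sketch of what "pointing the tool at a valid LiteLLM proxy" looks like from the client side. A LiteLLM proxy exposes an OpenAI-compatible API, so any OpenAI-style client can talk to it. The base URL, API key, and model name below are placeholders for whatever the user's own proxy is configured with; this is not ExecExam's actual configuration:

```python
# Sketch: talking to a self-hosted LiteLLM proxy through an OpenAI-compatible client.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",            # placeholder: your self-hosted LiteLLM proxy
    api_key="sk-whatever-your-proxy-accepts",    # placeholder: key configured on YOUR proxy
)

response = client.chat.completions.create(
    model="openrouter/meta-llama/llama-3.1-8b-instruct",  # any model your proxy routes to
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

If the README included something along these lines and stated explicitly that both the proxy and the API key are the user's responsibility, I think that would clear up the confusion.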