
Gemini Flash support #7

Open
nischalj10 opened this issue Sep 17, 2024 · 2 comments
Labels
enhancement, good first issue

Comments

@nischalj10
Member

Heard a lot of good things about the model. We should try to add support for Google as a provider.

@Taytay

Taytay commented Sep 17, 2024

What would you think about using one of the wrapper libs (litellm, etc.) to make it easier to swap LLMs out?
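
For reference, a minimal sketch of what the litellm route could look like for Gemini Flash, assuming litellm's unified completion interface; the model strings, prompt, and API-key handling are illustrative, not code from this repo:

```python
# Minimal sketch: swapping providers through litellm's unified completion API.
# Assumes the relevant API keys (OPENAI_API_KEY, GEMINI_API_KEY) are set in the
# environment; the model strings below are illustrative.
from litellm import completion

messages = [{"role": "user", "content": "Summarize this page in one sentence."}]

# Same call shape for different providers -- only the model string changes.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
gemini_resp = completion(model="gemini/gemini-1.5-flash", messages=messages)

# litellm normalizes responses to the OpenAI shape, so access is identical.
print(openai_resp.choices[0].message.content)
print(gemini_resp.choices[0].message.content)
```

The appeal of this approach is that only the model string changes between providers, since litellm normalizes responses to the OpenAI response shape.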

@nischalj10
Member Author

Most LLMs don't support structured outputs the way OpenAI's latest models do, which leads to poor reliability when taking actions.

Also, we use http://python.useinstructor.com under the hood, but for some reason it doesn't work well with models from providers like Groq when proxied via litellm.
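
To make the structured-outputs concern concrete, here is a rough sketch of the instructor pattern, where the model's response is parsed and validated against a Pydantic schema; the ClickAction model and its fields are hypothetical and not this project's actual action schema:

```python
# Rough sketch of the instructor pattern: the LLM response is validated against
# a Pydantic model, which is what makes "taking actions" reliable.
import instructor
from openai import OpenAI
from pydantic import BaseModel


class ClickAction(BaseModel):
    # Hypothetical action schema for illustration only.
    selector: str
    reason: str


# Patch the OpenAI client so chat.completions.create accepts response_model.
client = instructor.from_openai(OpenAI())

action = client.chat.completions.create(
    model="gpt-4o-mini",
    response_model=ClickAction,
    messages=[{"role": "user", "content": "Click the blue 'Sign in' button."}],
)

# `action` is a validated ClickAction instance, not raw text.
print(action.selector, action.reason)
```

For a provider routed through litellm to be a drop-in replacement, it would need to return output that instructor can parse into the same schema just as reliably.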

@nischalj10 added the enhancement and good first issue labels on Sep 18, 2024