🚀 feat: Integrate Amazon Bedrock Support #689
Conversation
Add support for Amazon Bedrock models, including:
- Implement AWS credentials retrieval for Bedrock
- Add Bedrock model initialization and handling
- Include Claude 3 models (Opus, Sonnet, Haiku) for Bedrock
- Adjust token limits for Bedrock models
- Update chat action to support model selection
- Add @ai-sdk/amazon-bedrock dependency

Key changes:
- app/lib/.server/llm/api-key.ts: Add getAWSCredentials function
- app/lib/.server/llm/constants.ts: Define MAX_TOKENS_BEDROCK
- app/lib/.server/llm/model.ts: Implement getBedrockModel function
- app/lib/.server/llm/stream-text.ts: Use Bedrock-specific token limit
- app/routes/api.chat.ts: Update to support model selection
- app/utils/constants.ts: Add Bedrock model options
- package.json: Add @ai-sdk/amazon-bedrock dependency
- pnpm-lock.yaml: Update with new dependencies
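As a rough illustration of the credentials helper listed above, here is a minimal sketch of what `getAWSCredentials` might look like; the `AWS_*` environment-variable names, the `env` parameter shape, and the `us-east-1` default are assumptions, not the PR's exact code:

```ts
// Hypothetical sketch of getAWSCredentials (app/lib/.server/llm/api-key.ts).
export interface AWSCredentials {
  region: string;
  accessKeyId: string;
  secretAccessKey: string;
}

export function getAWSCredentials(env: Record<string, string | undefined>): AWSCredentials {
  return {
    // Assumed default region; the PR mentions refactoring the default-region
    // assignment with an inline comment.
    region: env.AWS_REGION ?? 'us-east-1',
    accessKeyId: env.AWS_ACCESS_KEY_ID ?? '',
    secretAccessKey: env.AWS_SECRET_ACCESS_KEY ?? '',
  };
}
```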
I implemented Bedrock based on #531 of https://github.com/coleam00/bolt.new-any-llm!
- Translate comments to English for consistency
- Add explanatory comment for AWS credentials function
- Refactor default region assignment with inline comment
# Bolt.new Fork by Cole Medin

This fork of bolt.new allows you to choose the LLM that you use for each prompt! Currently you can use OpenAI, Anthropic, Ollama, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See instructions below for running this locally and extending to include more models.
You added this to the README, and I don't know whether the creators will accept it.
```ts
    return getOpenAIModel(apiKey, model);
  case 'Groq':
    return getGroqModel(apiKey, model);
  case 'OpenRouter':
```
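The PR extends this `getModel` switch with a Bedrock branch. The quoted diff does not show the added lines, so the following is a hypothetical sketch of how such a case could slot in, reusing the `getBedrockModel` and `getAWSCredentials` helpers the PR introduces:

```ts
  // Hypothetical addition to the switch above; Bedrock authenticates with
  // AWS credentials rather than a single provider API key.
  case 'Bedrock':
    return getBedrockModel(getAWSCredentials(env), model);
```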
Perhaps you need to change the branch name to pass CI, something like feat/multi-llm, because of the semantic pull request check.
I changed the branch name, but the content is the same as before.
Overview
This PR adds support for Amazon Bedrock models to our LLM integration, enhancing our AI capabilities with Claude 3 models (Opus, Sonnet, Haiku).
Key Changes
- Add Amazon Bedrock as a new provider via the @ai-sdk/amazon-bedrock package
- Include Claude 3 models (Opus, Sonnet, Haiku) for Bedrock
- Add AWS credentials retrieval and a Bedrock-specific token limit
- Update the chat action and model constants to support model selection
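To illustrate the new model options, entries along these lines would be added in `app/utils/constants.ts` (the field names and labels are assumptions following the fork's model-list convention; the model IDs are the public Bedrock identifiers for the Claude 3 family):

```ts
// Hypothetical Bedrock entries for the model picker.
export const BEDROCK_MODELS = [
  { name: 'anthropic.claude-3-opus-20240229-v1:0', label: 'Claude 3 Opus (Bedrock)', provider: 'Bedrock' },
  { name: 'anthropic.claude-3-sonnet-20240229-v1:0', label: 'Claude 3 Sonnet (Bedrock)', provider: 'Bedrock' },
  { name: 'anthropic.claude-3-haiku-20240307-v1:0', label: 'Claude 3 Haiku (Bedrock)', provider: 'Bedrock' },
];
```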
Detailed Changes
- `app/lib/.server/llm/api-key.ts`: Add `getAWSCredentials` function to fetch AWS access keys and region
- `app/lib/.server/llm/constants.ts`: Define `MAX_TOKENS_BEDROCK` constant (4096) for Bedrock models
- `app/lib/.server/llm/model.ts`: Add `getBedrockModel` function for Bedrock model initialization; update `getModel` function to handle the Bedrock provider (see the sketch after this list)
- `app/lib/.server/llm/stream-text.ts`: Use the Bedrock-specific token limit (`MAX_TOKENS_BEDROCK`)
- `app/routes/api.chat.ts`: Update the chat action to support model selection
- `app/utils/constants.ts`: Add Bedrock model options
- `package.json`: Add `@ai-sdk/amazon-bedrock` dependency (version 0.0.30)
- `pnpm-lock.yaml`: Update with new dependencies
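To make the `model.ts` change concrete, here is a minimal sketch of what `getBedrockModel` might look like on top of `createAmazonBedrock` from `@ai-sdk/amazon-bedrock`; the function body and the `AWSCredentials` shape are assumptions based on that package's documented API, not the PR's exact code:

```ts
import { createAmazonBedrock } from '@ai-sdk/amazon-bedrock';

// Assumed credentials shape returned by getAWSCredentials.
interface AWSCredentials {
  region: string;
  accessKeyId: string;
  secretAccessKey: string;
}

// Hypothetical sketch: build a Bedrock provider from the retrieved
// credentials and return the requested model instance.
export function getBedrockModel(credentials: AWSCredentials, model: string) {
  const bedrock = createAmazonBedrock({
    region: credentials.region,
    accessKeyId: credentials.accessKeyId,
    secretAccessKey: credentials.secretAccessKey,
  });

  return bedrock(model); // e.g. 'anthropic.claude-3-sonnet-20240229-v1:0'
}
```

In `stream-text.ts`, the returned model would then be passed to the Vercel AI SDK's `streamText` call with `maxTokens` capped at `MAX_TOKENS_BEDROCK` (4096) when the Bedrock provider is selected.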