feat: convert from openai to ollama model #24
Conversation
Walkthrough
The update transitions the AI model in a Spring application from "gpt-3.5-turbo" to "llama2" for chat and embedding functionality. It also introduces a configuration for vector storage dimensionality. Additionally, the test environment is enhanced with an Ollama container setup.
Review Status
Actionable comments generated: 1
Configuration used: CodeRabbit UI
Files ignored due to path filters (1)
- llm-rag-with-springai/pom.xml is excluded by !**/*.xml
Files selected for processing (1)
- llm-rag-with-springai/src/test/java/com/learning/ai/llmragwithspringai/TestLlmRagWithSpringAiApplication.java (2 hunks)
Additional comments: 2
llm-rag-with-springai/src/test/java/com/learning/ai/llmragwithspringai/TestLlmRagWithSpringAiApplication.java (2)
- 3-3: The import of `java.io.IOException` is appropriate given the operations within the `ollama` bean method that can throw this exception.
- 9-9: The addition of `org.springframework.test.context.DynamicPropertyRegistry` is crucial for dynamically configuring properties, and it is used effectively in the `ollama` bean method.
@Bean
OllamaContainer ollama(DynamicPropertyRegistry properties) throws UnsupportedOperationException, IOException, InterruptedException {
    OllamaContainer ollama = new OllamaContainer("ollama/ollama");
    ollama.start();
    ollama.execInContainer("ollama", "pull", "mistral");
    properties.add("spring.ai.ollama.base-url", ollama::getEndpoint);
    return ollama;
}
The `ollama` bean method is well-structured and follows good practices for initializing and configuring the `OllamaContainer`. However, there are a few points to consider for improvement:
- Exception handling: the method declares that it can throw `UnsupportedOperationException`, `IOException`, and `InterruptedException`. While `IOException` and `InterruptedException` are justified by `ollama.execInContainer` and `ollama.start()`, `UnsupportedOperationException` is not explicitly thrown within the method. Ensure that this exception is necessary, or remove it from the method signature if it is not.
- Hardcoded values: the method contains hardcoded values such as the Docker image name `"ollama/ollama"` and the command parameters `"ollama", "pull", "mistral"`. Consider externalizing these values to configuration properties or constants for easier maintenance and configuration.
- Error handling: the method starts the container and executes a command within it, but does not explicitly handle potential errors from these operations. Consider adding error handling or logging to improve resilience and debuggability.
- OllamaContainer ollama(DynamicPropertyRegistry properties) throws UnsupportedOperationException, IOException, InterruptedException {
+ OllamaContainer ollama(DynamicPropertyRegistry properties) throws IOException, InterruptedException {
OllamaContainer ollama = new OllamaContainer("ollama/ollama");
ollama.start();
try {
ollama.execInContainer("ollama", "pull", "mistral");
} catch (Exception e) {
// Handle or log the exception
throw new RuntimeException("Failed to execute command in Ollama container", e);
}
properties.add("spring.ai.ollama.base-url", ollama::getEndpoint);
return ollama;
}
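Building on the error-handling point, the suggestion could go one step further: `execInContainer` returns an exec result whose exit code and stderr can be inspected, so a failed model pull surfaces with a useful message instead of passing silently. A fragmentary sketch, assuming Testcontainers' `Container.ExecResult` API (not a drop-in replacement for the suggestion above):

```java
// Sketch only: assumes Testcontainers' Container.ExecResult API.
Container.ExecResult result = ollama.execInContainer("ollama", "pull", "mistral");
if (result.getExitCode() != 0) {
    // Surface the container's stderr so a failed pull is easy to diagnose.
    throw new IllegalStateException("Failed to pull model: " + result.getStderr());
}
```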
Review Status
Actionable comments generated: 0
Configuration used: CodeRabbit UI
Files selected for processing (2)
- llm-rag-with-springai/src/main/resources/application.properties (1 hunks)
- llm-rag-with-springai/src/test/java/com/learning/ai/llmragwithspringai/TestLlmRagWithSpringAiApplication.java (2 hunks)
Files skipped from review as they are similar to previous changes (1)
- llm-rag-with-springai/src/test/java/com/learning/ai/llmragwithspringai/TestLlmRagWithSpringAiApplication.java
Additional comments: 4
llm-rag-with-springai/src/main/resources/application.properties (4)
- 5-6: The update from "gpt-3.5-turbo" to "llama2" for chat options and the setting of the response format to "json_object" align well with the PR's objective of transitioning to the Ollama model. This change should be reflected in the documentation and any relevant parts of the application that interact with the AI model to ensure seamless integration and usage.
- 5-5: Ensure that the "llama2" model is compatible with all existing functionalities and that any necessary adjustments have been made to accommodate the new model's requirements or limitations. It might be beneficial to add comments or documentation specifying why "llama2" was chosen and any significant differences from "gpt-3.5-turbo" that could affect the application's behavior.
- 6-6: Setting the response format to "json_object" is a good practice for ensuring structured and predictable responses from the AI model. Verify that the application's logic that processes these responses is updated accordingly to handle the "json_object" format efficiently.
- 8-8: The commented-out configuration for the OpenAI image model (`dall-e-3`) suggests that image-generation features are either deprecated or being transitioned to a different model. If the deprecation is intentional, consider removing the line entirely to avoid confusion. If transitioning to a different model for image generation, ensure that the new model's configuration is added and documented appropriately.
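For illustration, the chat-related lines under review might read roughly as follows. This is a sketch: the property keys are assumptions modeled on Spring AI's Ollama naming convention, and the `format` value should be checked against what the Ollama API actually accepts:

```properties
# Illustrative sketch of the chat configuration discussed above (keys are assumptions).
spring.ai.ollama.chat.options.model=llama2
spring.ai.ollama.chat.options.format=json_object
```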
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Review Status
Actionable comments generated: 0
Configuration used: CodeRabbit UI
Files selected for processing (1)
- llm-rag-with-springai/src/main/resources/application.properties (1 hunks)
Additional comments: 3
llm-rag-with-springai/src/main/resources/application.properties (3)
- 5-6: The update to use "llama2" for chat options, combined with specifying the response format as `json_object`, aligns well with the PR objectives and is a good practice for handling structured data.
- 8-8: The change to "llama2" for embedding options is consistent with the transition to the Ollama model and is approved.
- 10-10: The addition of `spring.ai.vectorstore.pgvector.dimensions=1536` appears aligned with leveraging the Ollama model's capabilities. It is important to verify that `1536` matches the expected dimensions for embeddings as per the Ollama model's documentation.
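Taken together, the embedding-related configuration discussed in these comments would look roughly like the sketch below. The property keys are assumptions based on Spring AI's naming, and the `1536` value must be verified against the dimensionality the chosen Ollama embedding model actually produces:

```properties
# Illustrative sketch; verify 1536 against the embedding model's output dimensions.
spring.ai.ollama.embedding.options.model=llama2
spring.ai.vectorstore.pgvector.dimensions=1536
```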
Review Status
Actionable comments generated: 0
Configuration used: CodeRabbit UI
Files selected for processing (1)
- llm-rag-with-springai/src/main/resources/application.properties (1 hunks)
Files skipped from review as they are similar to previous changes (1)
- llm-rag-with-springai/src/main/resources/application.properties
Review Status
Actionable comments generated: 0
Configuration used: CodeRabbit UI
Files ignored due to path filters (1)
- llm-rag-with-springai/pom.xml is excluded by !**/*.xml
Files selected for processing (2)
- llm-rag-with-springai/src/main/resources/application.properties (1 hunks)
- llm-rag-with-springai/src/test/java/com/learning/ai/llmragwithspringai/TestLlmRagWithSpringAiApplication.java (1 hunks)
Files skipped from review as they are similar to previous changes (2)
- llm-rag-with-springai/src/main/resources/application.properties
- llm-rag-with-springai/src/test/java/com/learning/ai/llmragwithspringai/TestLlmRagWithSpringAiApplication.java
closing in favor of #26
Summary by CodeRabbit
New Features
- Transitioned the chat and embedding model from "gpt-3.5-turbo" to "llama2" and added a vector storage dimensionality configuration.
Tests
- Added an Ollama container setup to the test environment.