@ubiquityos
gpt command
#1
Conversation
https://platform.openai.com/docs/guides/reasoning I'm not sure which model is best. I'm assuming
o1, in my opinion, is too slow compared to 4o; I'd prefer to use 4o. Honestly, the reasoning models on the OpenAI website have not impressed me so far, idk about you guys.
i.e. it's faster and cheaper than o1-preview, but it drags compared to 4o.
I hope so. As soon as it gets merged, I will apply the finishing touches, and it should be mergeable following any other review comments.
Typically slash command type plugins have a
Would be nice to be able to configure the ChatGPT endpoint and model through the configuration (can be done in another issue).
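A minimal sketch of what that configuration could look like. The setting names (`openAiModel`, `openAiBaseUrl`) and defaults are assumptions for illustration, not the plugin's actual schema:

```typescript
// Hypothetical plugin settings; field names are assumptions, not the real schema.
interface PluginSettings {
  openAiModel?: string;
  openAiBaseUrl?: string;
}

// Assumed defaults used when the config omits a value.
const DEFAULT_MODEL = "o1-mini";
const DEFAULT_BASE_URL = "https://api.openai.com/v1";

// Resolve the model and endpoint from configuration instead of hard-coding them.
function resolveOpenAiConfig(settings: PluginSettings) {
  return {
    model: settings.openAiModel ?? DEFAULT_MODEL,
    baseUrl: settings.openAiBaseUrl ?? DEFAULT_BASE_URL,
  };
}
```

The resolved values would then be passed to the OpenAI client constructor, so swapping models or pointing at a proxy endpoint needs no code change.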
I think it's fine. A comment responding ten seconds later isn't a problem.
I moved
Some recent additional QA that was built on top of this plugin:
I noticed that I don't have o1 access, so I had to specify a model in the config or it would error for me. I know that as an org we'll use o1, but should we use a stable GPT-4 model as the default to avoid this error for others?
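One way to handle this, sketched below under assumptions: try the configured model first and fall back to a widely available one if the call fails. The fallback model choice and the "any error means the model is unavailable" heuristic are assumptions, not the plugin's actual behavior or OpenAI's documented error contract:

```typescript
// Assumed fallback for API keys without o1 access.
const FALLBACK_MODEL = "gpt-4o";

// `createCompletion` stands in for whatever function the plugin uses to call
// the OpenAI API with a given model name; it is a hypothetical parameter here.
async function completeWithFallback(
  createCompletion: (model: string) => Promise<string>,
  preferredModel: string
): Promise<string> {
  try {
    // First attempt with the configured (preferred) model.
    return await createCompletion(preferredModel);
  } catch {
    // Assumption: treat failure as "model unavailable for this key" and retry.
    return await createCompletion(FALLBACK_MODEL);
  }
}
```

In practice you would probably want to inspect the error (e.g. only fall back on a model-not-found response) rather than retrying on every failure.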
This PR should be merged separately from your feature. If required, branch off from this PR; do not add your logic to it. This PR is held back by review only.
I realize I never pushed the branch to my repo, which facilitated the onboarding bot built on top of this PR: ubiquity-os-marketplace/text-vector-embeddings#18
Resolves ubiquity-os/plugins-wishlist#29
I followed your prompt template and kept the system message short and sweet.
It seems it can lose track of the question being asked, so I think it might be better to prioritize the question.
I think filling the chat history slightly would do the trick
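A sketch of what prioritizing the question could look like: keep the system prompt first, include only a recent slice of the chat history, and always place the question as the final message so it isn't buried in context. The function name, message shape, and history cap are assumptions for illustration, not the plugin's actual code:

```typescript
// Minimal chat message shape, matching the OpenAI chat-completions format.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical helper: build the message list with the question placed last.
function buildMessages(
  systemPrompt: string,
  history: ChatMessage[],
  question: string,
  maxHistory = 10 // assumed cap on how much history to include
): ChatMessage[] {
  return [
    { role: "system", content: systemPrompt },
    ...history.slice(-maxHistory), // most recent turns only
    { role: "user", content: question }, // question last, so it stays in focus
  ];
}
```

Putting the question in the final user message is a common prompt-ordering trick; models tend to weight the end of the context more heavily than the middle.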