AI Quality Really Drops Off When There Is a Lot of Context #269
I think this could be partially due to the fact that our embeddings computation (foyle/app/pkg/oai/embeddings.go, line 22 at e2feb99) tries to compute the embeddings of the full notebook. If this exceeds the context window, then no embeddings are retrieved and we don't get any RAG results. So for really long notebooks we don't benefit from RAG.
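One possible mitigation is to truncate the notebook text to fit the embedding model's context window before requesting embeddings, so that over-long notebooks degrade gracefully instead of returning nothing. The sketch below is hypothetical and not Foyle's actual code: `truncateForEmbedding` and the ~4-characters-per-token heuristic are illustrative assumptions; a real fix would count tokens with a proper tokenizer for the target model.

```go
package main

import "fmt"

// maxEmbeddingTokens is the input limit for OpenAI's text-embedding-3-small.
const maxEmbeddingTokens = 8191

// truncateForEmbedding trims text so it roughly fits within maxTokens.
// Hypothetical sketch: it approximates one token as ~4 characters, which
// overshoots for code-heavy notebooks; a real implementation would use a
// tokenizer (e.g. a tiktoken port) to count tokens exactly.
func truncateForEmbedding(text string, maxTokens int) string {
	maxChars := maxTokens * 4 // rough heuristic: ~4 chars per token
	if len(text) <= maxChars {
		return text
	}
	return text[:maxChars]
}

func main() {
	long := make([]byte, 40000)
	for i := range long {
		long[i] = 'a'
	}
	out := truncateForEmbedding(string(long), maxEmbeddingTokens)
	fmt.Println(len(out))
}
```

Truncation keeps the notebook's prefix, which may drop the most relevant (recent) cells; an alternative under the same assumptions would be to keep the suffix, or to embed only the last N cells.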
Here's a long notebook
https://gist.github.com/jlewi/31857545ec62b36d2949ccd904918d53
The final markdown cell contained the following prompt.
The LLM gave a really terrible response: it suggested a curl command calling GetTrace, which is a non-existent RPC.
The generated block id was 01J9545C1FHMBD9YNHFYTZGGSY.
I think the problem is that all of the context (both in the notebook itself and likely also in the retrieved examples) is confusing the AI.