243 max chat history length #259
Conversation
Great work! Looks super thorough. Am I right in saying this doesn't include a limit on the session chat history array? Fine if not, but we should raise a new ticket to deal with that.
I've put const maxMessageLength = 1000 on line 385 to limit it - I'll rename that variable actually, as it's not the best. Are you happy with a limit of 1000 or would you prefer another?
Ahh, completely missed that 😅 yeah, 1000 is more than enough I reckon.
* remove old messages from chat history when queue limit reached
* filter chat history based on max tokens
* add max token sizes for each model
* fix selecting gpt model not updating
* fix the button
* rename max chat history variable
I set the max chat history length for the session to 1000, although I'm happy to make it larger or smaller.
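Roughly, the idea is to treat the history as a bounded queue and drop the oldest messages once the limit is reached. A minimal sketch in TypeScript, assuming hypothetical names (`ChatMessage`, `maxChatHistoryLength`, `pushWithLimit`) rather than the project's actual ones:

```typescript
// Illustrative only: cap the session chat history at a fixed number of
// messages, dropping the oldest entries first.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

const maxChatHistoryLength = 1000;

function pushWithLimit(
  history: ChatMessage[],
  message: ChatMessage,
  limit: number = maxChatHistoryLength
): ChatMessage[] {
  const updated = [...history, message];
  // Remove the oldest messages once the queue limit is reached.
  return updated.length > limit
    ? updated.slice(updated.length - limit)
    : updated;
}
```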
For the limit on tokens sent to the API, I've used the max token limit for each model.
It might help performance to decrease this further and summarise the earlier conversation, but I've added a separate ticket for that: #260
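For reference, a hedged sketch of what filtering against a per-model token budget could look like, reusing the `ChatMessage` type from the sketch above. The model limits shown are examples only, and `estimateTokens` is a rough character-based placeholder standing in for a real tokenizer:

```typescript
// Example per-model budgets; the PR uses the documented limit for each model.
const maxTokensPerModel: Record<string, number> = {
  "gpt-4": 8192,
  "gpt-3.5-turbo": 4096,
};

function estimateTokens(text: string): number {
  // Placeholder: roughly 4 characters per token for English text.
  return Math.ceil(text.length / 4);
}

function filterHistoryToTokenLimit(
  history: ChatMessage[],
  model: string
): ChatMessage[] {
  const budget = maxTokensPerModel[model] ?? 4096;
  const kept: ChatMessage[] = [];
  let used = 0;
  // Walk backwards so the most recent messages are kept within the budget.
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i].content);
    if (used + cost > budget) break;
    kept.unshift(history[i]);
    used += cost;
  }
  return kept;
}
```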
I think it would be useful to performance test this in some way, to see whether the overhead of token calculation and filtering is worth it.
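As a very rough first pass (not a proper benchmark), timing the filter over a large synthetic history would show whether the overhead is even measurable:

```typescript
// Quick-and-dirty timing check using the hypothetical helpers above.
const syntheticHistory: ChatMessage[] = Array.from({ length: 1000 }, (_, i) => ({
  role: i % 2 === 0 ? "user" : "assistant",
  content: "example message ".repeat(20),
}));

const start = performance.now();
filterHistoryToTokenLimit(syntheticHistory, "gpt-4");
console.log(`filtering took ${(performance.now() - start).toFixed(2)} ms`);
```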