Conversations longer than 2048 tokens need to be split into multiple conversations.
I am seeing errors like this:
Token indices sequence length is longer than the specified maximum sequence length for this model (4209 > 2048). Running this sequence through the model will result in indexing errors
This is okay because it's only a warning and doesn't actually throw an error, but it would be much better if all the content were included, so you could break it up into chunks if you want.
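Splitting an over-long conversation into model-sized pieces could be sketched like this. This is a minimal illustration, not the project's actual preprocessing: the helper name `split_into_chunks` and the placeholder token ids are assumptions, and a real pipeline would get token ids from the tokenizer and probably split on message boundaries rather than mid-sequence.

```python
def split_into_chunks(token_ids, max_length=2048):
    """Split a list of token ids into consecutive chunks of at most max_length."""
    return [token_ids[i:i + max_length] for i in range(0, len(token_ids), max_length)]

# Example: a 4209-token sequence (the length reported in the warning above)
# splits into chunks of 2048, 2048, and 113 tokens.
tokens = list(range(4209))
chunks = split_into_chunks(tokens)
print([len(c) for c in chunks])  # → [2048, 2048, 113]
```

Chunking at exactly `max_length` can cut a conversation mid-message; splitting at the nearest message boundary below the limit would preserve coherence at the cost of slightly shorter chunks.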