InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 6777 tokens. Please reduce the length of the messages.
#37 · Open · bahattab opened this issue on Aug 21, 2023 · 1 comment
Problem:
InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 6777 tokens. Please reduce the length of the messages.

Fix:
Reduce chunk_size so that each chunk, plus the prompt, fits within the model's 4097-token context window:

from langchain.text_splitter import CharacterTextSplitter

def get_text_chunks(text):
    text_splitter = CharacterTextSplitter(
        separator="\n",
        chunk_size=500,  # the fix is to change from 1000 to 500
        chunk_overlap=200,
        length_function=len,
    )
    chunks = text_splitter.split_text(text)
    return chunks
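To see why lowering chunk_size helps, here is a minimal sketch of overlapping character-window chunking with no LangChain dependency. It is not CharacterTextSplitter's actual algorithm (which first splits on the separator and then merges pieces); it only illustrates how chunk_size and chunk_overlap interact, with the function name chunk_text chosen for this sketch:

```python
def chunk_text(text, chunk_size=500, chunk_overlap=200):
    """Split text into overlapping character windows.

    Illustrative only: shows the effect of chunk_size / chunk_overlap,
    not the exact CharacterTextSplitter behavior.
    """
    step = chunk_size - chunk_overlap  # how far each window advances
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks
```

With chunk_size=500 no chunk exceeds 500 characters, so each one stays comfortably under the model's token limit even after the prompt is added; consecutive chunks share their last/first 200 characters so context is not lost at the boundaries.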