openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': "This model's maximum context length is 4096 tokens. #4295
Unanswered · Liwan-Chen asked this question in Q&A · 0 replies
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': "This model's maximum context length is 4096 tokens. However, your messages resulted in 4544 tokens. Please reduce the length of the messages.", 'code': 40303}
The [Llama-3-70B] model I am using has a max token limit of 8k, so why is this 4096-token limit being enforced?
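The error's `{'object': 'error', ...}` shape suggests an OpenAI-compatible serving layer (e.g. vLLM) rather than the model itself; such servers typically cap requests at their configured context window (for vLLM, the `--max-model-len` launch flag or the value read from the model config), so the server-side fix is usually to relaunch with the window set to the model's full 8k. On the client side, the request can be made to fit by trimming old chat turns before sending. Below is a minimal, hedged sketch of that client-side trimming; the 4-characters-per-token ratio is a rough heuristic of my own, not the model's real tokenizer, and `trim_messages` is a hypothetical helper, not part of the `openai` library.

```python
# Sketch: shrink a chat history until its estimated token count fits
# the server's context window. ASSUMPTION: ~4 characters per token is
# only a heuristic; for exact counts use the model's own tokenizer.

def approx_tokens(text: str) -> int:
    """Very rough token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def trim_messages(messages: list[dict], max_tokens: int) -> list[dict]:
    """Drop the oldest non-system turns until the estimated total
    fits within max_tokens. System messages are always kept."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    def total(msgs: list[dict]) -> int:
        return sum(approx_tokens(m["content"]) for m in msgs)

    while rest and total(system + rest) > max_tokens:
        rest.pop(0)  # discard the oldest conversational turn first
    return system + rest
```

Called just before `client.chat.completions.create(...)`, this keeps the request under whatever limit the server reports (here, 4096), at the cost of dropping the oldest turns.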