OpenAI temperature and seed #1734
thundergore started this conversation in General
Replies: 2 comments
- Yes, you can use the …
- Thank you for your response! …
I'm interested in testing temperature and seed when using OpenAI for representation: https://platform.openai.com/docs/api-reference/chat/create
I note we've got the Llama2 temperature mentioned in llm.md, but nothing for OpenAI. I'm wondering whether it's something to set in the client, in the OpenAI model, or maybe in the prompt itself?
Is it possible?
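For context, this is roughly the call I have in mind. It's a minimal sketch against the chat completions endpoint linked above: `temperature` and `seed` are documented parameters there, while the model name and prompt are just placeholders. What I'm unsure about is where to put these when the representation model builds the request for me.

```python
# Minimal sketch using the openai Python client (v1 API) directly.
# "temperature" and "seed" are parameters of the chat completions endpoint;
# the model name and prompt below are placeholders only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {"role": "user", "content": "Summarise these documents into a short topic label."}
    ],
    temperature=0.0,  # lower temperature -> more deterministic output
    seed=42,          # best-effort reproducibility across identical requests
)
print(response.choices[0].message.content)
```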