About the 'logprobs' in the response object #24
Comments
Explanation will work without logprobs, but our current scoring method won't. We will update this if we find a better scoring method.
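For context, here is a minimal sketch of the kind of scoring step that depends on logprobs. The function name and the dict-based input are hypothetical stand-ins, not the repository's actual code: the idea is only that the simulator reads per-token log probabilities from the response and converts them to probabilities, which is impossible if the `logprobs` field is absent.

```python
import math

def expected_token_probability(token_logprobs, target_tokens):
    """Average probability the model assigned to the target tokens.

    token_logprobs: dict mapping a token to its log probability, as found
        in a completion response's `logprobs` data (hypothetical shape).
    target_tokens: the tokens the simulator expects at this position.
    """
    probs = [math.exp(token_logprobs[t]) for t in target_tokens if t in token_logprobs]
    return sum(probs) / len(probs) if probs else 0.0

# A logprob of log(0.8) converts back to a probability of 0.8.
p = expected_token_probability({" 5": math.log(0.8), " 0": math.log(0.1)}, [" 5"])
```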
Hi @williamrs-openai, could you explain how the variables in L209 to L212 correspond to the logprobs results returned by gpt-4-1106? In particular, choice["logprobs"]["text_offset"] confuses me. Here I set logprobs=True and top_logprobs=1, and choice.logprobs.content includes: By the way, what value should I set top_logprobs to?
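On the `text_offset` confusion: in the legacy completions response, `logprobs` holds parallel arrays (`tokens`, `token_logprobs`, `top_logprobs`, `text_offset`), where `text_offset[i]` is the character index at which `tokens[i]` starts. The chat completions `logprobs.content` list carries no offsets, but they can be reconstructed by accumulating token lengths. A sketch, with plain dicts standing in for the SDK's response objects (note the real legacy field can be shifted by the prompt length when `echo` is used, which this sketch ignores):

```python
def chat_logprobs_to_legacy(content):
    """Convert chat-style logprobs.content (a list of per-token entries
    with 'token' and 'logprob' keys) into legacy-style parallel arrays,
    reconstructing text_offset from cumulative token lengths."""
    tokens, token_logprobs, text_offset = [], [], []
    offset = 0
    for entry in content:
        tokens.append(entry["token"])
        token_logprobs.append(entry["logprob"])
        text_offset.append(offset)
        offset += len(entry["token"])  # next token starts where this one ends
    return {"tokens": tokens, "token_logprobs": token_logprobs, "text_offset": text_offset}

sample = [
    {"token": "un", "logprob": -0.1},
    {"token": "known", "logprob": -0.5},
]
legacy = chat_logprobs_to_legacy(sample)
# legacy["text_offset"] is [0, 2]: "known" starts after the 2 characters of "un".
```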
Hi,
I find that the simulator postprocesses the response of 'text-davinci-003' using the response field 'logprobs'.
However, as I read OpenAI's documentation, the field 'logprobs' is going to be deprecated, since the completion response object will be replaced by the chat completion object, and the model 'text-davinci-003' is also going to be deprecated.
I now have access to gpt-4 and gpt-3.5-turbo with the chat completion response object. Is there any way to run the neuron-explainer using these two models, i.e., without the field 'logprobs'?
Or is it necessary to call 'text-davinci-003' and other models with a completion response object that returns 'logprobs'?
Thanks a lot!