Replies: 1 comment
-
I have the same error when using `logit_bias`.
-
I'm sending the following API request to the `/v1/completions` endpoint, and receiving an error, but when I remove `logit_bias` it works. I've also tried giving the key as a number instead of a string, to no effect. I'd really appreciate some help.
And here's the error:
I've updated text-generation-webui today (2024-04-27), the model is llama-3-8b, and it doesn't seem to have any problems with `logit_bias` if I load it in llama.cpp and give it an equivalent request. Anyone got an idea what's going on here?
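For reference, here is a minimal sketch of what an OpenAI-compatible `/v1/completions` payload with `logit_bias` typically looks like. The token ID, prompt, and port below are placeholder assumptions, not taken from the original post; in the OpenAI-style schema, `logit_bias` keys are token IDs serialized as strings, mapped to bias values between -100 (effectively ban) and +100 (effectively force).

```python
import json

# Hypothetical sketch of an OpenAI-compatible /v1/completions request
# body using logit_bias. Keys must be token IDs as strings; values are
# biases in the range -100..100.
payload = {
    "prompt": "Once upon a time",
    "max_tokens": 16,
    # Token ID "15339" is a placeholder: look up real IDs with the
    # tokenizer of the model you are serving, since IDs differ
    # between tokenizers (a llama-3 ID is not valid for other models).
    "logit_bias": {"15339": -100},
}

body = json.dumps(payload)
print(body)
```

One thing worth checking when such a request errors in one server but not another: whether the IDs were produced by the same tokenizer the server is using, since an out-of-vocabulary ID can be rejected by one backend and silently ignored by another.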