ChatOllama raises 'invalid format: expected "json" or a JSON schema' on invoke method call when not specifying json format #28753
Comments
This might be an issue with ollama since it's happening in fabric as well: danielmiessler/fabric#1209
Hi all, sorry about this. We are working on a fix.
Getting the same error when using chat from ollama.
Thanks @jmorganca!
I am also seeing the same error, also right after upgrading ollama: my application used to work before the ollama upgrade.
I can confirm this issue occurs using LlamaIndex+Ollama(docker) but not with Ollama(Mac app).
setting
Hi all, this should be fixed in version 0.5.3: https://github.com/ollama/ollama/releases/tag/v0.5.3
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
After upgrading to the latest version of ollama (it now supports json output formatting 🥳 ) I can no longer use the ChatOllama object without specifying format=json. I did not change any of my code, just upgraded ollama, and it broke. Using the example straight from the docs, this no longer works:
but this does
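The original snippets were not captured in this thread. As an illustration only, here is a minimal sketch of the request body a chat client sends to Ollama's /api/chat endpoint; the helper `build_chat_payload` is hypothetical, not LangChain code. The error message suggests newer Ollama versions validate the "format" field strictly (accepting only "json" or a JSON schema), so explicitly passing format="json" works around it:

```python
import json

def build_chat_payload(model, messages, fmt=None):
    """Hypothetical sketch of an /api/chat request body.

    Ollama 0.5.x expects "format" to be the string "json" or a JSON
    schema object when present; sending an invalid value triggers the
    'invalid format: expected "json" or a JSON schema' error above.
    """
    payload = {"model": model, "messages": messages, "stream": False}
    if fmt is not None:
        payload["format"] = fmt  # workaround: request JSON output explicitly
    return payload

# Workaround case: format="json" is set explicitly.
payload = build_chat_payload(
    "llama3.1",
    [{"role": "user", "content": "Reply in JSON."}],
    fmt="json",
)
print(json.dumps(payload))
```

Per the last comment in the thread, upgrading to Ollama 0.5.3 removes the need for this workaround.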
I don't know if you know anything about this, @ccurme, since you last edited ChatOllama to support json formatting.
System Info
Python version: 3.10.15