
ChatOllama raises 'invalid format: expected "json" or a JSON schema' on invoke method call when not specifying json format #28753

Closed
lharrison13 opened this issue Dec 17, 2024 · 8 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@lharrison13

lharrison13 commented Dec 17, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3:8b",
    temperature=0,
    # other params...
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg)

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "/Users/lukeharrison/Documents/GitHub/wommbot-cli/womm/test.py", line 17, in <module>
    ai_msg = llm.invoke(messages)
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 286, in invoke
    self.generate_prompt(
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 786, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 643, in generate
    raise e
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 633, in generate
    self._generate_with_cache(
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache
    result = self._generate(
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/langchain_ollama/chat_models.py", line 690, in _generate
    final_chunk = self._chat_stream_with_aggregation(
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/langchain_ollama/chat_models.py", line 591, in _chat_stream_with_aggregation
    for stream_resp in self._create_chat_stream(messages, stop, **kwargs):
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/langchain_ollama/chat_models.py", line 578, in _create_chat_stream
    yield from self._client.chat(**chat_params)
  File "/Users/lukeharrison/Library/Caches/pypoetry/virtualenvs/wommbot-cli-Wl4Uzl6N-py3.10/lib/python3.10/site-packages/ollama/_client.py", line 172, in inner
    raise ResponseError(err)
ollama._types.ResponseError: invalid format: expected "json" or a JSON schema

Description

After upgrading to the latest version of Ollama (it now supports JSON output formatting 🥳), I can no longer use the ChatOllama object without specifying format="json". I did not change any of my code; I just upgraded Ollama and it broke.
Using the example straight from the docs, this no longer works:

from langchain_ollama import ChatOllama
from langchain_core.messages import AIMessage

llm = ChatOllama(
    model="llama3:8b",
    temperature=0,
    # other params...
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg)

but this does:

from langchain_ollama import ChatOllama
from langchain_core.messages import AIMessage

llm = ChatOllama(
    model="llama3:8b",
    temperature=0,
    format='json'
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg)

@ccurme, I don't know if you know anything about this, since you last edited ChatOllama to support JSON formatting.

System Info

aiohappyeyeballs==2.4.4
aiohttp==3.11.10
aiosignal==1.3.1
annotated-types==0.7.0
anyio==4.7.0
asgiref==3.8.1
async-timeout==4.0.3
attrs==24.2.0
backoff==2.2.1
bcrypt==4.2.1
beautifulsoup4==4.12.3
build==1.2.2.post1
cachetools==5.5.0
certifi==2024.8.30
cffi==1.17.1
charset-normalizer==3.4.0
chroma-hnswlib==0.7.6
chromadb==0.5.23
click==8.1.7
coloredlogs==15.0.1
cryptography==44.0.0
dataclasses-json==0.6.7
deepsearch-glm==0.26.2
Deprecated==1.2.15
docling==2.8.3
docling-core==2.8.0
docling-ibm-models==2.0.7
docling-parse==2.1.2
docutils==0.21.2
durationpy==0.9
easyocr==1.7.2
et_xmlfile==2.0.0
exceptiongroup==1.2.2
fastapi==0.115.6
filelock==3.16.1
filetype==1.2.0
flatbuffers==24.3.25
frozenlist==1.5.0
fsspec==2024.10.0
google-auth==2.36.0
googleapis-common-protos==1.66.0
grpcio==1.68.1
h11==0.14.0
httpcore==1.0.7
httptools==0.6.4
httpx==0.27.2
httpx-sse==0.4.0
huggingface-hub==0.26.5
humanfriendly==10.0
idna==3.10
imageio==2.36.1
importlib_metadata==8.5.0
importlib_resources==6.4.5
Jinja2==3.1.4
joblib==1.4.2
jsonlines==3.1.0
jsonpatch==1.33
jsonpointer==3.0.0
jsonref==1.1.0
jsonschema==4.23.0
jsonschema-specifications==2024.10.1
kubernetes==31.0.0
langchain==0.3.10
langchain-chroma==0.1.4
langchain-community==0.3.10
langchain-core==0.3.24
langchain-huggingface==0.1.2
langchain-ollama==0.2.1
langchain-text-splitters==0.3.2
langserve==0.3.0
langsmith==0.1.147
lazy_loader==0.4
lxml==5.3.0
markdown-it-py==3.0.0
marko==2.1.2
MarkupSafe==3.0.2
marshmallow==3.23.1
mdurl==0.1.2
mmh3==5.0.1
monotonic==1.6
mpmath==1.3.0
multidict==6.1.0
mypy-extensions==1.0.0
networkx==3.2.1
ninja==1.11.1.2
numpy==1.26.4
oauthlib==3.2.2
ollama==0.4.4
onnxruntime==1.20.1
opencv-python-headless==4.10.0.84
openpyxl==3.1.5
opentelemetry-api==1.28.2
opentelemetry-exporter-otlp-proto-common==1.28.2
opentelemetry-exporter-otlp-proto-grpc==1.28.2
opentelemetry-instrumentation==0.49b2
opentelemetry-instrumentation-asgi==0.49b2
opentelemetry-instrumentation-fastapi==0.49b2
opentelemetry-proto==1.28.2
opentelemetry-sdk==1.28.2
opentelemetry-semantic-conventions==0.49b2
opentelemetry-util-http==0.49b2
orjson==3.10.12
overrides==7.7.0
packaging==24.2
pandas==2.2.3
pillow==10.4.0
posthog==3.7.4
propcache==0.2.1
protobuf==5.29.1
pyasn1==0.6.1
pyasn1_modules==0.4.1
pyclipper==1.3.0.post6
pycparser==2.22
pydantic==2.9.2
pydantic-settings==2.6.1
pydantic_core==2.23.4
PyGithub==2.5.0
Pygments==2.18.0
PyJWT==2.10.1
PyNaCl==1.5.0
pypdfium2==4.30.0
PyPika==0.48.9
pyproject_hooks==1.2.0
python-bidi==0.6.3
python-dateutil==2.9.0.post0
python-docx==1.1.2
python-dotenv==1.0.1
python-pptx==1.0.2
pytz==2024.2
PyYAML==6.0.2
referencing==0.35.1
regex==2024.11.6
requests==2.32.3
requests-oauthlib==2.0.0
requests-toolbelt==1.0.0
rich==13.9.4
rpds-py==0.22.3
rsa==4.9
Rtree==1.3.0
safetensors==0.4.5
scikit-image==0.24.0
scikit-learn==1.5.2
scipy==1.13.1
sentence-transformers==3.3.1
shapely==2.0.6
shellingham==1.5.4
six==1.17.0
sniffio==1.3.1
soupsieve==2.6
SQLAlchemy==2.0.36
sse-starlette==2.1.3
starlette==0.41.3
sympy==1.13.3
tabulate==0.9.0
tenacity==9.0.0
threadpoolctl==3.5.0
tifffile==2024.9.20
tokenizers==0.20.3
tomli==2.2.1
torch==2.4.1
torchvision==0.19.1
tqdm==4.67.1
transformers==4.46.3
typer==0.12.5
typing-inspect==0.9.0
typing_extensions==4.12.2
tzdata==2024.2
urllib3==2.2.3
uvicorn==0.32.1
uvloop==0.21.0
watchfiles==1.0.0
websocket-client==1.8.0
websockets==14.1
wrapt==1.17.0
XlsxWriter==3.2.0
yarl==1.18.3
zipp==3.21.0

version: Python 3.10.15

@lharrison13 lharrison13 changed the title ChatOllama raises 'invalid format: expected "json" or a JSON schema' ChatOllama raises 'invalid format: expected "json" or a JSON schema' on invoke Dec 17, 2024
@lharrison13 lharrison13 changed the title ChatOllama raises 'invalid format: expected "json" or a JSON schema' on invoke ChatOllama raises 'invalid format: expected "json" or a JSON schema' on invoke method call when not specifying json format Dec 17, 2024
@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Dec 17, 2024
@lharrison13
Author

This might be an issue with ollama since its happening in fabric as well danielmiessler/fabric#1209

@jmorganca
Contributor

Hi all, sorry about this. We are working on a fix.

@xindoreen

Getting the same error when using chat from the ollama Python package.

@lharrison13
Author

lharrison13 commented Dec 17, 2024

Thanks @jmorganca!

@iotnxt

iotnxt commented Dec 17, 2024

I am also seeing the same error, also right after upgrading Ollama:
Error querying LLM: invalid format: expected "json" or a JSON schema

My application worked before the Ollama upgrade:
ollama version is 0.5.2-0-g60f7556-dirty

@roychowdhuryrohit-dev

I can confirm this issue occurs using LlamaIndex + Ollama (Docker) but not with the Ollama Mac app.

@utkucanaytac

Setting llm.format = None solves it.
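The mechanism behind this workaround — an unset format field being serialized into the request instead of omitted, which newer Ollama servers reject — can be illustrated without a running server. This is a minimal sketch, not the actual langchain-ollama internals; `build_chat_params` is a hypothetical helper standing in for the client-side request builder:

```python
def build_chat_params(model: str, fmt) -> dict:
    # Mimic a client that builds the /api/chat payload. If an unset format
    # is forwarded as an empty string, the server now rejects it, since it
    # expects "json" or a JSON schema.
    params = {"model": model, "messages": []}
    if fmt is not None:  # the workaround: drop the key entirely when unset
        params["format"] = fmt
    return params

# Without the workaround: format="" reaches the server and it errors out.
bad = build_chat_params("llama3:8b", "")
assert bad["format"] == ""

# After setting llm.format = None, the key is omitted from the request.
good = build_chat_params("llama3:8b", None)
assert "format" not in good
```

Under this assumption, format="json" still passes through unchanged, which is why explicitly setting format='json' also avoids the error.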

@jmorganca
Contributor

Hi all, this should be fixed in version 0.5.3: https://github.com/ollama/ollama/releases/tag/v0.5.3
