Inconsistent prompt when calling LLMGraphTransformer with ChatOllama using from langchain_ollama import ChatOllama #26614
Labels
🤖:bug
Related to a bug, vulnerability, unexpected error with an existing feature
Checked other resources
Example Code
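The original example code did not render here; below is a minimal sketch of the setup implied by the description. The model name `llama3.1` is a placeholder (any locally pulled Ollama model that supports tool calling should do), and a running Ollama server is assumed.

```python
from langchain_core.documents import Document
from langchain_experimental.graph_transformers import LLMGraphTransformer

# Swap this import for `from langchain_community.chat_models import ChatOllama`
# to compare the two behaviors shown in the outputs below.
from langchain_ollama import ChatOllama

# "llama3.1" is a placeholder model name; requires a local Ollama server.
llm = ChatOllama(model="llama3.1", temperature=0)
transformer = LLMGraphTransformer(llm=llm)

documents = [Document(page_content="The dog has a ball which is red")]
print(documents[0].page_content)

graph_documents = transformer.convert_to_graph_documents(documents)
print(graph_documents)
```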
Error Message and Stack Trace (if applicable)
No response
Description
In this code example, the outcome of processing a simple text through LLMGraphTransformer changes if I replace the definition of ChatOllama from
from langchain_community.chat_models import ChatOllama
to
from langchain_ollama import ChatOllama
Output:
from langchain_community.chat_models import ChatOllama
page_content='The dog has a ball which is red'
[GraphDocument(nodes=[Node(id='dog', type='Animal', properties={}), Node(id='ball', type='Object', properties={})], relationships=[Relationship(source=Node(id='dog', type='Animal', properties={}), target=Node(id='ball', type='Object', properties={}), type='HAS_OBJECT', properties={})], source=Document(metadata={}, page_content='The dog has a ball which is red'))]
from langchain_ollama import ChatOllama
page_content='The dog has a ball which is red'
[GraphDocument(nodes=[Node(id='John Doe', properties={}), Node(id='Red Ball', properties={})], relationships=[], source=Document(metadata={}, page_content='The dog has a ball which is red'))]
System Info
langchain_core '0.3.1'
langchain_community '0.3.0'
langchain_experimental '0.3.0'
langchain_ollama '0.2.0'