
Inconsistent prompt when calling LLMGraphTransformer with ChatOllama imported from langchain_ollama #26614

Open
csaiedu opened this issue Sep 18, 2024 · 1 comment
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@csaiedu
csaiedu commented Sep 18, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_community.chat_models import ChatOllama
from langchain_core.documents import Document
from langchain_community.graphs.graph_document import GraphDocument

llm = ChatOllama(model=local_model)  # local_model: name of a locally pulled Ollama model
llm_transformer = LLMGraphTransformer(
    llm=llm,
)
doc = Document(page_content="The dog has a ball which is red")
graph = llm_transformer.convert_to_graph_documents([doc])
print(graph)

from langchain_ollama import ChatOllama

llm = ChatOllama(model=local_model)  # same local model as above
llm_transformer = LLMGraphTransformer(
    llm=llm,
)
doc = Document(page_content="The dog has a ball which is red")
graph = llm_transformer.convert_to_graph_documents([doc])
print(graph)



Error Message and Stack Trace (if applicable)

No response

Description

In this code example, the outcome of processing a simple text through LLMGraphTransformer changes if I replace the import of ChatOllama from

from langchain_community.chat_models import ChatOllama

to

from langchain_ollama import ChatOllama

Output:
from langchain_community.chat_models import ChatOllama
page_content='The dog has a ball which is red'
[GraphDocument(nodes=[Node(id='dog', type='Animal', properties={}), Node(id='ball', type='Object', properties={})], relationships=[Relationship(source=Node(id='dog', type='Animal', properties={}), target=Node(id='ball', type='Object', properties={}), type='HAS_OBJECT', properties={})], source=Document(metadata={}, page_content='The dog has a ball which is red'))]

from langchain_ollama import ChatOllama
page_content='The dog has a ball which is red'
[GraphDocument(nodes=[Node(id='John Doe', properties={}), Node(id='Red Ball', properties={})], relationships=[], source=Document(metadata={}, page_content='The dog has a ball which is red'))]
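The two runs above disagree on both node ids and node types. A small helper (hypothetical, not part of LangChain) makes the divergence easy to check by diffing the `(id, type)` pairs of the two results:

```python
def diff_nodes(nodes_a, nodes_b):
    """Return the (id, type) pairs unique to each result."""
    set_a, set_b = set(nodes_a), set(nodes_b)
    return set_a - set_b, set_b - set_a

# (id, type) pairs transcribed from the two outputs above
community_nodes = [("dog", "Animal"), ("ball", "Object")]
ollama_pkg_nodes = [("John Doe", None), ("Red Ball", None)]

only_community, only_ollama = diff_nodes(community_nodes, ollama_pkg_nodes)
print(sorted(only_community))  # -> [('ball', 'Object'), ('dog', 'Animal')]
print(sorted(only_ollama))     # -> [('John Doe', None), ('Red Ball', None)]
```

With the same document and model, both sets should be empty; here every node differs, including the hallucinated 'John Doe'.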

System Info

langchain_core '0.3.1'
langchain_community '0.3.0'
langchain_experimental '0.3.0'
langchain_ollama '0.2.0'

@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Sep 18, 2024
@csaiedu (Author)

csaiedu commented Sep 18, 2024

Looking at the Ollama debug logs, this seems related to an incorrect prompt syntax:
"[INST] You are a top-tier algorithm designed for extracting ..."
becomes
prompt="[AVAILABLE_TOOLS] [{"type":"function","function":{"name":"DynamicGraph","description":"Represents a graph document consisting of nodes and relationships.","parameters":{"type":"object","required":["nodes","relationships"],"properties":{"nodes":{"type":"","description":"List of nodes"},"relationships":{"type":"","description":"List of relationships"}}}}}][/AVAILABLE_TOOLS][INST] # Knowledge Graph Instructions for GPT-4...

There might have been changes in ChatOllama to accommodate function calling ("with structured output") that are not working correctly and are referencing instructions for GPT-4.
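One plausible explanation (a simplified, hypothetical sketch, not LangChain's actual code) is that the transformer switches prompt formats based on whether the chat model advertises tool calling: a model exposing a structured-output hook gets the schema serialized into an [AVAILABLE_TOOLS] block like the one in the log, while other models get the plain instruction prompt:

```python
import json

def build_prompt(llm, instructions: str, schema: dict) -> str:
    """Hypothetical dispatch: pick a prompt format based on model capabilities."""
    if hasattr(llm, "with_structured_output"):
        # Tool-calling path: the extraction schema is sent as a tool definition.
        tools = json.dumps([{"type": "function", "function": schema}])
        return f"[AVAILABLE_TOOLS] {tools}[/AVAILABLE_TOOLS][INST] {instructions}"
    # Plain-prompt path: instructions only, no tool block.
    return f"[INST] {instructions}"

class NewChatOllama:  # stands in for langchain_ollama.ChatOllama
    def with_structured_output(self, schema): ...

class OldChatOllama:  # stands in for the community ChatOllama
    pass

schema = {"name": "DynamicGraph", "description": "Represents a graph document."}
print(build_prompt(NewChatOllama(), "Extract a knowledge graph.", schema))
print(build_prompt(OldChatOllama(), "Extract a knowledge graph.", schema))
```

If the tool-calling path is taken but the model (or its prompt template) does not actually support it well, the extraction quality would degrade exactly as observed.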
