Accepting an array of Langchain::Messages::*
instances when calling Langchain::LLM::*#chat(messages: [...])
#629
Replies: 4 comments 12 replies
-
@kokuyouwind I created this discussion thread to talk it through, as a follow-up to #603 (comment). @drnic I think I recall you asked about something similar?!
-
@andreibondarev I think it is an excellent idea.

```ruby
message_1 = Langchain::Messages::UserMessage.new("hi!")
message_2 = Langchain::Messages::AssistantMessage.new("Hey! How can I help?")
message_3 = Langchain::Messages::UserMessage.new("Help me debug my computer")

Langchain::LLM::Anthropic.new(...).chat(messages: [message_1, message_2, message_3])
```

The class names above are aligned with the role notation, but they could also be aligned with Python's LangChain message classes, such as …
-
@kokuyouwind I also looked up what LlamaIndex does. It has a simple ChatMessage class. I think the main takeaway here is that we need a ChatMessage class, maybe with shortcuts for UserMessage, AssistantMessage, and FunctionMessage, and that the LLM classes themselves are responsible for serializing ChatMessage instances.
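A rough sketch of that shape: a plain ChatMessage base class with role-pinning shortcut subclasses, and serialization left to each LLM class. The class layout and the `{role:, content:}` hash shape are assumptions for illustration, not the final API.

```ruby
module Langchain
  module Messages
    # Base class: just a role and content, similar to LlamaIndex's ChatMessage.
    class ChatMessage
      attr_reader :role, :content

      def initialize(role:, content:)
        @role = role
        @content = content
      end
    end

    # Shortcut subclasses that pin the role, matching the role notation.
    class UserMessage < ChatMessage
      def initialize(content)
        super(role: "user", content: content)
      end
    end

    class AssistantMessage < ChatMessage
      def initialize(content)
        super(role: "assistant", content: content)
      end
    end
  end
end

# Each LLM class would serialize the instances into its own wire format,
# e.g. an array of {role:, content:} hashes (shape assumed here):
def serialize_messages(messages)
  messages.map { |m| { role: m.role, content: m.content } }
end
```

The point of keeping serialization inside the LLM classes is that providers disagree on message formats, so the shared ChatMessage stays format-agnostic.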
-
@andreibondarev
-
I'm thinking that the LLM#chat() methods would accept a messages: [] array that is a collection of Langchain::Messages::* instances. The benefit is that people won't need to know how the message hash is structured.

Proposed usage:
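A minimal, self-contained sketch of the idea, using small stand-in classes (the names are taken from this thread; the real Langchain::Messages::* API is assumed, not shown):

```ruby
# Stand-ins for Langchain::Messages::* (hypothetical, for illustration only).
UserMessage = Struct.new(:content) do
  def to_h
    { role: "user", content: content }
  end
end

AssistantMessage = Struct.new(:content) do
  def to_h
    { role: "assistant", content: content }
  end
end

# Today, callers build the provider hash structure by hand:
raw = [
  { role: "user", content: "hi!" },
  { role: "assistant", content: "Hey! How can I help?" }
]

# With message objects, the structure stays behind #to_h, so
# chat(messages: [...]) can serialize internally:
messages = [UserMessage.new("hi!"), AssistantMessage.new("Hey! How can I help?")]
messages.map(&:to_h) == raw  # => true
```

The caller never touches the hash keys, so a provider that needs a different shape only changes the serialization inside its LLM class.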