Checked other resources
I added a very descriptive title to this question.
I searched the LangChain documentation with the integrated search.
I used the GitHub search to find a similar question and didn't find it.
Commit to Help
I commit to help with one of those options 👆
Example Code
from langchain_aws import ChatBedrock

llm = ChatBedrock(
    model_id="<ARN>",
    model_kwargs=dict(temperature=0),
    region_name="<region>",
    provider="imported-model",
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

# response = llm.predict(text="Translate to French: I love programming.")
# llm.invoke(messages)
# llm.invoke(message="Hi")
Description
Error while trying to use a custom imported model in Bedrock. Calling llm.invoke(messages) raises a NotImplementedError:
Traceback (most recent call last):
File "src/llm-bedrock.py", line 18, in <module>
llm.invoke(messages)
File ".venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 286, in invoke
self.generate_prompt(
File ".venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 786, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 643, in generate
raise e
File ".venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 633, in generate
self._generate_with_cache(
File ".venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/langchain_aws/chat_models/bedrock.py", line 563, in _generate
prompt = ChatPromptAdapter.convert_messages_to_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".venv/lib/python3.11/site-packages/langchain_aws/chat_models/bedrock.py", line 362, in convert_messages_to_prompt
raise NotImplementedError(
NotImplementedError: Provider imported-model model does not support chat.
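The error comes from ChatPromptAdapter.convert_messages_to_prompt, which only knows how to render chat messages into prompts for providers it recognizes; "imported-model" is not among them. One possible workaround, sketched below, is to flatten the chat messages into a single prompt string yourself and send it through a text-completion interface instead of the chat one. Note the prompt layout ("System: ... Human: ... Assistant:") is an assumption and may need to match whatever template your imported model was trained with, and the commented-out BedrockLLM invocation requires valid AWS credentials, a real model ARN, and a region.

```python
def messages_to_prompt(messages):
    """Join (role, content) tuples into a single plain-text prompt.

    NOTE: this "Role: content" layout is an assumption -- adjust it to
    the template your imported model actually expects.
    """
    lines = [f"{role.capitalize()}: {content}" for role, content in messages]
    lines.append("Assistant:")  # cue the model to start its reply
    return "\n".join(lines)


messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

prompt = messages_to_prompt(messages)

# Hypothetical invocation (requires AWS credentials and a valid model ARN):
# from langchain_aws import BedrockLLM
# llm = BedrockLLM(
#     model_id="<ARN>",
#     provider="imported-model",
#     region_name="<region>",
#     model_kwargs={"temperature": 0},
# )
# response = llm.invoke(prompt)
```

This sidesteps the chat adapter entirely, at the cost of doing the message-to-prompt formatting by hand.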
System Info
langchain-aws==0.2.9
langchain-core==0.3.22
langsmith==0.1.147
Python 3.11