
Tongyi llm call error with model "qwen-long" #28923

Open

niuguy opened this issue Dec 25, 2024 · 0 comments

niuguy commented Dec 25, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_community.llms.tongyi import Tongyi
import os

api_key = os.getenv("DASHSCOPE_API_KEY")
tongyi = Tongyi(model="qwen-long", api_key=api_key)

response = tongyi.invoke("Who are you?")
print(response)

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "/Users/feng/Work/my/tests/qwen/api_test.py", line 33, in <module>
    response = tongyi.invoke("Who are you?")
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/feng/Work/my/tests/qwen/.venv/lib/python3.12/site-packages/langchain_core/language_models/llms.py", line 390, in invoke
    self.generate_prompt(
  File "/Users/feng/Work/my/tests/qwen/.venv/lib/python3.12/site-packages/langchain_core/language_models/llms.py", line 755, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/feng/Work/my/tests/qwen/.venv/lib/python3.12/site-packages/langchain_core/language_models/llms.py", line 950, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/feng/Work/my/tests/qwen/.venv/lib/python3.12/site-packages/langchain_core/language_models/llms.py", line 792, in _generate_helper
    raise e
  File "/Users/feng/Work/my/tests/qwen/.venv/lib/python3.12/site-packages/langchain_core/language_models/llms.py", line 779, in _generate_helper
    self._generate(
  File "/Users/feng/Work/my/tests/qwen/.venv/lib/python3.12/site-packages/langchain_community/llms/tongyi.py", line 327, in _generate
    [Generation(**self._generation_from_qwen_resp(completion))]
     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/feng/Work/my/tests/qwen/.venv/lib/python3.12/site-packages/langchain_core/load/serializable.py", line 125, in __init__
    super().__init__(*args, **kwargs)
  File "/Users/feng/Work/my/tests/qwen/.venv/lib/python3.12/site-packages/pydantic/main.py", line 214, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for Generation
text
  Input should be a valid string [type=string_type, input_value=None, input_type=NoneType]
    For further information visit https://errors.pydantic.dev/2.10/v/string_type

Description

I tried the other models listed there (qwen-max, qwen-turbo, qwen-plus) and they all work fine; only "qwen-long" fails.

I stepped into the source with a debugger and found the root cause in `_generation_from_qwen_resp` (`langchain_community/llms/tongyi.py`, see the traceback above): for "qwen-long" the API returns the generated text under "choices" instead of "text", so `resp["output"]["text"]` is None, which then fails pydantic validation when constructing `Generation`.

You can replicate and debug it like this:

from langchain_community.llms.tongyi import Tongyi
import os

original_generate = Tongyi._generate

def debug_generate(self, prompts, stop=None, run_manager=None, **kwargs):
    breakpoint()  # step into original_generate here to inspect the raw API response
    return original_generate(self, prompts, stop, run_manager, **kwargs)

# Patch the method so invoke() drops into the debugger
Tongyi._generate = debug_generate

api_key = os.getenv("DASHSCOPE_API_KEY")
tongyi = Tongyi(model="qwen-long", api_key=api_key)

response = tongyi.invoke("Who are you?")
print(response)

This issue could potentially be resolved by checking both resp["output"]["text"] and resp["output"]["choices"] for the response. However, it’s unclear if this inconsistency stems from the Tongyi server. The module’s maintainer might be able to provide clarification.
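A minimal sketch of such a fallback (the "choices" shape below is an assumption inferred from the OpenAI-compatible format, not the verified DashScope schema; `text_from_qwen_output` is a hypothetical helper, not part of langchain_community):

```python
def text_from_qwen_output(output: dict) -> str:
    """Extract the generated text from a Qwen response's "output" field.

    Most Qwen models return {"text": "..."}; per this report, "qwen-long"
    omits "text" and returns something like
    {"choices": [{"message": {"content": "..."}}]} instead.
    """
    text = output.get("text")
    if text is not None:
        return text
    choices = output.get("choices")
    if choices:
        return choices[0]["message"]["content"]
    raise ValueError(f"no generated text in output: {output!r}")

# Both response shapes now yield a string instead of None:
print(text_from_qwen_output({"text": "hello"}))
print(text_from_qwen_output({"choices": [{"message": {"content": "hello"}}]}))
```

With a guard like this, `Generation(text=...)` would always receive a string, avoiding the `string_type` validation error above.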

System Info

System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 24.1.0: Thu Oct 10 21:03:15 PDT 2024; root:xnu-11215.41.3~2/RELEASE_ARM64_T6000
> Python Version:  3.12.5 (main, Aug 14 2024, 04:32:18) [Clang 18.1.8 ]

Package Information
-------------------
> langchain_core: 0.3.28
> langchain: 0.3.13
> langchain_community: 0.3.13
> langsmith: 0.2.6
> langchain_text_splitters: 0.3.4

Optional packages not installed
-------------------------------
> langserve

Other Dependencies
------------------
> aiohttp: 3.11.11
> async-timeout: Installed. No version info available.
> dataclasses-json: 0.6.7
> httpx: 0.28.1
> httpx-sse: 0.4.0
> jsonpatch: 1.33
> langsmith-pyo3: Installed. No version info available.
> numpy: 2.2.1
> orjson: 3.10.12
> packaging: 24.2
> pydantic: 2.10.4
> pydantic-settings: 2.7.0
> PyYAML: 6.0.2
> requests: 2.32.3
> requests-toolbelt: 1.0.0
> SQLAlchemy: 2.0.36
> tenacity: 9.0.0
> typing-extensions: 4.12.2
> zstandard: Installed. No version info available.