
[BUG] Does the completions endpoint support passing a prompt template id? #5136

Open
edwardlzh opened this issue Dec 13, 2024 · 5 comments
Labels
bug Something isn't working

Comments

edwardlzh commented Dec 13, 2024

Problem Description
The kb_chat endpoint supports passing prompt_name; does the completions endpoint as well?

Steps to Reproduce
I am calling the completions endpoint in knowledge-base chat mode (with manually passed parameters). Adding prompt_name under the request root, or inside tool_input, does not appear to be supported.

Expected Result
When calling the completions endpoint in knowledge-base chat mode (manual parameters), passing prompt_name is supported and resolves to the prompt template configured in the configuration file.

Actual Result
It does not appear to take effect.

Environment Information

  • Langchain-Chatchat version / commit: 0.3.1.3
    @948024326
edwardlzh added the bug label Dec 13, 2024

dosubot bot commented Dec 13, 2024

Hi @edwardlzh! I'm Dosu, a bot that can help you fix bugs, answer questions, and become a contributor. I'll do my best to help while you wait for a human maintainer.

The completions interface in Langchain-Chatchat does support the prompt_name parameter. It is defined with a default value of "default" and is used in the completion_iterator function to retrieve the prompt template from the configuration [1]. If you are experiencing issues with it not working as expected, you might want to ensure that the prompt_name you are using is correctly configured in the configs/prompt_config.py file.
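For reference, the 0.2.x line kept its prompt templates in a `PROMPT_TEMPLATES` dict inside `configs/prompt_config.py`, keyed first by chat type and then by `prompt_name`. A minimal sketch of that layout (the template names and text below are illustrative, not the shipped defaults; verify against your installed version):

```python
# Hypothetical sketch of configs/prompt_config.py (0.2.x-style layout).
# "{{ context }}" and "{{ question }}" are placeholders the server fills
# in at request time.
PROMPT_TEMPLATES = {
    "knowledge_base_chat": {
        # prompt_name -> template text
        "default": (
            "Answer the question using the known information.\n"
            "Context: {{ context }}\n"
            "Question: {{ question }}"
        ),
        "my_custom": (
            "Answer strictly from the context below.\n"
            "{{ context }}\n"
            "Q: {{ question }}"
        ),
    },
}

# The server would resolve a template roughly like this:
template = PROMPT_TEMPLATES["knowledge_base_chat"]["my_custom"]
```

A request that passes `prompt_name="my_custom"` would then pick up the second entry; an unknown name falls back to (or errors on) the lookup, depending on the server version.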

To continue talking to Dosu, mention @dosu.



edwardlzh (Author) commented:


@dosu

  • How should the completions request be parameterized, and what configuration is needed, to make this work?
  • When multiple knowledge bases use different prompt templates, how should the parameters and configuration be set up?


dosubot bot commented Dec 13, 2024

To configure and pass parameters in the completions request for different prompt templates and multiple knowledge bases in Langchain-Chatchat, you can use the /knowledge_base/chat/completions API. Here's how you can set it up:

  1. Specify the prompt_name Parameter: Use the prompt_name parameter to select the desired prompt template. Ensure that the prompt_name you are using is correctly configured in the configs/prompt_config.py file.

  2. Set the mode Parameter: Choose the retrieval mode using the mode parameter. For example, use "local_kb" for local knowledge bases.

  3. Provide the kb_name Parameter: Specify the kb_name to indicate which knowledge base to use.

Here is an example of how to configure the request using the openai SDK:

import openai

base_url = "http://127.0.0.1:7861/knowledge_base/local_kb/samples"
data = {
    "model": "qwen2-instruct",
    "messages": [
        {"role": "user", "content": "你好"},
        {"role": "assistant", "content": "你好,我是人工智能大模型"},
        {"role": "user", "content": "如何高质量提问?"},
    ],
    "stream": True,
    "temperature": 0.7,
    "extra_body": {
        "top_k": 3,
        "score_threshold": 2.0,
        "return_direct": True,
        "prompt_name": "your_prompt_template_name",  # Specify your prompt template name here
        "mode": "local_kb",  # Specify the retrieval mode
        "kb_name": "your_knowledge_base_name",  # Specify the knowledge base name
    },
}

client = openai.Client(base_url=base_url, api_key="EMPTY")
resp = client.chat.completions.create(**data)
for chunk in resp:
    print(chunk)

This setup allows you to pass the prompt_name and other relevant parameters to configure the request for different prompt templates and knowledge bases [1][2].
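The second question above (different prompts per knowledge base) is not directly covered by this example. One hedged approach, assuming each request targets a single `kb_name`, is to keep a small client-side mapping from knowledge base to `prompt_name` and build the `extra_body` from it (the names `product_docs` and `product_qa` below are invented for illustration):

```python
# Hypothetical client-side mapping: each knowledge base gets its own
# prompt template name; unknown bases fall back to "default".
KB_PROMPTS = {
    "samples": "default",
    "product_docs": "product_qa",  # assumed names, for illustration only
}

def build_extra_body(kb_name: str, top_k: int = 3) -> dict:
    """Build the extra_body for a completions request against one knowledge base."""
    return {
        "mode": "local_kb",
        "kb_name": kb_name,
        "top_k": top_k,
        "prompt_name": KB_PROMPTS.get(kb_name, "default"),
    }

print(build_extra_body("product_docs"))
```

The returned dict can be passed as the `extra_body` of `client.chat.completions.create(...)` exactly as in the example above, so each knowledge base is queried with its own template.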

To continue talking to Dosu, mention @dosu.


Help Dosu learn! Give it feedback: Great Response | Irrelevant Answer | Incorrect Sources | Too Verbose | Hallucination | Other

@948024326

I don't quite understand your requirement. I'd suggest trying Dify to see whether it meets your needs.


edwardlzh commented Dec 16, 2024

> I don't quite understand your requirement. I'd suggest trying Dify to see whether it meets your needs.

@948024326 Hi, here's the situation: I previously selected a prompt from the configuration file via the prompt_name parameter of the /chat/kb_chat endpoint, so my understanding is that prompt_name is a key used to look up the prompt template configured in prompt_settings.yaml. Now I have switched to the /chat/chat/completions endpoint. How can I keep using prompt_name this way (automatically matching the prompt template configured in prompt_settings.yaml by prompt_name)? The completions endpoint no longer seems to support prompt_name.
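For comparison, a minimal sketch of the /chat/kb_chat request described above, with prompt_name acting as the lookup key into the configured templates. Field names follow the thread's description of the 0.3.x API and should be verified against the server's OpenAPI docs (http://127.0.0.1:7861/docs); "samples" and "default" are placeholder values:

```python
import json

# Hypothetical /chat/kb_chat payload; prompt_name is the key used to
# look up the template configured in prompt_settings.yaml.
payload = {
    "query": "How do I ask a high-quality question?",
    "mode": "local_kb",
    "kb_name": "samples",
    "prompt_name": "default",  # lookup key into the configured templates
    "top_k": 3,
    "stream": False,
}
print(json.dumps(payload, ensure_ascii=False, indent=2))

# To actually send it (server assumed at 127.0.0.1:7861):
# import requests
# resp = requests.post("http://127.0.0.1:7861/chat/kb_chat", json=payload)
```

If the completions endpoint truly drops prompt_name, staying on /chat/kb_chat with a payload like this is the straightforward way to keep the template-by-name behavior.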
