
Request: add an Ollama local-model option to the model provider setting in [Settings] #5370

Open
bingshan2024 opened this issue Sep 6, 2024 · 3 comments
Labels: enhancement (New feature or request)

Comments

@bingshan2024

🥰 Feature description

Please add an "Ollama local model" option to the model provider setting in [Settings].
The use case is a local area network with no Internet access. Suppose computer A runs Ollama and has downloaded some local models.
A user on computer A can select the Ollama local-model provider, set the endpoint URL to http://localhost:11434, and chat directly.
Other users on the LAN can install the NextChat client on their own machines, set the endpoint URL to http://<computer A's IP>:11434, and chat the same way.
After a user selects computer A's Ollama provider and endpoint,
the model dropdown should list the names of the local models installed on computer A (alternatively, it would also be fine for an administrator of the NextChat deployment on computer A to force a single model for all users).
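Ollama serves a documented HTTP API on port 11434, and the installed models can be listed with GET /api/tags. A minimal TypeScript sketch of how a client could populate the model dropdown from that endpoint (the `listOllamaModels` helper and the trimmed `OllamaModel` interface are hypothetical names, not NextChat code):

```ts
// Sketch: list the models installed on an Ollama server so a client
// (e.g. the Settings model dropdown) can display their names.
// GET /api/tags is Ollama's documented endpoint for installed models;
// the helper name and trimmed response shape are hypothetical.

interface OllamaModel {
  name: string;        // e.g. "llama3:8b"
  modified_at: string; // timestamp of the last modification
  size: number;        // model size in bytes
}

async function listOllamaModels(baseUrl: string): Promise<string[]> {
  // baseUrl is whatever the user enters in Settings, e.g.
  // "http://localhost:11434" or "http://<computer A's IP>:11434".
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama /api/tags request failed: ${res.status}`);
  }
  const data = (await res.json()) as { models: OllamaModel[] };
  return data.models.map((m) => m.name);
}

// Usage: show the models pulled on computer A in the dropdown.
// listOllamaModels("http://192.168.1.10:11434").then(console.log);
```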

🧐 Proposed solution

Add an "Ollama local model" entry to the model provider list in [Settings] with a configurable endpoint URL (http://localhost:11434 by default, or the LAN address of the machine running Ollama). Once an endpoint is set, populate the model dropdown from the models installed on that Ollama server, optionally letting an administrator of the NextChat deployment pin a single model for all users.
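Ollama also exposes an OpenAI-compatible endpoint at /v1/chat/completions, so one plausible implementation route is to reuse an OpenAI-style client and only swap the base URL. A hedged sketch under that assumption (`sendChat` is a hypothetical helper, not NextChat's actual provider code):

```ts
// Sketch: send one chat turn to a LAN Ollama server through its
// OpenAI-compatible endpoint (POST /v1/chat/completions).
// `sendChat` is a hypothetical helper, not NextChat's provider code.

async function sendChat(
  baseUrl: string,
  model: string,
  prompt: string,
): Promise<string> {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model, // a name returned by /api/tags, e.g. "llama3:8b"
      messages: [{ role: "user", content: prompt }],
      stream: false, // non-streaming keeps the sketch short
    }),
  });
  if (!res.ok) {
    throw new Error(`Ollama chat request failed: ${res.status}`);
  }
  const data = await res.json();
  return data.choices[0].message.content as string;
}

// sendChat("http://<computer A's IP>:11434", "llama3:8b", "Hello");
```

For the multi-user LAN scenario, note that Ollama on computer A must be configured to listen on all interfaces rather than only localhost (via the OLLAMA_HOST environment variable, e.g. OLLAMA_HOST=0.0.0.0) and, for browser-based clients, to allow the relevant origins (OLLAMA_ORIGINS); otherwise requests to http://<computer A's IP>:11434 will fail.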

📝 Additional information

No response

bingshan2024 added the enhancement (New feature or request) label on Sep 6, 2024
@nextchat-manager

Please follow the issue template to update title and description of your issue.


@williamwa

That's a good idea!
