
[FEATURE] Allow Config of Local LLM to use with AI Chat feature #653

Open
moringaman opened this issue Oct 18, 2024 · 1 comment
@moringaman

I love the fact that we can now use an OpenAI API key with the new AI Chat feature, but it would also be great if you could add support for an LLM hosted locally. Like many developers, I use tools such as Ollama to run various models locally. This would be a great cost-saving alternative and would let us use models that have been trained on our own codebases.
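For context, a minimal sketch of what this could look like: Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1` by default, so an existing OpenAI integration could in principle just be pointed at a configurable base URL. The helper names, model name (`llama3`), and base URL below are illustrative assumptions, not RunJS code:

```javascript
// Hypothetical sketch: routing an OpenAI-style chat request to a local
// Ollama server instead of api.openai.com. Ollama exposes an
// OpenAI-compatible endpoint at http://localhost:11434/v1 by default.
// The baseUrl and model name are assumptions for illustration only.

const buildChatRequest = (baseUrl, model, messages) => ({
  url: `${baseUrl}/chat/completions`,
  options: {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Ollama ignores the API key, but OpenAI-style clients expect one.
      Authorization: "Bearer ollama",
    },
    body: JSON.stringify({ model, messages }),
  },
});

// Sending the request (requires a running local Ollama instance):
async function chatWithLocalLLM(prompt) {
  const { url, options } = buildChatRequest(
    "http://localhost:11434/v1",
    "llama3",
    [{ role: "user", content: prompt }]
  );
  const res = await fetch(url, options);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

In other words, a "base URL" setting alongside the existing API key field would likely cover the local-LLM case without a separate integration.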

I also wouldn't have thought this would be too big a lift for the RunJS team.

Thanks :-)

@lukehaas
Owner

lukehaas commented Oct 19, 2024

Thanks @moringaman. I'd like to provide local LLM support at some point.

I also wouldn't have thought this would be too big a lift for the RunJS team.

This might be true if there were a RunJS team 😅
