
Can we use a local LLM? #156

Closed
amintt2 opened this issue Oct 9, 2023 · 1 comment

Comments


amintt2 commented Oct 9, 2023

One that runs locally on a computer, with Code Llama 2?

@Alphamasterliu
Contributor

Hello! For configuring other models, you may want to try PR #53 (linked here: #53). While our team hasn't fully tested it due to time constraints, it has received positive feedback on its effectiveness. Thank you very much for your support and attention to ChatDev. Please feel free to raise any questions or ideas, and we'll do our best to address them and keep improving the product.
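
For reference, a common way to run a ChatDev-style agent against a local model such as Code Llama is to serve the model behind an OpenAI-compatible endpoint (e.g. llama.cpp's server, LM Studio, or vLLM) and point the OpenAI client at it. The sketch below is an assumption, not the confirmed mechanism of PR #53: the local URL, model name, and placeholder API key are all illustrative, and it uses the pre-1.0 `openai` Python library that ChatDev depended on at the time.

```python
# Minimal sketch (not ChatDev's confirmed configuration): redirect the
# OpenAI client to a hypothetical local OpenAI-compatible server that is
# serving Code Llama, then issue a normal chat completion request.
import os
import openai

# Hypothetical local endpoint; llama.cpp's server, LM Studio, and vLLM
# all expose an OpenAI-compatible API at a URL like this.
openai.api_base = os.environ.get("OPENAI_API_BASE", "http://localhost:8000/v1")
# Local servers typically ignore the key, but the client requires one.
openai.api_key = os.environ.get("OPENAI_API_KEY", "not-needed-for-local")

response = openai.ChatCompletion.create(
    model="codellama",  # whatever name the local server registered the model under
    messages=[{"role": "user", "content": "Write a hello-world in Python."}],
)
print(response["choices"][0]["message"]["content"])
```

If the local server honors this interface, no further code changes are needed; the rest of the application keeps calling the OpenAI client as before.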
