Issues: alexrozanski/LlamaChat

Issues list

llama3 support
#46 opened Apr 23, 2024 by pentateu
Llamachat is spouting gibberish
#45 opened Nov 16, 2023 by jdblack
Broken link in README
#44 opened Nov 15, 2023 by mdznr
Ollama support
#43 opened Oct 8, 2023 by aptonline
support for amd gpu (macos)
#42 opened Sep 4, 2023 by sukualam
[Feature Request] Support InternLM
#40 opened Aug 29, 2023 by vansinhu
installation environment problem
#37 opened Aug 14, 2023 by p5ydn0
Error using pth format model
#36 opened Jul 30, 2023 by realalexsun
Support ggmlv3
#34 opened Jul 19, 2023 by jingsam
Add iOS support ?
#33 opened Jul 13, 2023 by realcarlos
Support for Open LLaMa? (label: models)
#32 opened Jun 22, 2023 by KnowledgeGarden (milestone: v2.0)
Support for Falcon (labels: future, models)
#31 opened Jun 11, 2023 by alelordelo
Feature Request: detect models in folder (and subfolders) (labels: enhancement, future)
#27 opened Apr 25, 2023 by mkellerman (milestone: Future)
Failed to load model for eachadea/ggml-vicuna-7b-1.1 (label: bug)
#15 opened Apr 16, 2023 by fakechris (milestone: v2.0)
Support downloadable models (labels: enhancement, ux-improvements)
#6 opened Apr 12, 2023 by umaar (milestone: v2.0)