Issues: sobelio/llm-chain


Issues list

Is there a way to run GGUF models?
#295 opened Jun 23, 2024 by tirithen
Add dev container to repo
#290 opened May 5, 2024 by dfberry
Pre-compiled llama.cpp
#263 opened Jan 26, 2024 by CMorrison82z
support LCEL in langchain?
#253 opened Jan 7, 2024 by npuichigo
add support for Mistral using TGI / vllm / candle (labels: enhancement, good first issue, help wanted)
#225 opened Oct 21, 2023 by pabl-o-ce
LLM Memory
#214 opened Sep 5, 2023 by ragyabraham
Access intermediary step results
#168 opened Jun 29, 2023 by b0xtch
Implement ReACt agent (label: enhancement)
#151 opened Jun 6, 2023 by Pablo1785
Add support for llm-mpt (label: help wanted)
#144 opened Jun 1, 2023 by panosAthDBX
Add usage info to output
#109 opened May 2, 2023 by williamhogman