Issues: ModelTC/lightllm

Issues list

[BUG] Cannot launch server after building from source (label: bug)
#679 opened Dec 24, 2024 by MisterBrookT
Serving VLM VILA
#644 opened Dec 4, 2024 by anhnhust
Support encoder-only models (label: bug)
#573 opened Oct 20, 2024 by EvanSong77
CPU Inference
#563 opened Oct 13, 2024 by JocelynPanPan
Please support minicpmv2.5 (label: bug)
#482 opened Aug 1, 2024 by LDLINGLINGLING
Is there an inference entry point other than HTTP? (label: bug)
#466 opened Jul 15, 2024 by mmdbhs
[Feature]: Support for InternVL-Chat-V1-5 (label: bug)
#462 opened Jul 10, 2024 by JingofXin
Add support for Florence-2 (label: bug)
#456 opened Jul 5, 2024 by KaifAhmad1
[BUG] Question about Qwen models with weight quantization (label: bug)
#408 opened May 15, 2024 by Cesilina
[BUG] There is already a lightllm package on PyPI (label: bug)
#380 opened Mar 26, 2024 by rlippmann
[BUG] stop_words (label: bug)
#326 opened Feb 2, 2024 by baisechundu
[BUG] Support for DeepSeek? (label: bug)
#325 opened Feb 2, 2024 by suhjohn