
feat: implement inference server by using vllm #624

Merged 7 commits on Oct 24, 2024

fix 960e18a
Codecov / codecov/project failed Oct 23, 2024 in 0s

52.85% (-5.34%) compared to 5c30038


Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 52.85%. Comparing base (5c30038) to head (960e18a).
Report is 42 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #624      +/-   ##
==========================================
- Coverage   58.18%   52.85%   -5.34%     
==========================================
  Files          30       34       +4     
  Lines        2987     4221    +1234     
==========================================
+ Hits         1738     2231     +493     
- Misses       1149     1870     +721     
- Partials      100      120      +20     
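The headline percentages in the diff above follow directly from the Hits and Lines rows (coverage = Hits / Lines). As a small sketch, assuming Codecov truncates rather than rounds to two decimal places (which matches the figures shown):

```python
import math

def coverage_pct(hits: int, lines: int) -> float:
    """Coverage as a percentage, truncated to two decimal places."""
    return math.floor(10_000 * hits / lines) / 100

# Base commit 5c30038: 1738 hits over 2987 lines -> 58.18%
base = coverage_pct(1738, 2987)
# Head commit 960e18a: 2231 hits over 4221 lines -> 52.85%
head = coverage_pct(2231, 4221)
print(base, head)
```

Note that base minus head computed this way gives 5.33 rather than the reported 5.34; since the report is 42 commits behind head on main, Codecov's own base snapshot likely differs slightly from the table's base column.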
