feat: implement inference server by using vllm #3090

Triggered via pull request on October 23, 2024 at 23:34
Status: Success
Total duration: 10m 26s

Workflow: tests.yml
Trigger: pull_request
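
The run page names only the workflow file and its trigger. Below is a minimal sketch of what tests.yml might contain, assuming a standard GitHub Actions layout; the pull_request trigger and the unit-tests environment name come from this run page, while the runner, checkout step, and test command are illustrative assumptions.

```yaml
# Hypothetical sketch of tests.yml; not the repository's actual file.
# Only the pull_request trigger and the unit-tests environment name are
# taken from the run page. Every step below is an assumption.
name: tests

on: pull_request

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    # Targeting a protected environment is what surfaces the
    # "Deployment protection rules" approval gate on this run.
    environment: unit-tests
    steps:
      - uses: actions/checkout@v4
      - name: Run unit tests
        run: make test  # assumed test entrypoint
```

In a layout like this, the environment's required reviewers gate the job before it starts, which matches the approval recorded below (zhuangqh's review of the unit-tests environment).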

Deployment protection rules

Reviewers, timers, and other rules protecting deployments in this run:

Event: zhuangqh approved Oct 23, 2024
Environments: unit-tests
Comment: (none)