Add offline inference with vllm backend #104

Re-run triggered: June 11, 2024 07:48
Status: Failure
Total duration: 3m 47s
test.yml

on: pull_request

Annotations

5 errors and 2 warnings

Errors (job: test):
- RPC failed; curl 16 Error in the HTTP2 framing layer
- error reading section header 'acknowledgments'
- unable to access 'https://github.com/FlagOpen/FlagScale/': HTTP/2 stream 1 was not closed cleanly before end of the underlying stream
- unable to access 'https://github.com/FlagOpen/FlagScale/': Failed to connect to github.com port 443 after 130761 ms: Connection timed out
- The process '/usr/bin/git' failed with exit code 128
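The errors above all trace back to the same root cause: `git` failing to fetch the repository over HTTP/2 (the framing-layer error and the truncated pack, reported as a bad section header, are symptoms; exit code 128 is the resulting checkout failure). A common mitigation is to force git onto HTTP/1.1 before the checkout step. A minimal sketch of such a step for `test.yml` — the step name is illustrative, and `http.version` / `http.postBuffer` are standard git config keys:

```yaml
# Hypothetical pre-checkout step: disable HTTP/2 for git transfers to avoid
# "HTTP2 framing layer" and "stream was not closed cleanly" clone failures.
- name: Work around flaky HTTP/2 clones
  run: |
    git config --global http.version HTTP/1.1    # fall back to HTTP/1.1 for all git HTTP traffic
    git config --global http.postBuffer 524288000  # 500 MB buffer, reduces truncated-transfer errors
```

This does not address the port-443 connection timeout, which looks like transient network trouble on the runner; re-running the job is usually sufficient for that one.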
Warnings (job: test):
- Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v2. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/
- The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
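Both warnings point at the same fix: bump `actions/checkout` from v2 to a current major release, which runs on Node.js 20. A sketch of the relevant `test.yml` fragment — job layout and runner label are illustrative, only the `uses:` line is the actual change:

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      # actions/checkout@v4 runs on Node.js 20, clearing both deprecation warnings
      - uses: actions/checkout@v4
```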