Optimum neuron LLM inference cache builder #72

Triggered via: schedule, November 10, 2024, 00:26
Status: Failure
Total duration: 3h 43m 25s
Matrix: Create optimum-neuron inference cache

Annotations: 2 errors and 1 warning
Error: Create optimum-neuron inference cache (mistral)
  Process completed with exit code 1.

Error: Create optimum-neuron inference cache (mixtral)
  The self-hosted runner aws-inf2-48xlarge-use1-public-80-xgb8v-runner-zmzcm lost communication with the server. Verify the machine is running and has a healthy network connection. Anything in your workflow that terminates the runner process, starves it for CPU/memory, or blocks its network access can cause this error.

Warning: Create optimum-neuron inference cache (mistral)
  This job failure may be caused by an out-of-date self-hosted runner. The runner is currently on version 2.319.1; please update to the latest version, 2.320.0.