Add flash attention support for FP8 Mistral #4437
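For context, a minimal sketch of how the new flash attention path might be exercised through optimum.habana. This is not the PR's code: it assumes the `use_flash_attention` and `flash_attention_recompute` generation flags that other Gaudi-optimized decoder models in optimum-habana expose, and it omits the FP8 quantization step (handled separately, e.g. via Intel Neural Compressor in the optimum-habana text-generation examples):

```python
# Hypothetical usage sketch, assuming optimum-habana's standard Gaudi
# generation flags apply to Mistral after this PR; FP8 setup is omitted.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from optimum.habana.transformers.modeling_utils import adapt_transformers_to_gaudi

# Patch transformers so Mistral uses the Gaudi-optimized attention implementation.
adapt_transformers_to_gaudi()

model_id = "mistralai/Mistral-7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model = model.eval().to("hpu")

inputs = tokenizer("Habana Gaudi is", return_tensors="pt").to("hpu")
outputs = model.generate(
    **inputs,
    max_new_tokens=32,
    use_flash_attention=True,        # route attention through the fused SDPA (flash attention) kernel
    flash_attention_recompute=True,  # assumed flag: trade extra compute for lower memory use
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```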
CI run (fast_tests.yml, triggered on: pull_request):
- Run tests for optimum.habana.transformers: 3m 8s
- Run tests for optimum.habana.diffusers: 26m 39s
- Annotations: 4 warnings