Add flash attention support for FP8 Mistral #4440
Workflow: fast_tests.yml (triggered on: pull_request)

- Run tests for optimum.habana.transformers (3m 9s)
- Run tests for optimum.habana.diffusers (24m 58s)

Annotations: 4 warnings
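
The jobs above come from a fast_tests.yml workflow that fires on pull requests. A minimal sketch of what such a workflow could look like is below; the job names, runner label, and test commands are assumptions for illustration, not taken from the actual optimum-habana repository.

```yaml
# Hypothetical sketch of fast_tests.yml; runner labels and run commands
# are assumptions, not the repository's real configuration.
name: Fast tests

on:
  pull_request:

jobs:
  transformers:
    name: Run tests for optimum.habana.transformers
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run transformers fast tests
        run: make fast_tests  # hypothetical make target

  diffusers:
    name: Run tests for optimum.habana.diffusers
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run diffusers fast tests
        run: make fast_tests_diffusers  # hypothetical make target
```

The two jobs run in parallel, which matches the separate per-suite durations reported above.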