Add flash attention support for FP8 Mistral #4440
CI check: Run tests for optimum.habana.transformers — succeeded Jul 26, 2024 in 3m 9s