
`from flash_attn.flash_attention import FlashAttention`: where is FlashAttention? Why can't I import it? #64

Open
yaoyaoleY opened this issue Jun 16, 2024 · 5 comments

Comments

@yaoyaoleY commented Jun 16, 2024

@RICKand-MORTY

flash_attn is a package. Use `pip install flash_attn==0.2.8` to install it.
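
For context: that import path exists only in the 0.x releases, and flash-attn 2.x reorganized the modules, which is why the import fails on newer installs. A minimal version-tolerant sketch (assuming the documented 2.x entry point `flash_attn_func`):

```python
# Hedged sketch: the public module layout changed between flash-attn releases.
try:
    # flash-attn 0.2.x exposes a module-style class:
    from flash_attn.flash_attention import FlashAttention
except ImportError:
    # flash-attn 2.x removed that module; the functional entry point is:
    from flash_attn import flash_attn_func
```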

@yaoyaoleY (Author)

Thanks! But when I install it, I run into a problem:

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash_attn
Running setup.py clean for flash_attn
Failed to build flash_attn
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects

How can I solve it?

@RICKand-MORTY

Maybe your environment is not suitable. Here is mine:
python: 3.11.5
torch: 2.0.0
torchvision: 0.15.1
flash-attn: 0.2.8
If you still have problems installing flash_attn, try installing it manually; see https://github.com/Dao-AILab/flash-attention
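
Before retrying the install, it can help to confirm the build toolchain is present, since building the wheel compiles CUDA extensions. A minimal diagnostic sketch (the version numbers in the comments mirror the environment reported above, one known-working combination rather than hard requirements):

```python
# Hedged sketch: sanity-check the toolchain flash-attn's wheel build relies on.
import shutil
import sys

import torch  # flash-attn compiles against the already-installed torch

print("python:", sys.version.split()[0])      # reported working: 3.11.5
print("torch:", torch.__version__)            # reported working: 2.0.0
print("CUDA available:", torch.cuda.is_available())

# The wheel build invokes nvcc, so it must be on PATH; a missing CUDA
# toolkit is a common cause of "Failed building wheel for flash_attn".
print("nvcc:", shutil.which("nvcc") or "NOT FOUND - install the CUDA toolkit first")
```

For newer releases, the upstream README also documents installing `ninja` first and running `pip install flash-attn --no-build-isolation`, so the build uses the torch already in your environment instead of an isolated one.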

@yaoyaoleY (Author)

Thank you very much.

@RICKand-MORTY

Environment setup is a bit troublesome; I ran into this before: #57
