0a18450
add ability to use flash attention if using pytorch 2.0, thanks to @c…
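PyTorch 2.0 added `torch.nn.functional.scaled_dot_product_attention`, which can dispatch to a fused flash-attention kernel on supported hardware. A minimal sketch of the pattern such a commit typically introduces: detect the function at import time and fall back to a naive implementation on older PyTorch versions. The `attend` helper name here is illustrative, not taken from the repository.

```python
import torch
import torch.nn.functional as F

# PyTorch >= 2.0 exposes a fused attention op; older versions do not.
HAS_SDPA = hasattr(F, "scaled_dot_product_attention")

def attend(q, k, v):
    """Attention over (batch, heads, seq, dim) tensors."""
    if HAS_SDPA:
        # Fused path: may use a flash-attention kernel where available.
        return F.scaled_dot_product_attention(q, k, v)
    # Naive fallback: softmax(Q K^T / sqrt(d)) V
    scale = q.shape[-1] ** -0.5
    attn = (q @ k.transpose(-2, -1) * scale).softmax(dim=-1)
    return attn @ v
```

Both paths return the same result up to floating-point tolerance, so the flag only changes which kernel runs, not the model's output.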