Does AIT support BF16 inference now? #899
@sanbuphy Yes, bfloat16 is now supported: see https://github.com/facebookincubator/AITemplate/blame/2f61d70c4b14d59f557c1fd9a98cbb5030307f9d/python/aitemplate/compiler/dtype.py#L28
Thanks! Does that mean I can flip a switch and have the whole model run in bf16?
Yes, in theory that should work. I imagine some ops may not have implementations for some of the data types, though. Let us know if you notice any hiccups.
Thanks, cool! Let me give it a try.
Hi, I would like to take a model trained in BF16 to high-performance inference. The model can only run in bf16; if I switch it to fp16, the values fall outside the normal precision range. I searched and found this issue: #110
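A short sketch (not AITemplate code) of why a bf16-trained model can break under fp16: fp16 has only 5 exponent bits, so its largest finite value is 65504, while bf16 keeps fp32's 8 exponent bits and covers the same dynamic range. The `to_bf16` helper below is an illustrative simulation that truncates a float32's low mantissa bits; it is not an AITemplate API.

```python
import numpy as np

def to_bf16(x: float) -> np.float32:
    """Simulate bfloat16 by zeroing the low 16 bits of a float32
    (plain truncation, not round-to-nearest; fine for a range demo)."""
    bits = np.float32(x).view(np.uint32)
    return (bits & np.uint32(0xFFFF0000)).view(np.float32)

activation = 70000.0                # above fp16's max finite value of 65504
print(np.float16(activation))      # overflows to inf in fp16
print(to_bf16(activation))         # stays finite in simulated bf16
```

This is why casting such a model to fp16 produces infs/NaNs in large intermediate activations even though the same values are representable in bf16.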