Bitwise operations are very common, and most deep learning frameworks support these operations. However, Burn seems to lack support for them.
If bitwise operations were available, I could easily implement NumPy's `packbits` and `unpackbits` functions in Burn. These functions would let me save 87.5% of the space when storing tensors in one-hot or multi-hot representations (8 boolean values packed into each byte, so only 1/8 of the original storage is needed).
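To illustrate the requested semantics, here is a minimal standalone Rust sketch of NumPy-style `packbits`/`unpackbits` on plain byte slices, using big-endian bit order (NumPy's default). It deliberately does not use Burn, since the missing bitwise ops are the point; the function names mirror NumPy's and are otherwise hypothetical.

```rust
/// Pack a slice of 0/1 values into bytes, 8 values per byte, MSB first
/// (mirrors NumPy's `packbits` with the default bit order).
fn packbits(bits: &[u8]) -> Vec<u8> {
    bits.chunks(8)
        .map(|chunk| {
            chunk
                .iter()
                .enumerate()
                .fold(0u8, |acc, (i, &b)| acc | ((b & 1) << (7 - i)))
        })
        .collect()
}

/// Unpack bytes back into 0/1 values, MSB first, keeping only the
/// first `n` values (the trailing pad bits are discarded).
fn unpackbits(packed: &[u8], n: usize) -> Vec<u8> {
    packed
        .iter()
        .flat_map(|&byte| (0..8).map(move |i| (byte >> (7 - i)) & 1))
        .take(n)
        .collect()
}

fn main() {
    // Two one-hot rows of length 5, flattened: 10 booleans fit in 2 bytes.
    let one_hot = [0u8, 1, 0, 0, 0, 0, 0, 0, 1, 0];
    let packed = packbits(&one_hot);
    assert_eq!(packed, vec![0b0100_0000, 0b1000_0000]);
    assert_eq!(unpackbits(&packed, one_hot.len()), one_hot);
    println!("{:?}", packed); // [64, 128]
}
```

In a tensor library the same packing would be a shift-and-OR reduction over groups of 8 elements, which is exactly what the missing bitwise operations would enable.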