Hello Community, I have a neural network and I want to convert each layer to its Brevitas version. Is there a list of the Brevitas layers? I could not find one on the website or on GitHub. Additionally, could you point me to any source that explains how to replace each layer with its Brevitas counterpart? I think the website is not enough. Best Regards
-
You are right, we need to work on expanding that part of our documentation.
Regarding what layers we support: they live in the brevitas/nn folder, and in general we cover all of the most commonly used PyTorch layers, including RNN and LSTM.
I will open an issue to add a full list of supported layers to our documentation.
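In the meantime, here is a minimal sketch of what the drop-in replacements from brevitas.nn look like; the 4-bit widths below are illustrative assumptions, not recommendations:

```python
import brevitas.nn as qnn

# Quantized counterparts of common PyTorch layers live in brevitas.nn,
# typically named Quant<LayerName>. They accept the same arguments as the
# float layers, plus quantization-specific keyword arguments.
conv = qnn.QuantConv2d(3, 16, kernel_size=3, padding=1, weight_bit_width=4)  # nn.Conv2d
relu = qnn.QuantReLU(bit_width=4)                                            # nn.ReLU
fc   = qnn.QuantLinear(256, 10, bias=True, weight_bit_width=4)               # nn.Linear
lstm = qnn.QuantLSTM(input_size=32, hidden_size=64)                          # nn.LSTM
```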
Regarding how to convert floating point layers to quantized ones: Brevitas offers the possibility of automatically converting a floating point model to its quantized counterpart, preserving all the weights and the structure.
You can take a look at https://github.com/Xilinx/brevitas/tree/master/src/brevitas_examples/imagenet_classification/ptq/ptq_evaluate.py to see how we do that for ImageNet models. Another option would be to manually recreate the network, but you would need to preserve the module hierarchy so that you can re-load the state dict from the floating point model. Feel free to reach out in case more help is needed.
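To make the second option concrete, here is a minimal sketch (with a hypothetical two-layer model and illustrative 4-bit widths) of how matching the attribute names lets you re-load the float state dict:

```python
import torch.nn as nn
import brevitas.nn as qnn

class FloatNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        self.relu1 = nn.ReLU()
        self.fc = nn.Linear(8 * 28 * 28, 10)

    def forward(self, x):
        x = self.relu1(self.conv1(x))
        return self.fc(x.flatten(1))

class QuantNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Same attribute names and shapes as FloatNet, so the state dict keys line up.
        self.conv1 = qnn.QuantConv2d(1, 8, kernel_size=3, padding=1, weight_bit_width=4)
        self.relu1 = qnn.QuantReLU(bit_width=4)
        self.fc = qnn.QuantLinear(8 * 28 * 28, 10, bias=True, weight_bit_width=4)

    def forward(self, x):
        x = self.relu1(self.conv1(x))
        return self.fc(x.flatten(1))

float_model = FloatNet()  # in practice, load your pretrained float weights here
quant_model = QuantNet()
# strict=False because the quantized layers carry extra quantization state
# that has no counterpart in the float checkpoint.
quant_model.load_state_dict(float_model.state_dict(), strict=False)
```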