Replies: 2 comments
-
Dear all, I am also interested in this topic. How does QAT work here? Is quantization applied in both the forward and backward passes, as in PyTorch, or only in the forward pass? Kind regards,
-
Yes, we use the Straight-Through Estimator (STE) during QAT.
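For context, the STE idea can be sketched in a few lines of PyTorch. This is an illustrative minimal example, not Brevitas' actual implementation: the non-differentiable rounding is applied in the forward pass, while the backward pass treats it as the identity and passes gradients straight through.

```python
import torch

class RoundSTE(torch.autograd.Function):
    """Minimal Straight-Through Estimator for rounding (illustrative sketch)."""

    @staticmethod
    def forward(ctx, x):
        # Forward: apply the non-differentiable quantization step (rounding).
        return torch.round(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Backward: pretend rounding was the identity and pass the
        # gradient through unchanged -- the "straight-through" part.
        return grad_output

x = torch.tensor([0.3, 1.7, -2.4], requires_grad=True)
y = RoundSTE.apply(x)
y.sum().backward()
print(y)       # tensor([ 0.,  2., -2.])
print(x.grad)  # tensor([1., 1., 1.]) -- gradients flow through the rounding
```

Real QAT quantizers additionally handle scaling, zero-points, and clipping, but the gradient trick is the same: quantize in the forward pass, use the STE (or a variant) in the backward pass.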
-
Dear all,
I can't find any mention of how Brevitas layers are trained. Does the model use a Straight-Through Estimator (STE) when it encounters a quantization layer in its backward pass?