Currently, if A and B are inputs, we always compute the gradient with respect to both A and B in the backward pass, regardless of whether the caller requested it.
As shown in the extending PyTorch example
https://pytorch.org/docs/stable/notes/extending.html#example
we should make use of the pattern involving `ctx.needs_input_grad`
to avoid unnecessary computation.
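
A minimal sketch of that pattern, assuming a matmul-style op with inputs `A` and `B` (names are illustrative, not the actual op in question):

```python
import torch

class MatMulFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, A, B):
        ctx.save_for_backward(A, B)
        return A.mm(B)

    @staticmethod
    def backward(ctx, grad_output):
        A, B = ctx.saved_tensors
        grad_A = grad_B = None
        # Only compute the gradients the caller actually asked for;
        # ctx.needs_input_grad is a tuple of booleans, one per forward input.
        if ctx.needs_input_grad[0]:
            grad_A = grad_output.mm(B.t())
        if ctx.needs_input_grad[1]:
            grad_B = A.t().mm(grad_output)
        # Returning None for an unneeded gradient is fine; autograd ignores it.
        return grad_A, grad_B
```

With this check in place, e.g. `A.requires_grad=False` skips the `grad_A` matmul entirely instead of computing and discarding it.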