`deepcopy` on `Tensor` will not copy the `raw_tensor`:
```python
def __getstate__(self):
    d = {k: getattr(self, k) for k in self.__slots__}
    d["_raw_tensor"] = None  # do not store the TF tensors
    return d

def __setstate__(self, state):
    for k, v in state.items():
        setattr(self, k, v)
```
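To see the consequence of this `__getstate__`, here is a minimal self-contained sketch (the `MiniTensor` class is a hypothetical stand-in for `Tensor`, not the real implementation): `copy.deepcopy` goes through the pickle protocol, so the raw tensor is silently dropped in the copy.

```python
import copy

class MiniTensor:
    """Hypothetical minimal stand-in mimicking Tensor's pickling behavior."""
    __slots__ = ("name", "_raw_tensor")

    def __init__(self, name, raw_tensor):
        self.name = name
        self._raw_tensor = raw_tensor

    def __getstate__(self):
        d = {k: getattr(self, k) for k in self.__slots__}
        d["_raw_tensor"] = None  # do not store the raw tensor
        return d

    def __setstate__(self, state):
        for k, v in state.items():
            setattr(self, k, v)

t = MiniTensor("x", raw_tensor=[1.0, 2.0])
t2 = copy.deepcopy(t)  # deepcopy uses __getstate__/__setstate__
print(t._raw_tensor)   # [1.0, 2.0] -- original is untouched
print(t2._raw_tensor)  # None -- the copy lost the raw tensor
```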
Why is that?
First of all, the reason should be documented, because I don't exactly remember anymore why we do it this way.
But further, this causes some problems in RF modules where we keep some auxiliary constant tensors around, e.g. `rf.PiecewiseLinear`. If there is a good reason to keep it this way, we need to think about how to solve it in `rf.PiecewiseLinear`. One solution is to just use `rf.Parameter` (with `auxiliary=True`) instead. But a developer might run into this problem again, and the error was very confusing. Namely, in the `__init__`, I was double-checking that `raw_tensor` was set, but then in `__call__`, `raw_tensor` was not set anymore, which caused `raw_backend` to be `None`, so many RF functions fail with `AttributeError: 'NoneType' object has no attribute ...`.
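A rough sketch of why the error is so opaque (class and attribute names here are hypothetical, just to illustrate the dispatch pattern): the backend is derived from the raw tensor, so once `raw_tensor` becomes `None` after deepcopy, every backend dispatch fails with the same generic `AttributeError`, far away from the deepcopy that caused it.

```python
import copy

class MiniModule:
    """Hypothetical module holding an auxiliary tensor-like attribute."""

    def __init__(self, raw_tensor):
        self.raw_tensor = raw_tensor
        assert self.raw_tensor is not None  # double check in __init__: passes

    @property
    def raw_backend(self):
        # Sketch: backend is looked up from the raw tensor type.
        return type(self.raw_tensor) if self.raw_tensor is not None else None

    def __call__(self):
        # Any backend method call fails once raw_backend is None.
        return self.raw_backend.mro()

m = MiniModule(raw_tensor=[1.0])
m()  # works fine right after __init__
m.raw_tensor = None  # this is what deepcopy effectively does
try:
    m()
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'mro'
```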
Maybe it makes sense to control the behavior (e.g. via a context scope) to switch between copying `raw_tensor` and not copying it. We should first understand the reason why we do not copy it.
(Maybe we can just make a dummy draft PR where we remove this line of code, i.e. where we do copy it, and see which tests fail...)
I assume when I wrote this, I thought mostly about `rf.Parameter`, where I thought that you never want to copy the content but instead the `ParamInit` (which could be raw values, in which case they would be copied, but usually it is some random init scheme, so only the scheme is copied, not the values).
Test case (e.g. for `test_torch_frontend.py`):