VarianceScaling Initializer Is Unseeded #218
A fix would be to change 'initializer' to a function that constructs a VarianceScaling with the appropriate parameters, so each FFN gets its own instance rather than sharing the same one.
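A minimal sketch of that fix (the `make_initializer` factory is a hypothetical name, and a plain class stands in for `keras.initializers.VarianceScaling` so the sketch is self-contained):

```python
class VarianceScaling:
    # Illustrative stand-in for keras.initializers.VarianceScaling;
    # only the parameters relevant to this sketch are modeled.
    def __init__(self, scale=1.0, mode="fan_in", seed=None):
        self.scale = scale
        self.mode = mode
        self.seed = seed

# Before: one shared instance passed to every FFN layer.
shared = VarianceScaling(scale=2.0)

# After: a factory, so each FFN constructs its own instance.
def make_initializer(seed=None):
    return VarianceScaling(scale=2.0, seed=seed)

a = make_initializer(seed=0)
b = make_initializer(seed=1)
assert a is not b                   # distinct instances, nothing shared
assert (a.seed, b.seed) == (0, 1)
```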
Creating two instances of the initializer does not get rid of the warning. It seems a seed is required to silence it, even if the seed is hardcoded to 0.
hmm - maybe there is more than one broken place?
I'm giving this a test now...
```python
def _warn_reuse(self):
    if getattr(self, "_used", False):
        if getattr(self, "seed", None) is None:
            warnings.warn(
                f"The initializer {self.__class__.__name__} is unseeded "
                "and being called multiple times, which will return "
                "identical values each time (even if the initializer is "
                "unseeded). Please update your code to provide a seed to "
                "the initializer, or avoid using the same initializer "
                "instance more than once."
            )
    else:
        self._used = True
```

It seems the source code is only checking for `seed is None`.
yes - because they assume if you are setting the seed you know what you are doing...
I also see no improvement with the fix...
oh - the ffn creates 2 Dense layers...
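That would explain it: even a per-FFN initializer instance gets called once per Dense layer, tripping the reuse check. A self-contained mimic of the `_warn_reuse` logic quoted above (assumption: `FakeInitializer` is illustrative, not Keras code) shows that only a seed silences the warning on reuse:

```python
import warnings

class FakeInitializer:
    # Illustrative stand-in reproducing the reuse check from the
    # Keras source quoted above.
    def __init__(self, seed=None):
        self.seed = seed
        self._used = False

    def __call__(self):
        if self._used:
            if self.seed is None:
                warnings.warn("initializer is unseeded and reused")
        else:
            self._used = True

def count_warnings(init, calls=2):
    # Count how many reuse warnings 'calls' invocations emit.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        for _ in range(calls):
            init()
    return len(caught)

print(count_warnings(FakeInitializer()))        # unseeded, called twice -> 1
print(count_warnings(FakeInitializer(seed=0)))  # seeded (even 0) -> 0
```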
@Tilps ran into this issue after upgrading TensorFlow.
From Tilps: I assume it is this code that the above warning references.
@masterkni6, can you look at this when you get a chance?