Accessing the current training iteration step in a custom Layer class #20261
Hi everyone,

Is there a way to access the current training iteration step when using a custom `Layer` class? Currently the only way I have found is to pass it from the `Model` class when calling the layer, like this:
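(The original snippet did not survive formatting; a minimal sketch of the idea follows, assuming Keras 3. The names `AnnealedLayer` and `MyModel` and the decay schedule are illustrative, and reading `self.optimizer.iterations` inside `call()` assumes the model has already been compiled.)

```python
import keras
from keras import layers, ops

class AnnealedLayer(layers.Layer):
    """Hypothetical layer whose output depends on the training step."""

    def call(self, inputs, step):
        # Illustrative annealing: decay a multiplier with the step.
        factor = 1.0 / (1.0 + ops.cast(step, "float32"))
        return inputs * factor

class MyModel(keras.Model):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.annealed = AnnealedLayer()

    def call(self, inputs):
        # The step has to be threaded through by hand at every call site.
        return self.annealed(inputs, step=self.optimizer.iterations)
```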
This is OK, but in my opinion it is a bit redundant if the custom layer defines other layers that need the parameter as well. It would be convenient to have the optimizer available at the layer level, for example. I think this would be useful in multi-objective scenarios, where you often want to anneal the hyperparameter that multiplies a term in the total loss function.

That is fine. Another thing you can do is make the layer keep its own local iteration counter, incremented every time the layer is called in training mode. If you want access to the optimizer, you can always create the optimizer before model construction and pass it to your custom layer. Then call `optimizer.iterations`.
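(A sketch of both suggestions, assuming Keras 3; the class names and the annealing factor are illustrative, not from this thread.)

```python
import keras
from keras import layers, ops

class StepAwareLayer(layers.Layer):
    """Keeps a local iteration counter, incremented in training mode."""

    def build(self, input_shape):
        # Non-trainable scalar variable that serves as the counter.
        self.step = self.add_weight(
            name="step",
            shape=(),
            dtype="int64",
            initializer="zeros",
            trainable=False,
        )

    def call(self, inputs, training=False):
        if training:
            # One increment per call in training mode; note this over-counts
            # if the layer is called more than once per iteration (see below).
            self.step.assign_add(1)
        factor = 1.0 / (1.0 + ops.cast(self.step, "float32"))
        return inputs * factor

class OptimizerAwareLayer(layers.Layer):
    """Takes an optimizer created before the model and reads its counter."""

    def __init__(self, optimizer, **kwargs):
        super().__init__(**kwargs)
        self.optimizer = optimizer

    def call(self, inputs):
        factor = 1.0 / (1.0 + ops.cast(self.optimizer.iterations, "float32"))
        return inputs * factor

# Usage: share the same optimizer instance with compile().
# opt = keras.optimizers.Adam()
# layer = OptimizerAwareLayer(opt)
# ... build the model with `layer`, then model.compile(optimizer=opt, ...)
```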
What if the layers are called more than once in a single training iteration? Anyway, my point is that it is fine if you only have to pass the counter to a single layer, but it can be cumbersome if there are nested layers. But I guess it is not trivial to achieve this given how `Layer` and `Optimizer` objects currently relate to each other.
Optimizers live at the model level. For a layer to be aware of the optimizer, the optimizer must be provided to the layer manually. Otherwise, place the logic that needs to be optimizer-aware at the model level (e.g. in `train_step()`).
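(A sketch of the model-level approach, assuming Keras 3 on the TensorFlow backend; the annealed auxiliary loss term is invented for illustration.)

```python
import tensorflow as tf
import keras
from keras import ops

class AnnealingModel(keras.Model):
    def train_step(self, data):
        x, y = data  # assumes `data` is an (x, y) tuple
        # The optimizer is available here, at the model level.
        step = ops.cast(self.optimizer.iterations, "float32")
        aux_weight = 1.0 / (1.0 + step)  # hypothetical annealing schedule

        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)
            loss = self.compute_loss(x=x, y=y, y_pred=y_pred)
            # Hypothetical annealed auxiliary objective.
            loss = loss + aux_weight * ops.mean(ops.square(y_pred))

        gradients = tape.gradient(loss, self.trainable_weights)
        self.optimizer.apply_gradients(zip(gradients, self.trainable_weights))
        return {"loss": loss}
```

Since `train_step()` already sees the optimizer, nothing step-related needs to be threaded through the layer call signatures.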
Ok, thanks.