I was thinking that adding the option to use a PyTorch learning rate scheduler could improve results and won't be hard to implement.
I guess one could do it like this:
import torch
from torch.optim.lr_scheduler import ExponentialLR

parameters = [p for net in nets for p in net.parameters()]  # list of parameters of all networks
MY_LEARNING_RATE = 5e-3
optimizer = torch.optim.Adam(parameters, lr=MY_LEARNING_RATE, ...)
scheduler = ExponentialLR(optimizer, gamma=0.9)
solver = Solver1D(..., nets=nets, optimizer=optimizer, lr_scheduler=scheduler)
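(For reference, a quick standalone sketch of what ExponentialLR with gamma=0.9 does, using a throwaway parameter and optimizer purely for illustration: every call to scheduler.step() multiplies the current learning rate by gamma.)

import torch
from torch.optim.lr_scheduler import ExponentialLR

# Throwaway parameter/optimizer, only to show the scheduler's effect on the lr.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.Adam([param], lr=5e-3)
scheduler = ExponentialLR(optimizer, gamma=0.9)

for epoch in range(3):
    optimizer.step()    # a real train/valid epoch would go here
    scheduler.step()    # lr <- lr * gamma
    print(epoch, scheduler.get_last_lr())  # roughly 0.0045, 0.00405, 0.003645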
And then do a scheduler step after each train and valid epoch, like this:
for local_epoch in loop:
    self.run_train_epoch()
    self.run_valid_epoch()
    if self.lr_scheduler:
        self.lr_scheduler.step()
Is this needed?
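In the meantime, a possible workaround that doesn't touch the library (assuming fit() accepts max_epochs and that repeated fit() calls resume training rather than starting over) would be to train one epoch at a time and step the scheduler in between:

# Hypothetical workaround: construct the solver as usual (without the
# proposed lr_scheduler argument) and drive the epochs from outside.
solver = Solver1D(..., nets=nets, optimizer=optimizer)
for epoch in range(1000):
    solver.fit(max_epochs=1)  # one train + valid epoch, assuming fit(max_epochs=...)
    scheduler.step()          # then decay the learning rate by gamma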