I'd like to use StepLR as the scheduler and update the learning rate with the epoch count as the step size. Because I use curriculum learning, the number of batches per epoch varies as training progresses, but in the current code the scheduler's learning-rate update is performed inside StepRunner, so I can't realize this through the workaround

torch.optim.lr_scheduler.StepLR(optimizer, step_size=epoch_step*batch_size, gamma=0.5)

Is there some way of calling the library that updates the learning rate with the epoch count as the step size, or is my only option to override the behavior of StepRunner and EpochRunner myself?
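For reference, here is a minimal plain-PyTorch sketch of the epoch-based behavior being asked about (the names make_epoch_loader, epoch_step, and num_epochs are hypothetical placeholders, not part of the library). StepLR counts calls to scheduler.step(), so calling it exactly once per epoch makes the schedule epoch-based regardless of how many batches each epoch contains:

import torch
from torch.optim.lr_scheduler import StepLR

def make_epoch_loader(epoch):
    # Stand-in for a curriculum-learning loader whose length grows with the epoch.
    n_batches = 4 + epoch
    return [(torch.randn(16, 8), torch.randn(16, 1)) for _ in range(n_batches)]

model = torch.nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
epoch_step, num_epochs = 5, 20
scheduler = StepLR(optimizer, step_size=epoch_step, gamma=0.5)  # decay interval measured in epochs

for epoch in range(num_epochs):
    for x, y in make_epoch_loader(epoch):  # variable number of batches per epoch
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()
        # deliberately no scheduler.step() here, unlike a per-batch StepRunner
    scheduler.step()  # once per epoch -> LR halves every epoch_step epochs

Mapping this onto the library means moving the lr_scheduler.step() call out of StepRunner and into the end of the epoch loop, which most likely does require overriding both StepRunner and EpochRunner as the question suggests; check the runner source of your installed version for the exact signatures before doing so.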