
Should the scheduler's learning-rate update happen in EpochRunner? #94

Open
LeoJhonSong opened this issue May 29, 2024 · 0 comments

Comments

@LeoJhonSong

I want to use StepLR as my scheduler and update the learning rate on a per-epoch schedule. Because I use curriculum learning, the number of batches per epoch is not constant as training progresses. However, in the current code the scheduler's learning-rate update is performed in StepRunner (i.e., once per batch), so the workaround torch.optim.lr_scheduler.StepLR(optimizer, step_size=epoch_step*batch_size, gamma=0.5) does not work for me. Is there a supported way to step the learning rate by epoch count, or do I have to override the behavior of StepRunner and EpochRunner myself?
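
For context, here is a minimal sketch in plain PyTorch of the behavior being asked for (the model, data, and batch counts are hypothetical placeholders, not this repo's StepRunner/EpochRunner code). StepLR counts calls to scheduler.step(), so calling it once per epoch makes step_size an epoch count, regardless of how many batches each epoch contains:

```python
import torch
from torch.optim.lr_scheduler import StepLR

# Hypothetical stand-ins for the real model and curriculum data.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

# StepLR counts scheduler.step() calls; stepping once per epoch makes
# step_size an epoch count, no matter how many batches each epoch has.
scheduler = StepLR(optimizer, step_size=5, gamma=0.5)

for epoch in range(15):
    # Curriculum learning: the batch count grows with the epoch index.
    num_batches = 10 + 2 * epoch
    for _ in range(num_batches):
        x, y = torch.randn(32, 10), torch.randn(32, 1)
        loss = loss_fn(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    scheduler.step()  # per-epoch LR update, unaffected by batch count
    print(f"epoch {epoch}: lr = {optimizer.param_groups[0]['lr']:.4f}")
```

Moving the scheduler.step() call from the per-batch runner to the per-epoch loop, as above, is what the question proposes doing inside EpochRunner instead of StepRunner.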
