
high learning rate in later stages #66

Open
coldgemini opened this issue Dec 8, 2017 · 1 comment

Comments

@coldgemini

Hi,
Why did you use a much higher (4×) learning rate from stage 2 onward and leave only the first stage with a small learning rate? If it's a kind of fine-tuning strategy, why didn't you just freeze the first stage? Is it due to some empirical results?

Thanks a lot.
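
For reference, a minimal sketch of the per-stage learning-rate split being asked about, written with a PyTorch-style optimizer and parameter groups purely for illustration (the stage modules, `base_lr`, and the optimizer choice are placeholders, not the repository's actual training code):

```python
import torch.nn as nn
import torch.optim as optim

base_lr = 1e-4  # placeholder base learning rate

# Hypothetical stand-ins for the pretrained first stage and a later refinement stage.
model = nn.ModuleDict({
    "stage1": nn.Conv2d(3, 64, 3, padding=1),
    "stage2": nn.Conv2d(64, 64, 3, padding=1),
})

optimizer = optim.SGD(
    [
        # Fine-tune the pretrained first stage with the small base learning rate.
        {"params": model["stage1"].parameters(), "lr": base_lr},
        # Train the later stages with a 4x higher learning rate instead of freezing stage 1.
        {"params": model["stage2"].parameters(), "lr": 4 * base_lr},
    ],
    lr=base_lr,
    momentum=0.9,
)
```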

@1Rookiechang

Have you downloaded the pretrained model? Can you share it with me, because the download site cannot be opened? Thank you very much!
[email protected]
