This repository has been archived by the owner on Oct 31, 2023. It is now read-only.
Hey, thanks so much for making this code public. I'm trying to use it in a project and would love to be able to continue the training process. Could you please upload a checkpoint that includes the full optimizer state? Thanks so much for your help!
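For context, the reason the released weights alone aren't enough: resuming training exactly requires the optimizer's internal state (e.g. AdamW's moment estimates) in addition to the model weights. A minimal sketch of such a checkpoint, using a toy `nn.Linear` as a stand-in for the actual encoder (names like `"model"`/`"opt"` are illustrative, not this repo's checkpoint schema):

```python
import torch

model = torch.nn.Linear(4, 2)  # stand-in for the real encoder
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)

# One training step so the optimizer accumulates state (exp_avg, exp_avg_sq, ...)
loss = model(torch.randn(8, 4)).sum()
loss.backward()
opt.step()

# A resumable checkpoint must carry BOTH state dicts, not just the weights.
ckpt = {
    "epoch": 1,
    "model": model.state_dict(),
    "opt": opt.state_dict(),  # without this, training cannot resume exactly
}
torch.save(ckpt, "/tmp/checkpoint.pth")

# Resuming: restore both states before continuing the training loop.
restored = torch.load("/tmp/checkpoint.pth")
model.load_state_dict(restored["model"])
opt.load_state_dict(restored["opt"])
print(sorted(restored.keys()))
```

If the released checkpoints were saved with only `model.state_dict()`, the optimizer state would need to be re-warmed from scratch, which is why a full checkpoint upload would help here.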
It'd also be great if the finetuned checkpoint could be released. I tried to finetune MSN ViT-B/16 with MAE's finetuning configs (as mentioned here), but it only reaches 82.2% accuracy on the IN1k validation set, as opposed to the 83.4% reported in Table 4 of the paper, and it also does worse on the distribution shifts.