This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

Include full checkpoint #12

Open
mhamilton723 opened this issue Jul 30, 2022 · 1 comment

Comments

@mhamilton723

Hey thanks so much for making this code public. I'm trying to use this in a project and would love to be able to continue the training process. Could you please upload a checkpoint with all of the optimizer state? Thanks so much for your help!
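For context, the request is for a checkpoint that bundles optimizer state alongside the model weights, which is what resuming training requires. A minimal PyTorch sketch of what such a "full" checkpoint could look like (the model, file name, and dict keys here are illustrative stand-ins, not MSN's actual checkpoint format):

```python
import torch

# Stand-in for the MSN encoder; the real model and optimizer would differ.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# One dummy step so the optimizer accumulates state (momentum buffers).
loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()

# A "full" checkpoint: weights plus optimizer state plus training progress.
checkpoint = {
    "epoch": 1,
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),  # needed to resume training exactly
}
torch.save(checkpoint, "full_checkpoint.pt")

# Resuming later: restore both the model and the optimizer.
restored = torch.load("full_checkpoint.pt")
model.load_state_dict(restored["model"])
optimizer.load_state_dict(restored["optimizer"])
```

A weights-only checkpoint loses the momentum buffers and learning-rate schedule position, so continued training from it behaves like a fresh run warm-started from the weights rather than a true resume.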

@Aaditya-Singh

It'd also be great if the finetuned checkpoint could be released. I tried to finetune MSN ViT-B/16 with MAE's finetuning configs (as mentioned here), but it only reaches 82.2% accuracy on the IN1k validation set, as opposed to the 83.4% reported in Table 4 of the paper, and it also does worse on the distribution shifts.
