
Performance with a small batch size #15

Open
SongDoHou opened this issue Sep 1, 2022 · 0 comments

@SongDoHou

Hi, thank you for providing this awesome self-supervised learning research!

I'm wondering how much the performance will decrease when we use a small batch size, e.g. between 128 and 512, for the DeiT-base model.

If we cannot use a large batch size (e.g. 1024) with the base model, is it better to use a smaller model with a large batch size?

Thanks in advance!!
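
As general context (this is not the repository's own recipe, and the thread received no replies): a common workaround when a large batch does not fit in memory is gradient accumulation combined with the linear learning-rate scaling rule. The sketch below assumes a generic PyTorch training loop; the model, data, and hyperparameters are placeholders for illustration, not this repository's actual training code.

```python
# Minimal sketch (placeholder model/data): simulating a large effective batch
# via gradient accumulation when memory only allows small per-step batches.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

micro_batch = 128                                   # per-step batch that fits in memory
accum_steps = 8                                     # 128 * 8 = 1024 effective batch
base_lr = 5e-4                                      # reference LR at batch size 1024 (assumed)
lr = base_lr * (micro_batch * accum_steps) / 1024   # linear LR scaling rule

# Toy stand-ins for the real backbone and dataset.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
data = TensorDataset(torch.randn(2048, 3, 32, 32), torch.randint(0, 10, (2048,)))
loader = DataLoader(data, batch_size=micro_batch, shuffle=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
criterion = nn.CrossEntropyLoss()

optimizer.zero_grad()
for step, (x, y) in enumerate(loader):
    loss = criterion(model(x), y) / accum_steps     # average loss over accumulated micro-batches
    loss.backward()                                 # gradients accumulate in .grad
    if (step + 1) % accum_steps == 0:
        optimizer.step()                            # one update per effective batch of 1024
        optimizer.zero_grad()
```

Note that gradient accumulation is not strictly equivalent to a true large batch for objectives that couple samples within a batch (e.g. contrastive losses or batch-norm statistics), so results may still differ from training with a real batch of 1024.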
