
Question about LabelSmoothingCrossEntropy #4

Open
luxuantao opened this issue Sep 11, 2021 · 2 comments

@luxuantao

The first half of the total loss returned by LabelSmoothingCrossEntropy is loss*self.eps/c, where c is the number of classes. I've seen formulas in which this term is divided by the number of classes minus one instead.
Could you clarify whether the one should be subtracted?
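
For context, here is a minimal sketch of the kind of implementation being discussed, following the style of fastai's LabelSmoothingCrossEntropy (the exact code in this repository may differ; `eps` and `c` match the names used above):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelSmoothingCrossEntropy(nn.Module):
    # Cross entropy with label smoothing, in the style of fastai's implementation.
    def __init__(self, eps: float = 0.1, reduction: str = 'mean'):
        super().__init__()
        self.eps, self.reduction = eps, reduction

    def forward(self, output: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        c = output.size(-1)                        # number of classes
        log_preds = F.log_softmax(output, dim=-1)
        # Smoothing term: negative sum of log-probabilities over all classes.
        loss = -log_preds.sum(dim=-1)
        if self.reduction == 'mean':
            loss = loss.mean()
        elif self.reduction == 'sum':
            loss = loss.sum()
        nll = F.nll_loss(log_preds, target, reduction=self.reduction)
        # The term in question: loss * self.eps / c, with no "minus one" in the denominator.
        return loss * self.eps / c + (1 - self.eps) * nll
```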

@MuQiuJun-AI
Owner

Some implementations do use the number of classes minus one, but I don't think the subtraction is needed, for three reasons:
1. In the original label smoothing paper, the denominator in the formula is the number of classes.
2. fastai's LabelSmoothingCrossEntropy implementation does not subtract one either.
3. In practice, whether or not you subtract one has no real effect on the final optimization; the only thing it changes is the magnitude of each batch's loss.

I prefer to stick with the fastai implementation.
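
For reference, the two conventions correspond to different smoothed target distributions (a sketch, writing $K$ for the number of classes `c`, $\epsilon$ for `self.eps`, and $y$ for the true class):

$$
q'(k) = (1-\epsilon)\,\mathbf{1}[k=y] + \frac{\epsilon}{K}
\quad\text{vs.}\quad
q'(k) = (1-\epsilon)\,\mathbf{1}[k=y] + \frac{\epsilon}{K-1}\,\mathbf{1}[k\neq y]
$$

Under the first (original-paper) convention, the smoothing part of the cross entropy is $\frac{\epsilon}{K}\sum_k(-\log p_k)$, which is exactly the `loss * self.eps / c` term asked about above.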

@luxuantao
Author

Thank you very much!
