
Regularization Techniques #8

Open
twinkle485 opened this issue Oct 13, 2024 · 0 comments
Labels
hacktoberfest For participants of hacktoberfest

Comments

@twinkle485 (Contributor)
Participants can apply L2 weight decay or dropout to the LoRA layers. These techniques help mitigate overfitting and improve the model's generalization.
Ensure that you've read the guidelines in CONTRIBUTING.md as well as the CODE_OF_CONDUCT.md.
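The idea above can be sketched as follows. This is a minimal NumPy illustration of both techniques, not a reference implementation: the class name `LoRALayer` and parameters such as `rank`, `alpha`, `p_drop`, and `lam` are illustrative choices, not part of any specific library API. Dropout is applied on the input to the low-rank path, and the L2 penalty is computed over the trainable adapter matrices only (the base weight stays frozen).

```python
import numpy as np

rng = np.random.default_rng(0)

class LoRALayer:
    """Minimal LoRA adapter: y = x @ W + scale * dropout(x) @ A @ B.

    W is the frozen pretrained weight; A and B are the trainable
    low-rank matrices that regularization should target.
    """
    def __init__(self, d_in, d_out, rank=4, alpha=8, p_drop=0.1):
        self.W = rng.normal(size=(d_in, d_out)) * 0.02  # frozen base weight
        self.A = rng.normal(size=(d_in, rank)) * 0.02   # trainable
        self.B = np.zeros((rank, d_out))                # trainable, zero-init
        self.scale = alpha / rank
        self.p_drop = p_drop

    def forward(self, x, training=True):
        h = x
        if training and self.p_drop > 0:
            # inverted dropout on the LoRA path only; the frozen path is untouched
            mask = rng.random(x.shape) >= self.p_drop
            h = x * mask / (1.0 - self.p_drop)
        return x @ self.W + self.scale * (h @ self.A @ self.B)

    def l2_penalty(self, lam=1e-4):
        # L2 weight decay restricted to the adapter matrices A and B
        return lam * (np.sum(self.A ** 2) + np.sum(self.B ** 2))
```

In a real training loop the same effect is usually obtained by passing the adapter parameters (and only those) to an optimizer with a `weight_decay` setting, and by enabling the dropout option that LoRA implementations commonly expose on the adapter path.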

@sohambuilds sohambuilds added the hacktoberfest For participants of hacktoberfest label Oct 13, 2024