Error arises during pretraining #1
saugatabose28:
Hi,
I have been trying to implement your text classification code, but I have encountered an issue during the pretraining phase. The following error arose during execution:
"
"
Have you ever faced such an issue during execution?
aramakus:
Hi saugatabose28,
Thanks for reporting the issue. At the time this notebook was created, the model returned a tuple by default. That behavior has probably changed in a later release, so the tuple output now has to be configured explicitly. I will update the notebook to implement the change; this should fix the issue you've got.
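For readers hitting the same error, here is a minimal sketch of the change being described, assuming the notebook uses Hugging Face transformers; the model name and sample input are placeholders, not code from the notebook:

```python
# Minimal sketch: newer transformers releases return a ModelOutput object
# by default, which breaks older code that indexed or unpacked a plain tuple.
# "roberta-base" and the sample sentence are illustrative assumptions.
import torch
from transformers import RobertaModel, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaModel.from_pretrained("roberta-base")
inputs = tokenizer("A placeholder sentence.", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
    hidden = outputs.last_hidden_state  # attribute access (new default)

    # Passing return_dict=False restores the old tuple behavior explicitly:
    hidden_tuple = model(**inputs, return_dict=False)[0]

assert torch.equal(hidden, hidden_tuple)
```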
saugatabose28:
Hi aramakus,
Thanks. One final thing I would like to draw your attention to: is RoBERTa a voracious RAM eater, and is it a slow learner? I am trying to run our model on one of the datasets, and RoBERTa has been training on the train data for over 48 hours (6 of 12 epochs completed so far). Have you faced a similar issue?
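As general context for the question above (not an answer recorded in this thread): roberta-base has roughly 125M parameters, so multi-day fine-tuning runs on modest hardware are not unusual. Below is a hedged sketch of transformers settings that commonly reduce memory use and wall-clock time; the output path and all numeric values are illustrative assumptions:

```python
# Hedged sketch of common memory/speed knobs in Hugging Face transformers;
# the output directory and every numeric value here are illustrative.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="roberta-out",          # placeholder path
    per_device_train_batch_size=8,     # smaller batches cut peak memory
    gradient_accumulation_steps=4,     # effective batch of 32 without the memory cost
    fp16=True,                         # mixed precision speeds up most modern GPUs
    num_train_epochs=3,                # fine-tuning often converges well before 12 epochs
)
# Truncating inputs also helps, since self-attention cost grows quadratically
# with sequence length, e.g.:
#   tokenizer(text, truncation=True, max_length=128, return_tensors="pt")
```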