I notice that the longer I train (the more batches), the better the generated sentences become. However, some generated sentences are exactly identical to sentences in my training corpus. Is that possible? I wonder whether the model is genuinely generating those sentences or just 'copying' them from the corpus.
That sounds like overfitting. You could try adding a regularizer to the network's weights, decreasing the number of parameters, or increasing the size of the training set.
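To make the first suggestion concrete, here is a minimal sketch of L2 weight regularization in plain Python. The function names (`l2_penalty`, `regularized_loss`) and the regularization strength `lam` are hypothetical; the point is just that a penalty proportional to the squared weights is added to the task loss, which discourages the large weights that let the network memorize training sentences verbatim.

```python
def l2_penalty(weights, lam=1e-4):
    """Hypothetical helper: lam times the sum of squared weights."""
    return lam * sum(w * w for w in weights)

def regularized_loss(data_loss, weights, lam=1e-4):
    """Total training loss = task loss + L2 penalty on the weights.

    Minimizing this pushes weights toward smaller magnitudes,
    trading a little training accuracy for better generalization.
    """
    return data_loss + l2_penalty(weights, lam)

# Example: a data loss of 2.0 with three weights and lam = 0.01
# gives 2.0 + 0.01 * (9 + 16 + 0) = 2.25
loss = regularized_loss(2.0, [3.0, -4.0, 0.0], lam=0.01)
```

In a real framework this is usually a one-liner (e.g. a weight-decay option on the optimizer or a kernel regularizer on each layer) rather than hand-written code; the typical values of `lam` to try span roughly 1e-5 to 1e-2, tuned on a validation set.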
@spiglerg I am facing an overfitting issue. You said to "add a regularizer to the network's weights or decrease the number of parameters (or increase the size of the training set)". Are you referring to the settings below, and should I increase or decrease them? If so, please give exact values.