Hello, while reading through the code I noticed that the `neg_log_likelihood_loss` function in `crf.py` contains:

```python
if self.average_batch:
    return (forward_score - gold_score) / batch_size
return forward_score - gold_score
```

while the `loss` function in `sequence_labeling_model.py`, which calls it, also contains:

```python
if not self.use_crf:
    batch_size, max_len = feats.size(0), feats.size(1)
    lstm_feats = feats.view(batch_size * max_len, -1)
    tags = tags.view(-1)
    return self.loss_function(lstm_feats, tags)
else:
    loss_value = self.loss_function(feats, mask, tags)
    print('loss_value:', loss_value)
    if self.average_batch:
        batch_size = feats.size(0)
        loss_value /= float(batch_size)
    return loss_value
```

Doesn't this average over the batch size one extra time?
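For concreteness, here is a minimal sketch of the double division described above. The numbers are made up and only the arithmetic matters; it assumes `average_batch` is `True` in both `crf.py` and `sequence_labeling_model.py`, as in the quoted snippets.

```python
# Hypothetical values: forward_score and gold_score are sums over the batch.
batch_size = 4
forward_score = 40.0
gold_score = 8.0

# What crf.py's neg_log_likelihood_loss returns: already averaged over the batch.
crf_loss = (forward_score - gold_score) / batch_size        # 8.0

# What loss() in sequence_labeling_model.py then does: divides by batch_size again.
returned_loss = crf_loss / float(batch_size)                # 2.0

# The intended per-example loss is 8.0, so the returned value (and the gradients
# derived from it) ends up smaller by another factor of batch_size. Dividing in
# only one of the two places, e.g. keeping the division in crf.py and dropping
# it from loss(), would avoid the double averaging.
print(crf_loss, returned_loss)
```

If that reading is correct, the extra division only rescales the loss by a constant per batch, but it does shrink the reported loss and the effective gradient magnitude compared to what was intended.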
Same question, I noticed this as well.
It looks that way to me too.