I would like to get Precision and Recall metrics during the training of the Varmisuse task. I have tried modifying varmisuse_task.py with the following code:
Inside `make_task_output_model`, at approximately line 438:

```python
predicted = tf.argmax(tf.nn.softmax(logits), 1, output_type=tf.int32)
prediction_is_correct = tf.equal(predicted, correct_choices)
accuracy = tf.reduce_mean(tf.cast(prediction_is_correct, tf.float32))
```
Inside `pretty_print_epoch_task_metrics`:

```python
acc = sum([m['num_correct_predictions'] for m in task_metric_results]) / float(num_graphs)
precision = sum([m['precision'] for m in task_metric_results]) / float(num_graphs)
recall = sum([m['recall'] for m in task_metric_results]) / float(num_graphs)
return "Accuracy: %.3f | Precision: %.3f | Recall: %.3f" % (acc, precision, recall)
```
However, this code outputs `nan` for both Precision and Recall. If anyone knows why this happens and could point me in the right direction, I would greatly appreciate it.
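For context, here is a minimal, self-contained sketch (not code from `varmisuse_task.py`) of how per-batch precision and recall could be derived from the same `predicted` / `correct_choices` tensors, treating one candidate index as the "positive" class. The `batch_precision_recall` helper, the `positive_class` argument, and the `eps` guard are my own hypothetical additions; the point of the comment inside is that a batch with no predicted (or no actual) positives makes the unguarded ratio 0/0, which is one way `nan` can appear and then propagate through the epoch-level averaging shown above.

```python
import tensorflow as tf

# Hypothetical helper, not part of the repository: per-batch precision/recall
# for a single "positive" candidate index. `predicted` and `correct_choices`
# are int32 tensors of shape [num_graphs], as in the snippet above.
def batch_precision_recall(predicted, correct_choices, positive_class=0, eps=1e-8):
    pred_pos = tf.equal(predicted, positive_class)
    actual_pos_mask = tf.equal(correct_choices, positive_class)
    true_pos_mask = tf.logical_and(pred_pos, actual_pos_mask)

    tp = tf.reduce_sum(tf.cast(true_pos_mask, tf.float32))
    predicted_pos = tf.reduce_sum(tf.cast(pred_pos, tf.float32))
    actual_pos = tf.reduce_sum(tf.cast(actual_pos_mask, tf.float32))

    # Without the eps guard, a batch with zero predicted (or zero actual)
    # positives gives 0 / 0 = nan, and any sum over batch results that
    # contains a nan is itself nan.
    precision = tp / (predicted_pos + eps)
    recall = tp / (actual_pos + eps)
    return precision, recall
```

For the aggregation in `pretty_print_epoch_task_metrics` to report anything, per-batch values like these would also have to be written into each batch's metrics dictionary under the `'precision'` and `'recall'` keys that the quoted code reads from `task_metric_results`.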