Currently using no pretrained word embeddings; add pretrained embeddings #1
Comments
The micro-recall is very low.
The recall is off from the paper's result. The reason might be that the paper's recall calculation considers the "negative instances" (the O tag); this needs a closer look.
After including O in precision and recall, the micro-averaged recall is quite high on the dev dataset: processed 193229 tokens with 192575 phrases; found: 192849 phrases; correct: 175886.
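The counts above follow the conlleval-style output format (gold phrases, found phrases, correct phrases). A minimal sketch of turning those three counts into micro-averaged precision, recall, and F1 (the function name `micro_prf` is illustrative, not from the repo):

```python
def micro_prf(n_gold: int, n_found: int, n_correct: int):
    """Micro-averaged precision/recall/F1 from conlleval-style phrase counts."""
    precision = n_correct / n_found if n_found else 0.0
    recall = n_correct / n_gold if n_gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Dev-set counts reported in the comment above.
p, r, f = micro_prf(n_gold=192575, n_found=192849, n_correct=175886)
print(f"precision={p:.4f} recall={r:.4f} f1={f:.4f}")
# precision ~0.9120, recall ~0.9133, f1 ~0.9127
```

Note that when O is scored as a phrase type, these micro scores are dominated by the many easy O tokens, which is why they look much higher than entity-only recall.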
Without the pretrained embeddings, I submitted the BiLSTM-CRF model to the leaderboard and got this result on the test set: