Arguments: (AttributeError("'TransformerWordEmbeddings' object has no attribute 'tokenize'"),) #4
Comments
I am not with my computer, but from memory it should be 0.11.x
Do not hesitate to ask if you need additional help. I did not have much time to polish the code 🙏
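To check which version is installed, flair exposes its version string (a minimal sketch):

import flair

# The repo reportedly targets flair 0.11.x; if this prints something
# else, reinstall with: pip install "flair>=0.11,<0.12"
print(flair.__version__)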
Thank you for your reply. I used flair==0.11.0 and the same error still occurred.
Thank you.
Hi Urchade, thanks a lot for this great work. 😃 Which GPUs did you use to train the model? Sincerely, thanks
I have used an A100 80G or a V100 32G, depending on what was available in our cluster.
@xyf-495800804 Did you successfully run this code for reproduction? Could I add you on WeChat for further discussion?
@edzq Did you successfully run this code for reproduction? Could I add you on WeChat for further discussion?
Yes. My ID is Maxwell0115. But I am not that interested in this work. HGERE is the current state of the art for this task, but this generative-style model may still be interesting.
Hi urchade,
Thanks a lot for this great work. 😃
I'm trying to reproduce your results on CONLL04, but the code reported an error:
Arguments: (AttributeError("'TransformerWordEmbeddings' object has no attribute 'tokenize'"),)
Debugging shows that the flair library is imported:
from flair.embeddings import TransformerWordEmbeddings
The embedder is then constructed with:
self.bert_layer = TransformerWordEmbeddings(model_name, fine_tune=fine_tune, subtoken_pooling=subtoken_pooling)
where model_name="bert-base-cased", fine_tune=True, and subtoken_pooling="first".
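Put together, the construction looks roughly like this (a self-contained sketch using the values above):

from flair.embeddings import TransformerWordEmbeddings

# Reported configuration: fine-tune the BERT weights and represent each
# word by the embedding of its first subtoken.
bert_layer = TransformerWordEmbeddings(
    "bert-base-cased",
    fine_tune=True,
    subtoken_pooling="first",
)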
Execution then reaches this code:
embedder = self.bert_layer
tokenized_sentences, all_token_subtoken_lengths, subtoken_lengths = embedder._gather_tokenized_strings(sentences)
The error occurs because the _gather_tokenized_strings method cannot be found on the embedder. I would like to ask what the cause is. Is it because my flair version is wrong?
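For reference, _gather_tokenized_strings is a private flair method, so it can disappear between releases; going through the public embed() API avoids it entirely (a minimal sketch, assuming only flair's documented interface, not the repo's code path):

from flair.data import Sentence
from flair.embeddings import TransformerWordEmbeddings

embedder = TransformerWordEmbeddings(
    "bert-base-cased", fine_tune=True, subtoken_pooling="first"
)

# embed() is the public entry point and handles tokenization internally.
sentences = [Sentence("John works for IBM in New York.")]
embedder.embed(sentences)

for token in sentences[0]:
    print(token.text, token.embedding.shape)  # one vector per word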
My flair version is: flair==0.13.1
Looking forward to your reply, thank you
Sincerely