
Arguments: (AttributeError("'TransformerWordEmbeddings' object has no attribute 'tokenize'"),) #4

Open
Student-xyf opened this issue Apr 24, 2024 · 10 comments

Comments

@Student-xyf

Hi urchade,

Thanks a lot for this great work. 😃
I'm trying to reproduce your results on CONLL04, but the code raises an error:
Arguments: (AttributeError("'TransformerWordEmbeddings' object has no attribute 'tokenize'"),)
Debugging shows that the error comes from the flair library, which is imported as:
from flair.embeddings import TransformerWordEmbeddings
The embedder is then constructed with:
self.bert_layer = TransformerWordEmbeddings(model_name, fine_tune=fine_tune, subtoken_pooling=subtoken_pooling)
where model_name="bert-base-cased", fine_tune=True, subtoken_pooling="first".

Execution then reaches:
embedder = self.bert_layer
tokenized_sentences, all_token_subtoken_lengths, subtoken_lengths = embedder._gather_tokenized_strings(sentences)

The error occurs because the embedder has no _gather_tokenized_strings method. Could you tell me what the cause is? Is my flair version wrong?

My flair version is: flair==0.13.1
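For reference, here is a minimal stdlib-only sketch of the version check I am assuming is needed (the method name comes from the traceback above; the exact compatible range is not confirmed in this thread, so the 0.11.x window is an assumption based on the maintainer's reply, and flair itself is not imported here):

```python
# Sketch: the repo calls the private flair method
# TransformerWordEmbeddings._gather_tokenized_strings, which appears to
# exist in flair 0.11.x but not in later releases such as 0.13.1.
# This guard only compares version strings; it does not import flair.

def parse_version(version: str) -> tuple:
    """Turn a version string like '0.11.3' into (0, 11, 3) for comparison."""
    return tuple(int(part) for part in version.split(".")[:3])

def flair_has_gather_method(version: str) -> bool:
    """Assumed compatibility window: 0.11.x releases only."""
    major, minor = parse_version(version)[:2]
    return (major, minor) == (0, 11)

print(flair_has_gather_method("0.11.3"))  # True under this assumption
print(flair_has_gather_method("0.13.1"))  # False: the method was removed
```

A stricter alternative would be to import flair and check hasattr(embedder, "_gather_tokenized_strings") at startup, failing fast with a clear message instead of a mid-run AttributeError.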

Looking forward to your reply. Thank you!

Sincerely

@urchade (Owner) commented Apr 24, 2024

I am not at my computer, but from memory it should be 0.11.x.

@urchade (Owner) commented Apr 24, 2024

Do not hesitate to ask if you need additional help.

I did not have much time to polish the code 🙏

@Student-xyf (Author)

Thank you for your reply. I tried flair==0.11.0 and the same error still occurs.

@urchade (Owner) commented Apr 24, 2024

(screenshot attached)

@Student-xyf (Author)

Thank you.

@Student-xyf (Author)

Hi Urchade,

Thanks a lot for this great work. 😃
I would also like to ask which GPU (and how much memory) your model was trained on.

Sincerely,

Thanks

@urchade (Owner) commented Apr 26, 2024

I used an A100 80GB or a V100 32GB, depending on what was available on our cluster.

@edzq commented May 18, 2024

@xyf-495800804 Did you successfully run this code for reproduction? Can I add your WeChat to have a further discussion?

@YaxinCountry

@edzq Did you successfully run this code for reproduction? Could I add your WeChat to have a further discussion?

@edzq commented Jun 3, 2024

@edzq Did you successfully run this code for reproduction? Could I add your WeChat to have a further discussion?

Yes. My id is Maxwell0115

But I am not too interested in this work. HGERE is the current state of the art on this task, though this generative-style model may still be interesting.
