experimental result #3
Comments
Hello, |
Hi, |
Maybe the author is only good at theoretical analysis, and the hybrid (mixing) method may not be truly effective in practice. We need to come up with new ways to create genuinely useful hard negatives.
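The "hybrid method" discussed here is, as I understand it, MixCSE-style mixing: a synthetic hard negative is built by linearly interpolating a positive and a negative embedding and renormalizing. A minimal sketch, assuming unit-normalized sentence embeddings; the function name `mix_hard_negative` and the use of NumPy vectors are my own illustration, not the repository's actual code:

```python
import numpy as np

def mix_hard_negative(h_pos, h_neg, lam=0.2):
    """Sketch of mixing-based hard-negative construction: interpolate a
    positive and a negative embedding with weight `lam` (mirroring the
    paper's lambda), then renormalize so the result stays on the unit
    sphere like the real embeddings."""
    mixed = lam * h_pos + (1.0 - lam) * h_neg
    return mixed / np.linalg.norm(mixed)

# Toy usage with orthogonal unit vectors: with a small lam, the mixed
# negative stays close to the true negative but gains a positive component,
# which is what makes it "hard".
h_pos = np.array([1.0, 0.0])
h_neg = np.array([0.0, 1.0])
h_mix = mix_hard_negative(h_pos, h_neg, lam=0.2)
```

With `lam=0.2` the mixed vector leans toward the negative (cosine similarity to `h_neg` stays high) while still carrying some positive signal, so the contrastive loss gets a more informative negative than a randomly sampled one.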
…---Original---
Date: Fri, Jun 10, 2022 16:03
Subject: Re: [BDBC-KG-NLP/MixCSE_AAAI2022] experimental result (Issue #3)
The result I got was only 65. I don't know what was wrong.
Hi,
Do you find the reason for this result?
|
Yes, the theoretical analysis is very valuable, but I'm curious how the results in the paper were obtained, because I basically followed the ReadMe to reproduce them, yet there is a large gap from the paper's results. |
Sorry, I have already seen it. Could you please show your hyperparameters for training? |
Hi, thank you for your reply. Here are my hyperparameter settings: python train.py |
Hello, thanks for your reminder. I just set lambda = 0.2 as in the paper, and got an average STS = 77.20 using "cls" pooling, and a higher result of STS = 77.90 using "cls_before_pooler". But I think I should follow your ReadMe file and adopt "cls" pooling, right? |
Yeah, I find the "cls" pooling is more robust, and the script is updated now. Thank you for your reminder. |
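For readers unfamiliar with the two pooling modes compared above: in SimCSE-style code, "cls_before_pooler" takes the raw [CLS] hidden state, while "cls" additionally passes it through the MLP pooler (dense layer plus tanh). A minimal sketch under those assumptions; the function name `pool` and the `(W, b)` pair standing in for the trained pooler weights are hypothetical:

```python
import numpy as np

def pool(last_hidden, pooler_dense=None, mode="cls"):
    """Sketch of the two pooling modes discussed in the thread.

    last_hidden: (seq_len, hidden_dim) array of token hidden states,
    with the [CLS] token at position 0.
    pooler_dense: hypothetical (W, b) weights for the MLP pooler.
    """
    cls_vec = last_hidden[0]  # raw [CLS] token hidden state
    if mode == "cls_before_pooler":
        return cls_vec
    if mode == "cls":
        W, b = pooler_dense
        return np.tanh(W @ cls_vec + b)  # dense + tanh, SimCSE-style pooler
    raise ValueError(f"unknown pooling mode: {mode}")

# Toy usage: identity pooler weights make the difference between the two
# modes exactly the tanh nonlinearity.
hidden = np.array([[0.5, -0.5], [1.0, 1.0]])
raw = pool(hidden, mode="cls_before_pooler")
pooled = pool(hidden, pooler_dense=(np.eye(2), np.zeros(2)), mode="cls")
```

Which mode performs better is an empirical question (here "cls" was reported as more robust); the two differ only in whether the extra pooler layer is trained and applied on top of [CLS].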
The result I got was only 65. I don't know what was wrong.