Hello, thank you for sharing this repo and the useful Hugging Face model cards!
I am interested in the T5 generators for query generation and am trying to extend them to other datasets/tasks.
To do so, I would like to reproduce the T5 generators, specifically BeIR/query-gen-msmarco-t5-large-v1.
I am wondering if the training script and training configurations for the generators could be shared,
including
Thank you for the information!
I have tried a few different configurations, e.g., different learning rates (1e-5, 3e-5, 5e-5, 1e-4), both with and without warmup steps, but I have not been able to reproduce the model.
It worked relatively well on MS MARCO, but not on BEIR.
At this point, I am also wondering whether other datasets, such as NQ, were by any chance used to train the generator.
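For reference, here is a minimal sketch of the sweep and data pairing I tried. All hyperparameter values (the warmup length, the truncation limit) and the `build_example` helper are my own assumptions, not the actual BeIR training setup, which is exactly what I am asking about.

```python
# Hypothetical sketch of the doc2query-style fine-tuning setup I tried.
# Hyperparameter values below are my own guesses, not the BeIR settings.

# Learning rates I swept, each with and without warmup
# (1000 warmup steps is an assumed value).
SWEEP = [
    {"learning_rate": lr, "warmup_steps": ws}
    for lr in (1e-5, 3e-5, 5e-5, 1e-4)
    for ws in (0, 1000)
]

def build_example(passage: str, query: str, max_passage_chars: int = 2000):
    """Pair a passage with a gold query as a seq2seq training example.

    MS MARCO-style (passage, query) pairs: the passage is the encoder
    input and the query is the decoder target. Character-level truncation
    here stands in for tokenizer-level truncation (e.g. 512 tokens).
    """
    return {"input_text": passage[:max_passage_chars], "target_text": query}

example = build_example(
    "The Manhattan Project was a research and development undertaking...",
    "what was the manhattan project",
)
```

Each entry of `SWEEP` was then passed to a standard seq2seq fine-tuning run; none of the eight configurations recovered the released checkpoint's behavior.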
Best regards,
Jihyuk Kim