2023-IEEE Access-A Smaller and Better Word Embedding for Neural #238

Open
thangk opened this issue Jun 24, 2024 · 0 comments
Labels
literature-review Summary of the paper related to the work
thangk commented Jun 24, 2024

Link: IEEE Access

Main problem

Traditional word-embedding methods do not account for relations between words, which leaves room for inaccuracy in translation results.

Proposed method

The authors address this problem by proposing a method built on two key components: relation embedding and shared embedding. They claim these components are key to improving results, especially on low-resource tasks.
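The issue doesn't go into implementation details, but for intuition, here is a minimal PyTorch-style sketch of one way a shared token embedding could be combined with a relation embedding. The class name, the relation inputs, and the simple additive combination are my assumptions for illustration, not the paper's actual method.

```python
# Hypothetical sketch (not the authors' code): combining a shared token
# embedding with a relation embedding before feeding an NMT encoder/decoder.
import torch
import torch.nn as nn

class RelationAwareEmbedding(nn.Module):
    def __init__(self, vocab_size, num_relations, d_model):
        super().__init__()
        # Shared embedding: one table that could be reused by encoder,
        # decoder, and output projection.
        self.token_emb = nn.Embedding(vocab_size, d_model)
        # Relation embedding: encodes associations between words
        # (the exact form in the paper may differ).
        self.rel_emb = nn.Embedding(num_relations, d_model)

    def forward(self, token_ids, relation_ids):
        # Sum the two embeddings; the paper's actual combination may differ.
        return self.token_emb(token_ids) + self.rel_emb(relation_ids)

emb = RelationAwareEmbedding(vocab_size=32000, num_relations=64, d_model=512)
tokens = torch.randint(0, 32000, (2, 10))   # batch of 2 sentences, 10 tokens each
relations = torch.randint(0, 64, (2, 10))   # one relation id per token (assumed input)
print(emb(tokens, relations).shape)         # torch.Size([2, 10, 512])
```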

My Summary

The researchers propose their own word-embedding method for use with Neural Machine Translation (NMT) systems. A key difference from other methods is that theirs retains the "knowledge of the association between words" during training, which improves BLEU scores on several datasets, including WMT'14 English->German and the low-resource Global Voices v2018q4 Spanish->Czech task (about 15k sentence pairs). As a bonus, the proposed method also yields models with up to 15% fewer parameters than the baselines. The authors claim the method works across various NMT systems; however, it has yet to be tested on other NLP tasks such as dialogue generation and question answering, which is left for future work.
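For reference, BLEU on WMT-style test sets is typically reported with sacreBLEU; the snippet below is a generic scoring sketch with placeholder strings, not the paper's evaluation code.

```python
# Generic corpus-level BLEU scoring with sacreBLEU (placeholder data).
import sacrebleu

hypotheses = ["the cat sat on the mat"]    # system translations, one string per sentence
references = [["the cat is on the mat"]]   # list of reference streams (one reference set here)

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```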

Datasets

WMT14 English-German
Global Voices v2018q4 Spanish-Czech
WMT14 English-French
Russian-Spanish

@hosseinfani hosseinfani added the literature-review Summary of the paper related to the work label Jun 25, 2024
@thangk thangk changed the title A Smaller and Better Word Embedding for Neural (2023) 2023-IEEE Access-A Smaller and Better Word Embedding for Neural Jun 25, 2024