
PyTorch-MHA-DifferentialCounting

Multi-Head Attention for counting the difference between the number of occurrences of X and Y in an XY[0-5]+ pattern.


The notebook trains a neural network that solves the following task:

Given an input sequence matching XY[0-5]+, where X and Y are two given digits, count the number of occurrences of X and Y in the remaining substring and then compute the difference #X - #Y.

Example:
Input: 1213211 (X = 1, Y = 2, remaining substring 13211)
Output: 2 (#X - #Y = 3 - 1)
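As a concrete illustration, here is a minimal sketch of how such input/target pairs could be generated. The function name, sequence length, and the assumption that X and Y are distinct digits in 0-5 are illustrative choices, not necessarily what the notebook does:

```python
import random

def make_example(max_len=10):
    """Build one (sequence, target) pair for the XY[0-5]+ task.

    The first two digits are X and Y; the target is
    (#X in the rest) - (#Y in the rest).
    """
    x, y = random.sample(range(6), 2)          # two distinct digits X and Y
    rest = [random.randint(0, 5) for _ in range(random.randint(3, max_len))]
    seq = [x, y] + rest
    target = rest.count(x) - rest.count(y)     # #X - #Y in the remaining substring
    return seq, target

seq, target = make_example()
print(seq, target)   # e.g. [1, 2, 1, 3, 2, 1, 1] -> 2
```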

The problem is solved with a multi-head attention network.
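A minimal sketch of such a model, using an embedding layer, PyTorch's nn.MultiheadAttention, and a linear read-out over the mean-pooled sequence. The layer sizes and pooling strategy are assumptions for illustration, not the notebook's exact architecture:

```python
import torch
import torch.nn as nn

class CountingMHA(nn.Module):
    """Toy multi-head attention model for the #X - #Y counting task."""

    def __init__(self, vocab_size=6, d_model=32, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.mha = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)      # regress the signed count difference

    def forward(self, x):                      # x: (batch, seq_len) of digit tokens
        h = self.embed(x)                      # (batch, seq_len, d_model)
        attn_out, _ = self.mha(h, h, h)        # self-attention over the sequence
        return self.head(attn_out.mean(dim=1)).squeeze(-1)  # pooled prediction

model = CountingMHA()
batch = torch.tensor([[1, 2, 1, 3, 2, 1, 1]])  # the README example: 1213211
print(model(batch).shape)                      # torch.Size([1])
```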

Attention Is All You Need

"Attention is All You Need"(Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin, arxiv, 2017) is a transformer model, based solely on attention mechanisms, which has pushed the Natural Language Processing (NLP) world to new frontiers, especially sequence to sequence translation. It achieved the SOTA performance on WMT2014 English-to-German translation.
