Transformer

This is an implementation of the paper Attention Is All You Need. The code is written in Python; adjust the paths and configuration in the scripts as needed before running them. Thanks to Umar Jamil and his videos on the topic, which were extremely helpful and valuable.
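As a quick illustration of the core mechanism the paper introduces, below is a minimal sketch of scaled dot-product attention in PyTorch. The function name, tensor shapes, and masking convention here are assumptions made for this example and do not necessarily match the names used in this repository's scripts.

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k); mask broadcasts to the score shape.
    d_k = q.size(-1)
    scores = (q @ k.transpose(-2, -1)) / math.sqrt(d_k)  # (batch, heads, seq_len, seq_len)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = scores.softmax(dim=-1)  # attention probabilities
    return weights @ v, weights
```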

More comments will be added to make the code easier to follow.

I have used the opus_books dataset for this code.
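For reference, the dataset can be loaded with the Hugging Face datasets library roughly as follows; the "en-it" language pair and the split name are assumptions for illustration, so substitute whichever configuration the scripts expect.

```python
from datasets import load_dataset

# "en-it" is just an example config; opus_books offers many language pairs.
raw_dataset = load_dataset("opus_books", "en-it", split="train")

# Each example stores the sentence pair under a "translation" dict keyed by language code.
print(raw_dataset[0]["translation"])
```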

Citation:

Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I. (2017). Attention Is All You Need. arXiv:1706.03762.
