
NoTorch

A from-scratch implementation of a deep learning library (no PyTorch or TensorFlow), built only on NumPy.

This is a learning project heavily inspired by and based on Andrej Karpathy's micrograd: https://github.com/karpathy/micrograd

FEATURES:

  • A matrix-valued autograd engine with 24 (and counting) differentiable matrix operations

  • Neural Networks

  • Transformers

  • Fast performance, comparable to PyTorch and orders of magnitude faster than micrograd (see speed_test.py)
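To illustrate what a matrix-valued autograd engine means, here is a minimal NumPy-only sketch in the spirit of micrograd: each operation records how to propagate gradients through whole matrices rather than individual scalars. The class and method names below are illustrative, not NoTorch's actual API.

```python
import numpy as np

class Tensor:
    """Minimal matrix-valued autograd node (illustrative, not NoTorch's API)."""
    def __init__(self, data, _children=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self._backward = lambda: None
        self._prev = set(_children)

    def __matmul__(self, other):
        out = Tensor(self.data @ other.data, (self, other))
        def _backward():
            # Matrix calculus: d(loss)/dA = dL/dOut @ B^T, d(loss)/dB = A^T @ dL/dOut
            self.grad += out.grad @ other.data.T
            other.grad += self.data.T @ out.grad
        out._backward = _backward
        return out

    def sum(self):
        out = Tensor(self.data.sum(), (self,))
        def _backward():
            # The gradient of a sum broadcasts back over every element
            self.grad += np.ones_like(self.data) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Build a topological order of the graph, then run a reverse-mode sweep
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for c in v._prev:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = np.ones_like(self.data)
        for v in reversed(topo):
            v._backward()

x = Tensor(np.ones((2, 3)))
w = Tensor(np.ones((3, 2)))
loss = (x @ w).sum()
loss.backward()
print(x.grad.shape, w.grad.shape)  # (2, 3) (3, 2)
```

The speedup over scalar autograd comes from exactly this design: one graph node per matrix operation lets NumPy do the heavy lifting in vectorized C, instead of building a graph node per scalar multiply-add.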

IN PROGRESS:

  • GPU support via CuPy
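CuPy is designed as a largely drop-in, NumPy-compatible array library that runs on the GPU, so one common pattern for adding GPU support to NumPy-based code is an interchangeable array-module alias (often called `xp`). This is a general sketch of that pattern, not NoTorch's implementation:

```python
# Backend-agnostic array code via the "xp" alias convention.
# On a machine with a CUDA GPU and CuPy installed, arrays live on the GPU;
# otherwise the same code falls back to NumPy on the CPU.
try:
    import cupy as xp  # GPU backend, if available
except ImportError:
    import numpy as xp  # CPU fallback

a = xp.ones((2, 2))
b = xp.full((2, 2), 3.0)
result = float((a @ b).sum())  # each of the 4 entries is 6.0, so the sum is 24.0
print(result)
```

Because CuPy mirrors the NumPy API, most of the autograd engine's operations need no changes beyond routing through the shared alias.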

How to Use

Install Poetry: https://python-poetry.org/docs/#installation

To run speed_test.py (macOS/Linux):

$ git clone https://github.com/Lucasc-99/NoTorch
$ cd NoTorch
$ poetry install
$ poetry run python speed_test.py
