Happy Transformer

Documentation and news: happytransformer.com

New course: create a text generation web app, and learn how to fine-tune GPT-Neo.

Join our Discord server: Support Server

Happy Transformer is a package built on top of Hugging Face's transformers library that makes it easy to utilize state-of-the-art NLP models.

Features

Public methods are available for the following tasks, with basic usage and, for most tasks, training documented at happytransformer.com:
Text Generation
Text Classification
Word Prediction
Question Answering
Text-to-Text
Next Sentence Prediction
Token Classification

Quick Start

pip install happytransformer
from happytransformer import HappyWordPrediction
#--------------------------------------#
happy_wp = HappyWordPrediction()  # defaults to "distilbert-base-uncased"
result = happy_wp.predict_mask("I think therefore I [MASK]")
print(result)  # [WordPredictionResult(token='am', score=0.10172799974679947)]
print(result[0].token)  # am
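
The other pipelines listed under Features follow the same pattern: construct the task's Happy class, then call its task method. The sketch below shows text generation and text classification; the model names and example inputs are illustrative assumptions, not requirements.

from happytransformer import HappyGeneration, HappyTextClassification

# Text generation (the GPT-Neo checkpoint here is an illustrative choice)
happy_gen = HappyGeneration("GPT-NEO", "EleutherAI/gpt-neo-125M")
gen_result = happy_gen.generate_text("Artificial intelligence is")
print(gen_result.text)

# Text classification with a sentiment checkpoint (also an illustrative choice)
happy_tc = HappyTextClassification(
    "DISTILBERT", "distilbert-base-uncased-finetuned-sst-2-english", num_labels=2
)
tc_result = happy_tc.classify_text("Great movie! 5/5")
print(tc_result.label, tc_result.score)  # e.g. POSITIVE with a score near 1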

Maintainers

Tutorials

Text generation with training (GPT-Neo)

Text classification (training)

Text classification (hate speech detection)

Text classification (sentiment analysis)

Word prediction with training (DistilBERT, RoBERTa)

Top T5 Models

Grammar Correction

Fine-tune a Grammar Correction Model
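
As a rough sketch of what the grammar-correction tutorials above cover, a HappyTextToText pipeline can be pointed at a T5 grammar-correction checkpoint. The model name and generation settings below are assumptions for illustration, not the only option.

from happytransformer import HappyTextToText, TTSettings

# T5-based grammar correction; the checkpoint is an assumed example
happy_tt = HappyTextToText("T5", "vennify/t5-base-grammar-correction")
beam_args = TTSettings(num_beams=5, min_length=1)

# This checkpoint expects a "grammar: " prefix on its inputs
result = happy_tt.generate_text("grammar: This sentences has has bads grammar.", args=beam_args)
print(result.text)

Fine-tuning, as covered in the training tutorials above, goes through each class's train() method; see happytransformer.com for the expected input file formats.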
