GitHub repo with tutorials on fine-tuning transformers for different NLP tasks
Turkish BERT/DistilBERT, ELECTRA and ConvBERT models
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
An all-in-one AI audio playground using Cloudflare AI Workers to transcribe, analyze, summarize, and translate any audio file.
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
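As a concrete reference point for this kind of fine-tuning, here is a minimal sketch (not taken from the repo above) that fine-tunes DistilBERT on SST-2 via the GLUE loader in Hugging Face datasets; the checkpoint name and hyperparameters are illustrative choices.

```python
# Minimal sketch, assuming the Hugging Face transformers and datasets libraries:
# fine-tune DistilBERT for sentiment classification on SST-2 (GLUE).
# Checkpoint and hyperparameters are illustrative, not taken from the repo above.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("glue", "sst2")  # Stanford Sentiment Treebank, binary labels
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Pad/truncate each sentence to a fixed length for simple batching.
    return tokenizer(batch["sentence"], truncation=True,
                     padding="max_length", max_length=128)

encoded = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

args = TrainingArguments(
    output_dir="distilbert-sst2",
    per_device_train_batch_size=32,
    num_train_epochs=3,
    learning_rate=2e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
).train()
```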
Transformers 3rd Edition
Build and train state-of-the-art natural language processing models using BERT
Pytorch-Named-Entity-Recognition-with-transformers
Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.
TensorFlow and Keras implementation of state-of-the-art research in Dialog System NLU
A collection of resources on using BERT (https://arxiv.org/abs/1810.04805) and related language models in production environments.
Multi-Class Text Classification for products based on their description with Machine Learning algorithms and Neural Networks (MLP, CNN, DistilBERT).
Distillation of a BERT model with the Catalyst framework
FoodBERT: Food Extraction with DistilBERT
DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-base transformers from Hugging Face.
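To make the embedding-plus-cosine-similarity idea concrete, here is a minimal sketch (not the library's own code; the checkpoint and sentences are illustrative) of mean-pooled DistilBERT embeddings compared pairwise with cosine similarity, the basic building block of embedding-based topic clustering.

```python
# Minimal sketch, assuming Hugging Face transformers and PyTorch: mean-pooled
# DistilBERT sentence embeddings and their pairwise cosine similarities.
# Checkpoint and example sentences are illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

sentences = ["The market rallied today.", "Stocks rose sharply.", "I adopted a puppy."]
enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**enc).last_hidden_state          # (batch, seq_len, dim)

# Mean-pool over real tokens only, using the attention mask.
mask = enc["attention_mask"].unsqueeze(-1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# Pairwise cosine similarity between the sentence embeddings.
normed = torch.nn.functional.normalize(embeddings, dim=-1)
print(normed @ normed.T)
```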
Compares the DistilBERT and MobileBERT architectures for mobile deployments.
NLP model that predicts subreddit based on the title of a post
Distilling Task-Specific Knowledge from Teacher Model into BiLSTM
Fine-tuned BERT on the SQuAD 2.0 dataset. Applied Knowledge Distillation (KD) and fine-tuned DistilBERT (student) using BERT as the teacher model. Reduced the size of the original BERT by 40%.
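For readers unfamiliar with the distillation objective mentioned in the last two entries, here is a minimal sketch of the usual loss: a temperature-softened KL term against the teacher's logits plus cross-entropy on the gold labels. The temperature and weighting below are illustrative defaults, not values taken from these repos.

```python
# Minimal sketch of a knowledge-distillation loss for training a smaller
# student (e.g. DistilBERT or a BiLSTM) against a fine-tuned BERT teacher.
# temperature and alpha are illustrative defaults, not values from the repos above.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradients keep a comparable magnitude.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: standard cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```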