Overview
In this project, I used PyTorch together with the Hugging Face Transformers library to fine-tune a pretrained BERT model on a text classification task. The training and evaluation loops were written in plain PyTorch, allowing efficient GPU utilization during training. The primary goal was strong performance across precision, recall, accuracy, and F1 score.
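The overview describes plain-PyTorch training and evaluation loops. A minimal sketch of that loop structure is below; it is an illustration, not the project's actual code. The real project fine-tunes BERT, but to keep the sketch self-contained a small linear classifier stands in for the model, and the label count, batch size, and learning rate are assumed values.

```python
import torch
import torch.nn as nn

NUM_LABELS = 2  # assumed binary classification task
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in for the fine-tuned BERT classifier: a linear head over
# fixed-size 768-dim features (BERT's hidden size). The real project
# would encode token IDs with the pretrained BERT encoder instead.
model = nn.Linear(768, NUM_LABELS).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss_fn = nn.CrossEntropyLoss()

# Dummy batch: 8 examples with integer class labels.
features = torch.randn(8, 768, device=device)
labels = torch.randint(0, NUM_LABELS, (8,), device=device)

# --- training step: forward, loss, backward, optimizer update ---
model.train()
optimizer.zero_grad()
logits = model(features)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()

# --- evaluation step: no gradients, argmax over logits ---
model.eval()
with torch.no_grad():
    preds = model(features).argmax(dim=-1)
accuracy = (preds == labels).float().mean().item()
```

In practice the dummy batch would be replaced by batches from a `DataLoader`, and the loop wrapped over epochs; the step structure stays the same.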
Dependencies
- PyTorch
- Hugging Face Transformers library
- Pretrained BERT model (downloaded via Transformers)
- GPU (recommended, for accelerated fine-tuning)
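The overview names precision, recall, accuracy, and F1 as the evaluation metrics. A dependency-free sketch of how they can be computed for a binary task from predicted and true labels is below; the helper name and the sample labels are illustrative, not from the project.

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Compute precision, recall, accuracy, and F1 for a binary task."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)

    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    accuracy = correct / len(y_true)
    return {"precision": precision, "recall": recall,
            "accuracy": accuracy, "f1": f1}

# Example with made-up labels: 3 of 5 predictions correct.
m = classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

In a real run these would be computed over the full evaluation set's predictions; libraries such as scikit-learn provide equivalent (and multi-class) implementations.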