LLaMA Arbitrary Low-Bit LoRA Fine-Tuning with Quantization

Training Setup

Data

The Stanford Alpaca instruction-following dataset.

TensorBoard

pyllama$ tensorboard --logdir=logs

Command

python finetune.py

The generated checkpoints are saved in the lora-alpaca folder.
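
Under the hood, a LoRA fine-tuning script of this kind attaches small trainable low-rank adapters to the frozen (quantized) base model. Below is a minimal sketch using Hugging Face peft; the base checkpoint name, target modules, and hyperparameters are illustrative assumptions, not pyllama's actual defaults.

```python
# Minimal LoRA-setup sketch (illustrative, not the actual finetune.py code).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Assumed base checkpoint; low-bit quantization of the frozen base
# weights is handled separately in this repo.
base_model = AutoModelForCausalLM.from_pretrained("decapoda-research/llama-7b-hf")

lora_config = LoraConfig(
    r=8,                                  # rank of the LoRA update matrices
    lora_alpha=16,                        # scaling applied to the LoRA update
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the small LoRA matrices are trainable
```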

2-Bit

One epoch of 2-bit training took 20 hours.

The resulting model is saved to lora-alpaca/checkpoint-1620/.

Load the model and verify the weights:
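
A minimal sketch of one way to do this, assuming the checkpoint directory is a standard PEFT adapter layout; the base checkpoint name is an illustrative assumption:

```python
# Load the fine-tuned adapter and inspect its weights.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("decapoda-research/llama-7b-hf")
model = PeftModel.from_pretrained(base, "lora-alpaca/checkpoint-1620")

# Print the shape and mean magnitude of each LoRA tensor to confirm
# the adapter weights loaded and are non-zero.
for name, param in model.named_parameters():
    if "lora" in name:
        print(name, tuple(param.shape), param.abs().mean().item())
```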
