
v0.5.0 Knowledge distillation trainer & ONNX exporter

Released by @lewtun on 14 Dec 15:03

This release comes with two main features:

  • A DistillationSetFitTrainer class that lets you use unlabeled data to significantly boost the performance of small models like MiniLM. See this workshop for an end-to-end example; a minimal sketch of the API follows this list.
  • An ONNX exporter that converts SetFit model instances into ONNX graphs for downstream inference and optimisation. Check out the notebooks folder for an end-to-end example; a sketch of the export path also follows below.
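To give a quick feel for the distillation workflow, here is a minimal sketch. The dataset (`SetFit/sst2`), the checkpoints, the split sizes, and the exact keyword arguments are illustrative assumptions; the linked workshop remains the authoritative end-to-end example.

```python
# Sketch: fine-tune a teacher on a few labeled examples, then distill it into a
# smaller student (e.g. MiniLM) using only unlabeled data.
from datasets import load_dataset
from setfit import SetFitModel, SetFitTrainer, DistillationSetFitTrainer

# Illustrative data: a handful of labeled examples plus a larger unlabeled pool.
dataset = load_dataset("SetFit/sst2")
train_labeled = dataset["train"].select(range(16))
train_unlabeled = dataset["train"].select(range(16, 1016))  # labels are ignored here

# 1. Fine-tune the teacher on the small labeled set.
teacher_model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
teacher_trainer = SetFitTrainer(model=teacher_model, train_dataset=train_labeled)
teacher_trainer.train()

# 2. Distill the teacher into a smaller student using the unlabeled examples.
student_model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-MiniLM-L3-v2")
distill_trainer = DistillationSetFitTrainer(
    teacher_model=teacher_model,
    student_model=student_model,
    train_dataset=train_unlabeled,  # unlabeled data; the teacher provides the supervision
)
distill_trainer.train()
```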
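And a rough sketch of the ONNX export path. The `setfit.exporters.onnx.export_onnx` entry point, its arguments, and the checkpoint name are assumptions based on the notebooks; refer to the notebooks folder for the exact usage.

```python
# Sketch: convert a trained SetFit model (sentence-transformer body + classification
# head) into a single ONNX graph for downstream inference and optimisation.
from setfit import SetFitModel
from setfit.exporters.onnx import export_onnx  # assumed module path; see the notebooks

model = SetFitModel.from_pretrained("lewtun/my-awesome-setfit-model")  # illustrative checkpoint

export_onnx(
    model.model_body,               # the sentence-transformer encoder
    model.model_head,               # the classification head
    opset=12,                       # illustrative opset version
    output_path="setfit_model.onnx",
)
# The resulting setfit_model.onnx can then be loaded with onnxruntime or further
# optimised with your preferred ONNX tooling.
```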

Kudos to @orenpereg and @nbertagnolli for implementing both of these features 🔥

Bug fixes and improvements

Significant community contributions

The following contributors have made significant changes to the library over the last release: