
v1.1.0 - Sentence Transformers as the finetuning backend; tackle deprecations of other dependencies

@tomaarsen released this 19 Sep 09:28

This release introduces a new backend to finetune embedding models, based on the Sentence Transformers Trainer, tackles deprecations of other dependencies like transformers, deprecates Python 3.7 while adding support for new Python versions, and applies some other minor fixes. There shouldn't be any breaking changes.

Install this version with

pip install -U setfit

Defer the embedding model finetuning phase to Sentence Transformers (#554)

In SetFit v1.0, the old model.fit training from Sentence Transformers was replaced by a custom training loop that added features the former lacked, such as loss logging, useful callbacks, etc. However, Sentence Transformers v3 has since been released, adding all of the features that were previously missing. To simplify maintenance moving forward, training is now (once again) deferred to Sentence Transformers.

Because both the old and new training approaches are inspired by the transformers Trainer, there should not be any breaking changes. The primary notable change is that training now requires accelerate (as Sentence Transformers requires it), and we benefit from some of the Sentence Transformers training features, such as multi-GPU training.

Solve discrepancies with new versions of dependencies

To ensure compatibility with the latest versions of dependencies, the following issues have been addressed:

  • Follow the (soft) deprecation of evaluation_strategy to eval_strategy (#538). This previously resulted in crashes if your transformers version was too new.
  • Avoid the now-deprecated DatasetFilter (#527). This previously resulted in crashes if your huggingface-hub version was too new.

Python version support

  • Following the deprecation of Python 3.7 by the Python team, Python 3.7 is now also deprecated by SetFit moving forward. (#506)
  • We've added official support for Python 3.11 and 3.12 now that both are included in our test suite. (#550)

Minor changes

  • Firm up max_steps and eval_max_steps: rather than rough upper bounds, these limits are now exact. This can help avoid memory overflow, especially with notably imbalanced datasets. (#549)
  • Training and validation losses are now nicely logged in notebooks. (#557)

Minor bug fixes

  • Fix bug where device parameter in SetFitHead is ignored if CUDA is not available. (#518)


Full Changelog: v1.0.3...v1.1.0