Commit 5bccba3

[chore] If Transformers 4.46.0, use processing_class instead of tokenizer when saving (#3038)

tomaarsen authored Nov 6, 2024
1 parent cb81136 · commit 5bccba3
Showing 1 changed file with 7 additions and 2 deletions.

sentence_transformers/trainer.py (7 additions, 2 deletions)
@@ -866,8 +866,13 @@ def _save(self, output_dir: str | None = None, state_dict=None) -> None:

         self.model.save_pretrained(output_dir, safe_serialization=self.args.save_safetensors)

-        if self.tokenizer is not None:
-            self.tokenizer.save_pretrained(output_dir)
+        # Transformers v4.46.0 changed the `tokenizer` attribute to a more general `processing_class` attribute
+        if parse_version(transformers_version) >= parse_version("4.46.0"):
+            if self.processing_class is not None:
+                self.processing_class.save_pretrained(output_dir)
+        else:
+            if self.tokenizer is not None:
+                self.tokenizer.save_pretrained(output_dir)

         # Good practice: save your training arguments together with the trained model
         torch.save(self.args, os.path.join(output_dir, TRAINING_ARGS_NAME))
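The diff gates on the installed Transformers version before choosing which attribute to save. A minimal standalone sketch of that pattern is below; the `processor_attr` helper and the inlined version parser are illustrative assumptions (the real code uses `packaging.version.parse` and operates on the trainer instance directly):

```python
def parse_version(v: str) -> tuple:
    # Minimal numeric version parser (sketch only; the actual code relies
    # on packaging.version.parse, which also handles pre-release tags).
    return tuple(int(p) for p in v.split(".") if p.isdigit())

def processor_attr(transformers_version: str) -> str:
    """Pick the trainer attribute to save based on the Transformers version.

    Transformers v4.46.0 renamed the Trainer's `tokenizer` attribute to the
    more general `processing_class`, so older versions need the old name.
    """
    if parse_version(transformers_version) >= parse_version("4.46.0"):
        return "processing_class"
    return "tokenizer"
```

With this gate, `processor_attr("4.45.2")` selects `tokenizer` while `processor_attr("4.46.0")` and anything newer selects `processing_class`, mirroring the branch added in `_save`.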
