Upgrade to Transformers v4.45 #1359

Merged
merged 27 commits on Oct 2, 2024

Commits (27)
5e88ce7
Upgrade to commit 74e19e81e2a23809af192532b9b0e7ea202be6f2
regisss Aug 28, 2024
43bc4eb
Merge branch 'main' into transformers_future
regisss Sep 2, 2024
8eea643
Add specific commit in setup.py
regisss Sep 3, 2024
a7be363
Upgrade to commit e48e5f1f13e05380e24f4f31f5fee07aa6f959eb
regisss Sep 6, 2024
f0b909a
Merge branch 'main' into transformers_future
regisss Sep 6, 2024
d99f18f
Fix default cache
regisss Sep 9, 2024
39b7a76
Merge branch 'main' into transformers_future
regisss Sep 9, 2024
da66ecf
Merge branch 'main' into transformers_future
regisss Sep 9, 2024
5547767
Merge branch 'main' into transformers_future
regisss Sep 10, 2024
bf89e41
Merge branch 'main' into transformers_future
regisss Sep 10, 2024
98b0da5
Merge branch 'main' into transformers_future
regisss Sep 24, 2024
47ad03c
Upgrade to commit 238b13478df209ab534f2195a397dc64a3930883
regisss Sep 24, 2024
94c23ba
Fix
regisss Sep 24, 2024
c19dedd
Upgrade to v4.45.0
regisss Sep 25, 2024
c12fd7e
Merge branch 'main' into transformers_future
regisss Sep 25, 2024
fc399fa
Fix
regisss Sep 25, 2024
9216159
Add bias to gptj (#1363)
jiminha Sep 26, 2024
679365a
Switch roberta from sdpa to eager attn (#1361)
skaulintel Sep 26, 2024
1abd6ee
Update bloom attention forward reshape following the transformer chang…
yeonsily Sep 26, 2024
8043d2c
Workaround for Llava/Llava-next
regisss Sep 26, 2024
047e7ff
Fix reshape error in mamba (#1369)
hsubramony Sep 28, 2024
f89e03b
Merge branch 'main' into transformers_future
regisss Sep 30, 2024
2ae546a
Merge branch 'main' into transformers_future
regisss Sep 30, 2024
1b8a3f7
Fix contrastive search
regisss Oct 1, 2024
2332afb
Fix local variable 'image_features' referenced before assignment (#1383)
vidyasiv Oct 1, 2024
f62ecde
Use model.generation_config instead of model.config (#1384)
hsubramony Oct 2, 2024
a8fb8ac
Make style
regisss Oct 2, 2024

Files changed
@@ -46,7 +46,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.14.0", "To fix: pip install -r examples/pytorch/audio-classification/requirements.txt")
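The same one-line version bump recurs in every example script below. For readers unfamiliar with this guard, here is a minimal, illustrative sketch of its behaviour; the real check_min_version lives in transformers.utils and check_optimum_habana_min_version is defined (or stubbed) inside each example script, so this is not the upstream implementation.

# Illustrative sketch only: approximates what check_min_version does.
from packaging import version

import transformers


def check_min_version(min_version: str) -> None:
    # Fail fast if the installed transformers is older than the examples expect.
    if version.parse(transformers.__version__) < version.parse(min_version):
        raise ImportError(
            f"These examples require transformers>={min_version}, "
            f"but transformers=={transformers.__version__} is installed."
        )


# After this PR, every example script pins the floor at 4.45.0.
check_min_version("4.45.0")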
2 changes: 1 addition & 1 deletion examples/contrastive-image-text/run_bridgetower.py
@@ -56,7 +56,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/contrastive-image-text/requirements.txt")
2 changes: 1 addition & 1 deletion examples/contrastive-image-text/run_clip.py
@@ -61,7 +61,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/contrastive-image-text/requirements.txt")
@@ -63,7 +63,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=2.14.0", "To fix: pip install -r examples/pytorch/image-classification/requirements.txt")
2 changes: 1 addition & 1 deletion examples/language-modeling/run_clm.py
@@ -62,7 +62,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=2.14.0", "To fix: pip install -r examples/pytorch/language-modeling/requirements.txt")
2 changes: 1 addition & 1 deletion examples/language-modeling/run_mlm.py
@@ -61,7 +61,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=2.14.0", "To fix: pip install -r examples/pytorch/language-modeling/requirements.txt")
@@ -60,7 +60,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risk.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/language-modeling/requirements.txt")
2 changes: 1 addition & 1 deletion examples/language-modeling/run_prompt_tuning_clm.py
@@ -62,7 +62,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/language-modeling/requirements.txt")
2 changes: 1 addition & 1 deletion examples/question-answering/run_qa.py
@@ -60,7 +60,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/question-answering/requirements.txt")
2 changes: 1 addition & 1 deletion examples/question-answering/run_seq2seq_qa.py
@@ -56,7 +56,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/question-answering/requirements.txt")
@@ -59,7 +59,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.18.0", "To fix: pip install -r examples/pytorch/speech-recognition/requirements.txt")
@@ -55,7 +55,7 @@ def check_optimum_habana_min_version(*a, **b):


 # Will error if the minimal version of Transformers is not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.18.0", "To fix: pip install -r examples/pytorch/speech-recognition/requirements.txt")
@@ -580,7 +580,8 @@ def compute_metrics(pred):
 # save feature extractor, tokenizer and config
 feature_extractor.save_pretrained(training_args.output_dir)
 tokenizer.save_pretrained(training_args.output_dir)
-config.save_pretrained(training_args.output_dir)
+# TODO: uncomment the line below when this is fixed in Transformers
+# config.save_pretrained(training_args.output_dir)

 processor = AutoProcessor.from_pretrained(training_args.output_dir)
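For context on the second hunk above, a hedged sketch (not the script's exact code) of why the commented-out config save can be tolerated here: a speech processor such as Wav2Vec2Processor can be saved and reloaded from just its feature extractor and tokenizer files, without a model config.json in the directory. The checkpoint name and output path below are placeholders.

# Placeholder names; illustrative only.
from transformers import AutoFeatureExtractor, AutoTokenizer, Wav2Vec2Processor

checkpoint = "facebook/wav2vec2-base-960h"  # placeholder checkpoint
output_dir = "./ctc_output"                 # placeholder output directory

feature_extractor = AutoFeatureExtractor.from_pretrained(checkpoint)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Save the two components the example script still writes after this change.
feature_extractor.save_pretrained(output_dir)
tokenizer.save_pretrained(output_dir)

# A processor pairs them back up; no config.json is needed for this step.
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)
# Or reload from the files just saved:
processor = Wav2Vec2Processor.from_pretrained(output_dir)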
@@ -19,7 +19,7 @@ def check_optimum_habana_min_version(*a, **b):
     return ()


-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 # Setup logging
2 changes: 1 addition & 1 deletion examples/summarization/run_summarization.py
@@ -65,7 +65,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/summarization/requirements.txt")
2 changes: 1 addition & 1 deletion examples/text-classification/run_glue.py
@@ -57,7 +57,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/text-classification/requirements.txt")
2 changes: 1 addition & 1 deletion examples/translation/run_translation.py
@@ -62,7 +62,7 @@ def check_optimum_habana_min_version(*a, **b):
 logger = logging.getLogger(__name__)

 # Will error if the minimal version of Transformers and Optimum Habana are not installed. Remove at your own risks.
-check_min_version("4.43.0")
+check_min_version("4.45.0")
 check_optimum_habana_min_version("1.14.0.dev0")

 require_version("datasets>=1.8.0", "To fix: pip install -r examples/pytorch/translation/requirements.txt")
1 change: 0 additions & 1 deletion optimum/habana/transformers/generation/__init__.py
@@ -3,7 +3,6 @@
 from .stopping_criteria import (
     gaudi_EosTokenCriteria_call,
     gaudi_MaxLengthCriteria_call,
-    gaudi_MaxNewTokensCriteria_call,
     gaudi_MaxTimeCriteria_call,
     gaudi_StoppingCriteriaList_call,
 )
@@ -8,8 +8,8 @@


 if TYPE_CHECKING:
-    from transformers.generation.logits_process import LogitsProcessorList
     from transformers.modeling_utils import PreTrainedModel
+    from transfromers.generation.logits_process import LogitsProcessorList

     from .configuration_utils import GaudiGenerationConfig
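As a side note on the hunk above, these imports sit under a TYPE_CHECKING guard, so they are only evaluated by static type checkers and never executed at runtime. A minimal sketch of the pattern; the function name below is made up purely for illustration.

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only imported for annotations; never executed at runtime.
    from transformers.generation.logits_process import LogitsProcessorList
    from transformers.modeling_utils import PreTrainedModel


def run_assisted_step(model: "PreTrainedModel", processors: "LogitsProcessorList") -> None:
    # Hypothetical function, shown only to illustrate string annotations
    # referring to names imported under TYPE_CHECKING.
    ...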
12 changes: 0 additions & 12 deletions optimum/habana/transformers/generation/stopping_criteria.py
@@ -52,18 +52,6 @@ def gaudi_MaxLengthCriteria_call(
     return create_return_const_tensor(input_ids, is_done)


-def gaudi_MaxNewTokensCriteria_call(
-    self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs
-) -> Union[torch.BoolTensor, bool]:
-    token_idx = kwargs.get("token_idx", None)
-    if token_idx is not None:
-        assert not kwargs["needs_tensor_output"]
-        return token_idx >= self.max_length
-    else:
-        is_done = input_ids.shape[-1] >= self.max_length
-        return create_return_const_tensor(input_ids, is_done)

Collaborator comment:
@libinta MaxNewTokensCriteria no longer exists in transformers.

It was removed in this PR: https://github.com/huggingface/transformers/pull/32659/files#diff-6e63ae0764aa864afd5bae6d512677b99b5240cb98cb210190482bdbb6a85906

It was removed because it had already been slated for deprecation:

"The class MaxNewTokensCriteria is deprecated and will be removed in v4.43. "
f"Please use MaxLengthCriteria(max_length={start_length + max_new_tokens}) "



 def gaudi_MaxTimeCriteria_call(
     self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs
 ) -> Union[torch.BoolTensor, bool]:
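Following up on the collaborator comment above, a hedged sketch of the migration it points at: a stopping criterion formerly capped at start_length + max_new_tokens is now expressed with MaxLengthCriteria. The numeric values below are placeholders.

from transformers.generation.stopping_criteria import MaxLengthCriteria, StoppingCriteriaList

start_length = 32      # placeholder: prompt length in tokens
max_new_tokens = 128   # placeholder: generation budget

# Before (class removed upstream; see the linked transformers PR):
# criteria = StoppingCriteriaList([MaxNewTokensCriteria(start_length=start_length, max_new_tokens=max_new_tokens)])

# After:
criteria = StoppingCriteriaList([MaxLengthCriteria(max_length=start_length + max_new_tokens)])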