[WARNING|trainer.py:1272] 2024-04-27 12:04:25,428 >> Activated GaLoRE fine-tuning, depending on your model size and hardware, the training might take a while before starting. Please be patient !
/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/galore_torch/adamw.py:48: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set no_deprecation_warning=True to disable this warning
warnings.warn(
Traceback (most recent call last):
  File "/home/jiahui/workspace/nmt/thesis_nmt/mnmt/multi/scripts/run_translation.py", line 618, in <module>
    main()
  File "/home/jiahui/workspace/nmt/thesis_nmt/mnmt/multi/scripts/run_translation.py", line 534, in main
    train_result = trainer.train(resume_from_checkpoint=checkpoint)
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1848, in train
    return inner_training_loop(
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1949, in _inner_training_loop
    self.create_optimizer_and_scheduler(num_training_steps=max_steps)
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 981, in create_optimizer_and_scheduler
    self.create_optimizer()
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1038, in create_optimizer
    self.optimizer = optimizer_cls(optimizer_grouped_parameters, **optimizer_kwargs)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/galore_torch/adamw.py", line 64, in __init__
    super().__init__(params, defaults)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/torch/optim/optimizer.py", line 192, in __init__
    self.add_param_group(param_group)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/torch/optim/optimizer.py", line 535, in add_param_group
    raise ValueError("some parameters appear in more than one parameter group")
ValueError: some parameters appear in more than one parameter group
I encountered this error; how should I resolve it?
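This ValueError comes from torch.optim.Optimizer.add_param_group, which rejects any tensor that already sits in another parameter group. With GaLore, the Trainer builds one group for the modules matched by optim_target_modules and another for everything else, so a parameter that is reachable under several module names, e.g. tied embeddings, which are common in seq2seq translation models, can be collected twice. A minimal diagnostic sketch for spotting such tied weights; the mBART checkpoint is a stand-in, since the actual model is not shown in the log:

from collections import defaultdict

from transformers import AutoModelForSeq2SeqLM

# Hypothetical checkpoint: run_translation.py suggests a seq2seq NMT
# model, but the actual one does not appear in the traceback.
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/mbart-large-50")

# Collect every name under which each Parameter object is reachable.
# Tied weights (e.g. shared embeddings / lm_head) are a single tensor
# with several names, which is how one parameter can land in two groups.
names_by_tensor = defaultdict(list)
for module_name, module in model.named_modules():
    for param_name, param in module.named_parameters(recurse=False):
        full_name = f"{module_name}.{param_name}" if module_name else param_name
        names_by_tensor[id(param)].append(full_name)

for names in names_by_tensor.values():
    if len(names) > 1:
        print("tied parameter reachable as:", names)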
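For context, GaLore is enabled through TrainingArguments (this needs a transformers release with GaLore support, v4.39+, plus the galore_torch package): optim="galore_adamw" together with optim_target_modules, a list of names matched against the model's module names. A sketch under assumptions, since the original command line is not shown; the target-module names are illustrative for a BART/mBART-style translation model, and keeping them narrow enough that tied modules are matched only once is one thing worth trying against the duplicate-group error:

from transformers import Seq2SeqTrainingArguments

# Sketch only: output_dir and the target-module names are
# illustrative, not taken from the original run.
training_args = Seq2SeqTrainingArguments(
    output_dir="outputs/galore-nmt",
    optim="galore_adamw",  # GaLore variant of AdamW
    # Matched against module names; for BART/mBART-style models the
    # attention and feed-forward projections are plain nn.Linear
    # layers, while the tied embedding/lm_head weights stay excluded.
    optim_target_modules=["q_proj", "k_proj", "v_proj", "out_proj", "fc1", "fc2"],
    per_device_train_batch_size=8,
    learning_rate=2e-5,
)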