
ValueError: some parameters appear in more than one parameter group #41

Open · jiaohuix opened this issue Apr 27, 2024 · 0 comments

@jiaohuix
I encountered the error below. How should I resolve it?

```
[WARNING|trainer.py:1272] 2024-04-27 12:04:25,428 >> Activated GaLoRE fine-tuning, depending on your model size and hardware, the training might take a while before starting. Please be patient !
/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/galore_torch/adamw.py:48: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set no_deprecation_warning=True to disable this warning
  warnings.warn(
Traceback (most recent call last):
  File "/home/jiahui/workspace/nmt/thesis_nmt/mnmt/multi/scripts/run_translation.py", line 618, in <module>
    main()
  File "/home/jiahui/workspace/nmt/thesis_nmt/mnmt/multi/scripts/run_translation.py", line 534, in main
    train_result = trainer.train(resume_from_checkpoint=checkpoint)
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1848, in train
    return inner_training_loop(
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1949, in _inner_training_loop
    self.create_optimizer_and_scheduler(num_training_steps=max_steps)
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 981, in create_optimizer_and_scheduler
    self.create_optimizer()
  File "/home/jiahui/workspace/nmt/thesis_nmt/transformers/src/transformers/trainer.py", line 1038, in create_optimizer
    self.optimizer = optimizer_cls(optimizer_grouped_parameters, **optimizer_kwargs)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/galore_torch/adamw.py", line 64, in __init__
    super().__init__(params, defaults)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/torch/optim/optimizer.py", line 192, in __init__
    self.add_param_group(param_group)
  File "/home/jiahui/anaconda3/envs/llm/lib/python3.10/site-packages/torch/optim/optimizer.py", line 535, in add_param_group
    raise ValueError("some parameters appear in more than one parameter group")
ValueError: some parameters appear in more than one parameter group
```
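
For context: the check that raises this error lives in `torch.optim.Optimizer.add_param_group`, which rejects any tensor that is listed in more than one parameter group. With GaLore in `transformers`, this typically means the `optimizer_grouped_parameters` built inside `Trainer.create_optimizer` ended up putting the same parameter in both the GaLore group and the regular group, for example if the `optim_target_modules` patterns overlap. A minimal sketch of the failure plus a hypothetical deduplication workaround (`dedup_param_groups` is illustrative, not part of `transformers` or `galore_torch`):

```python
import torch
from torch import nn

model = nn.Linear(4, 4)

# torch.optim.Optimizer raises exactly this ValueError when the same tensor
# object appears in two parameter groups:
groups = [
    {"params": list(model.parameters())},    # group 0: weight and bias
    {"params": [model.weight], "lr": 1e-4},  # group 1: weight again -> duplicate
]
try:
    torch.optim.AdamW(groups, lr=1e-3)
except ValueError as e:
    print(e)  # some parameters appear in more than one parameter group

# Hypothetical workaround: keep only the first occurrence of each parameter,
# so every group is disjoint before the optimizer sees it.
def dedup_param_groups(param_groups):
    seen = set()
    cleaned = []
    for group in param_groups:
        fresh = [p for p in group["params"] if id(p) not in seen]
        seen.update(id(p) for p in fresh)
        if fresh:
            cleaned.append({**group, "params": fresh})
    return cleaned

optimizer = torch.optim.AdamW(dedup_param_groups(groups), lr=1e-3)  # no error
```

Deduplicating is only a workaround; the better fix is to make the groups disjoint at the source, e.g. by tightening the `optim_target_modules` patterns so no parameter matches twice. Upgrading `transformers` may also help if the GaLore grouping logic has since been fixed.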
