Error During Model Saving QLORA + FSDP #2149
Comments
I was able to reproduce this on my end. Upon further digging, there is already a pending fix upstream in accelerate: huggingface/accelerate#3213. In the meantime, you can use …
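(The suggested workaround is cut off above. Purely as an assumption about how a pending accelerate fix is usually picked up early, one option would be installing from the PR head, e.g. `pip install git+https://github.com/huggingface/accelerate.git@refs/pull/3213/head`.)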
A similar issue is also mentioned here: huggingface/peft#2205
I will test the branch transformers-version-flexibility in my case.
Using the branch version, after the training I receive this error, both with `save_only_model: true` and without it:
Are you using LoRA/QLoRA? It looks like it's erroring because you are trying to full fine-tune (FFT) a quantized model.
I'm using QLoRA, more specifically QLoRA + FSDP.
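For context on why full fine-tuning a quantized model errors out: with QLoRA only the LoRA adapter weights are trainable while the 4-bit base weights stay frozen. A minimal sketch using standard transformers/peft APIs (model name and hyperparameters are illustrative, not taken from this report):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Load the base model in 4-bit; the quantized weights themselves are frozen.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "NousResearch/Llama-2-7b-hf",  # illustrative model, not from this report
    quantization_config=bnb_config,
)

# QLoRA trains only small LoRA adapter matrices on top of the frozen 4-bit base;
# attempting a full fine-tune of the base is what triggers the quantized-model error.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter parameters are trainable
```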
Sorry, it auto-closed due to the linked PR getting merged.
Please check that this issue hasn't been reported before.
Expected Behavior
The model should be saved without issue after training finishes.
Current behaviour
It raises an error that it couldn't find a Parameter in a list; the issue comes from the function `_unflatten_param_groups` in `python3.11/site-packages/torch/distributed/fsdp/_optim_utils.py`.
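For reference, here is a rough, self-contained sketch of the code path that exercises `_unflatten_param_groups` (a toy model, under the assumptions noted in the comments; this is not axolotl's actual saving code):

```python
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Assumes a process group already launched via torchrun, e.g.:
#   torchrun --nproc_per_node=2 this_script.py
dist.init_process_group("nccl")
torch.cuda.set_device(dist.get_rank())

# Toy stand-in for the real model; one step is enough to populate optimizer state.
model = FSDP(torch.nn.Linear(8, 8).cuda())
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model(torch.randn(4, 8, device="cuda")).sum().backward()
optimizer.step()

# At checkpoint time the full, unflattened optimizer state dict is rebuilt from
# the flattened FSDP shards; _unflatten_param_groups runs inside this call and
# is where the reported "parameter not in list" error surfaces.
optim_state = FSDP.optim_state_dict(model, optimizer)
if dist.get_rank() == 0:
    torch.save(optim_state, "optimizer.pt")
dist.destroy_process_group()
```

One might expect `save_only_model: true` to sidestep this path by never materializing the optimizer state, though per the comments above the error reportedly occurs either way.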
Using this config yml:
Config yaml
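(The reporter's actual config is collapsed above and not reproduced here. Purely for orientation, a minimal QLoRA + FSDP config in the style of axolotl's qlora-fsdp examples might look like the sketch below; the model name and hyperparameters are illustrative.)

```yaml
base_model: NousResearch/Llama-2-7b-hf  # illustrative, not from this report
load_in_4bit: true
adapter: qlora
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true

save_only_model: true  # tried both with and without, per the report above

fsdp:
  - full_shard
  - auto_wrap
fsdp_config:
  fsdp_offload_params: true
  fsdp_sync_module_states: true
  fsdp_use_orig_params: false
  fsdp_cpu_ram_efficient_loading: true
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
  fsdp_transformer_layer_cls_to_wrap: LlamaDecoderLayer
  fsdp_state_dict_type: FULL_STATE_DICT
```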
Possible solution
No response
Which Operating Systems are you using?
Python Version
3.11
axolotl branch-commit
main
Acknowledgements