What is the intended way of saving the base model when finetuning both an adapter and some layers in the base model? #1546
-
This is problematic because it breaks the optimizer each time I save the model. Is there a better way? I feel like I might be missing some function in the API.
Answered by younesbelkada, Mar 11, 2024
-
Ended up doing this but hoping there will be a more native solution in the future.
Hi @samedii,
in order to save both the adapter weights and some layers of the base model (which I assume are trainable), you need to declare a `modules_to_save` variable in your PeftConfig, and PEFT will automatically take care of saving/loading the correct modules.