About lora_vit.head bias #9

Open
zhongtao93 opened this issue Aug 5, 2024 · 1 comment

Comments

@zhongtao93

Thank you for open-sourcing the code!
I have a question about lora_vit.head. When saving the LoRA fc weight (

fc_tensors = {f"fc_{_in}in_{_out}out": self.lora_vit.fc.weight}

) and switching the LoRA fc weight (

self.lora_vit.head.weight = Parameter(self.fc_loras[idx])

), only the linear weight is saved and loaded, not the bias. But when using lora_vit, the head is created with bias=True in melo.lora_vit.head (examples.ipynb):

[screenshot: melo.lora_vit.head shown with bias=True in examples.ipynb]

so head.bias is left randomly initialized. Is this a bug, or intended?
Hoping for your reply~
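
For reference, one way to avoid the randomly initialized bias would be to save and restore both tensors of the head. This is a minimal sketch, not the repo's actual code; the 768/10 dimensions and the tensor key names are made up for illustration:

```python
import torch
from torch import nn
from torch.nn.parameter import Parameter

# Stand-in for lora_vit.head; per the issue it is created with bias=True.
head = nn.Linear(768, 10, bias=True)

# Save: keep the bias alongside the weight instead of the weight alone.
_in, _out = head.in_features, head.out_features
fc_tensors = {
    f"fc_{_in}in_{_out}out_weight": head.weight.detach().clone(),
    f"fc_{_in}in_{_out}out_bias": head.bias.detach().clone(),
}

# Load / switch: restore both tensors so head.bias is not left at its random init.
head.weight = Parameter(fc_tensors[f"fc_{_in}in_{_out}out_weight"])
head.bias = Parameter(fc_tensors[f"fc_{_in}in_{_out}out_bias"])
```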

@Eyvaz27

Eyvaz27 commented Sep 19, 2024

w_a_linear_q = nn.Linear(self.dim, r, bias=False)

The bias for the added low-rank layers is set to False, so there is no need to save it in memory.
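
For context, here is a minimal sketch of the pattern that line comes from (an assumed structure, not copied from this repo): the pretrained projection stays frozen, and the added low-rank A/B layers are plain nn.Linear modules created with bias=False, so only their weight matrices are new trainable tensors that need serializing.

```python
import torch
from torch import nn

class LoRALinear(nn.Module):
    """Wraps a frozen pretrained linear layer with a low-rank update."""

    def __init__(self, base: nn.Linear, r: int):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                              # freeze pretrained weights
        self.w_a = nn.Linear(base.in_features, r, bias=False)    # low-rank A, no bias
        self.w_b = nn.Linear(r, base.out_features, bias=False)   # low-rank B, no bias
        nn.init.zeros_(self.w_b.weight)                          # update starts at zero

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = base(x) + B(A(x)); only w_a / w_b weights are trainable and saved
        return self.base(x) + self.w_b(self.w_a(x))
```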
