Update dropout defaults for FacT finetuning method (#714)
anwai98 authored Oct 1, 2024
1 parent bd9f44c · commit e1bf659
Showing 1 changed file with 1 addition and 2 deletions.
micro_sam/models/peft_sam.py
@@ -70,7 +70,7 @@ def __init__(
         self,
         rank: int,
         block: nn.Module,
-        dropout: Optional[float] = None,
+        dropout: Optional[float] = 0.1,
     ):
         super().__init__()
         self.qkv_proj = block.attn.qkv
@@ -104,7 +104,6 @@ def forward(self, x):
         new_v = self.FacTv(new_v)

         # NOTE : Scaling Factor was set to 1 as it can be tuned via the learning rate
-        # Does it make sense to include it, in order to have similar learning rate as the original model?
         qkv[:, :, :, : self.dim] += new_q
         qkv[:, :, :, -self.dim:] += new_v

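For context, below is a minimal, self-contained sketch of how a FacT-style adapter around a SAM attention block could wire up this dropout parameter, and hence what the new default of 0.1 affects. The factor layers (FacTu, FacTv), their shapes, and the exact placement of the dropout are assumptions inferred from the identifiers visible in this diff (qkv_proj, FacTv, dim); this is not a verbatim copy of micro_sam's implementation.

from typing import Optional

import torch
import torch.nn as nn


class FacTSurgerySketch(nn.Module):
    """Hypothetical FacT-style adapter; names mirror the diff, shapes are assumed."""

    def __init__(self, rank: int, block: nn.Module, dropout: Optional[float] = 0.1):
        super().__init__()
        self.qkv_proj = block.attn.qkv        # qkv projection of the wrapped attention block
        self.dim = self.qkv_proj.in_features  # embedding dimension

        # Shared low-rank factors (assumed shapes, for illustration only).
        self.FacTu = nn.Linear(self.dim, rank, bias=False)
        self.FacTv = nn.Linear(rank, self.dim, bias=False)

        # The changed default: dropout is now on (p=0.1) unless explicitly disabled.
        self.dp = nn.Dropout(dropout) if dropout is not None else nn.Identity()

    def forward(self, x):
        qkv = self.qkv_proj(x)  # (..., 3 * dim)

        # Low-rank updates for the query and value projections,
        # with dropout applied to the rank-sized intermediate.
        new_q = self.FacTv(self.dp(self.FacTu(x)))
        new_v = self.FacTv(self.dp(self.FacTu(x)))

        # Scaling factor set to 1, as in the diff; it can be tuned via the learning rate.
        qkv[..., : self.dim] += new_q
        qkv[..., -self.dim:] += new_v
        return qkv


# Quick smoke test with a dummy attention block (purely illustrative).
import types

dim, rank = 64, 4
dummy = types.SimpleNamespace(attn=types.SimpleNamespace(qkv=nn.Linear(dim, 3 * dim)))
adapter = FacTSurgerySketch(rank=rank, block=dummy)
out = adapter(torch.randn(2, 8, 8, dim))  # 4D input, matching the indexing in the diff
print(out.shape)                          # torch.Size([2, 8, 8, 192])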
