Commit

Update modeling_aquila.py
ftgreat authored Oct 7, 2023
1 parent ae439ea commit 0ee67e5
Showing 1 changed file with 1 addition and 1 deletion.
flagai/model/aquila2/modeling_aquila.py: 1 addition & 1 deletion
@@ -357,7 +357,7 @@ def forward(
         value_states = repeat_kv(value_states, self.num_key_value_groups)

         attn_weights = torch.matmul(query_states, key_states.transpose(2, 3)) / math.sqrt(self.head_dim)
-        attn_weights = torch.clamp(attn_weights, min=-1024., max=1024.)
+        #attn_weights = torch.clamp(attn_weights, min=-1024., max=1024.)
         if attn_weights.size() != (bsz, self.num_heads, q_len, kv_seq_len):
             raise ValueError(
                 f"Attention weights should be of size {(bsz, self.num_heads, q_len, kv_seq_len)}, but is"
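
For context, here is a minimal, self-contained sketch of the score computation this hunk touches. It is not the FlagAI module itself: the standalone tensor names and toy shapes are assumptions made for illustration, following the usual (bsz, num_heads, seq_len, head_dim) layout of Llama-style attention. It shows what the now-commented line did, namely clipping the pre-softmax attention scores to the range [-1024, 1024].

# Minimal sketch (assumed names and shapes, not the FlagAI source).
import math

import torch

bsz, num_heads, q_len, kv_seq_len, head_dim = 1, 4, 8, 8, 64

# Hypothetical query/key projections, shaped (bsz, num_heads, seq_len, head_dim).
query_states = torch.randn(bsz, num_heads, q_len, head_dim)
key_states = torch.randn(bsz, num_heads, kv_seq_len, head_dim)

# Scaled dot-product scores, as in the hunk above.
attn_weights = torch.matmul(query_states, key_states.transpose(2, 3)) / math.sqrt(head_dim)

# The line this commit comments out: clip pre-softmax scores to [-1024, 1024].
clamped = torch.clamp(attn_weights, min=-1024., max=1024.)

# With the clamp disabled, the raw scores flow into softmax unmodified.
probs = torch.softmax(attn_weights, dim=-1)
print(clamped.shape, probs.shape)  # both torch.Size([1, 4, 8, 8])

For typical score magnitudes the clamp is a no-op; bounding scores like this usually matters only when values approach half-precision overflow, which is presumably why the bound was there in the first place.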
