This repository has been archived by the owner on Jun 24, 2024. It is now read-only.

Commit

fix attention_norm weight error
skirodev committed Jul 16, 2023
1 parent 71c3273 commit 565ca6d
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion crates/models/falcon/src/lib.rs
@@ -82,7 +82,7 @@ impl KnownModel for Falcon {
     input_layernorm_b: tl.load(&format!("{}.bias", input_layernorm_name))?,
     attention_norm: attention_norm_name
         .as_ref()
-        .map(|path| tl.load(&format!("{}.bias", path)))
+        .map(|path| tl.load(&format!("{}.weight", path)))
         .transpose()?,
     attention_norm_b: attention_norm_name
         .map(|path| tl.load(&format!("{}.bias", path)))
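The bug fixed here is a copy-paste error: the `attention_norm` field (the layer-norm scale) was loading the `.bias` tensor under the same prefix as `attention_norm_b`, instead of its own `.weight` tensor. A minimal sketch of the naming convention the fix relies on, with a hypothetical prefix and helper (not the repository's actual loader API):

```rust
// Hypothetical sketch: each layer-norm has a ".weight" (scale) and a
// ".bias" (shift) tensor stored under the same name prefix. Loading is
// done by building the full tensor name from prefix + suffix.
fn tensor_name(prefix: &str, suffix: &str) -> String {
    format!("{}.{}", prefix, suffix)
}

fn main() {
    // Hypothetical attention-norm prefix for layer 0.
    let attention_norm_name = "transformer.h.0.ln_attn";

    // Before the fix, both fields built the ".bias" name; attention_norm
    // must build the ".weight" name instead.
    let weight = tensor_name(attention_norm_name, "weight");
    let bias = tensor_name(attention_norm_name, "bias");

    assert_eq!(weight, "transformer.h.0.ln_attn.weight");
    assert_eq!(bias, "transformer.h.0.ln_attn.bias");
}
```

Because the two tensor names differ only in suffix, loading compiles and runs either way; the error would only surface at inference time as a wrong normalization scale, which is why this kind of slip is easy to miss in review.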
