Dear developer,
In your usage example for LinearAttentionTransformerLM, dim != dim_head * heads. I am a little confused about that. Is that an algorithmic feature?
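For context, here is a minimal sketch (not the library's actual code, and the function name is hypothetical) of why the two values need not match in multi-head attention layers generally: learned projections map the model dimension `dim` to an inner width `heads * dim_head` and then back, so `dim == heads * dim_head` is a common convention rather than a hard requirement.

```python
import numpy as np

def attention_projection_shapes(dim, heads, dim_head, seq=16):
    """Trace the tensor shapes through typical attention projections."""
    inner = heads * dim_head              # inner width, independent of dim
    to_q = np.zeros((dim, inner))         # stands in for a learned Linear layer
    to_out = np.zeros((inner, dim))       # projects back to the model dimension
    x = np.zeros((seq, dim))              # one sequence of token embeddings
    q = x @ to_q                          # (seq, heads * dim_head)
    y = q @ to_out                        # (seq, dim): back to the model width
    return q.shape, y.shape

# dim=512 with heads=8, dim_head=128: inner width 1024 != 512, shapes still work
q_shape, y_shape = attention_projection_shapes(dim=512, heads=8, dim_head=128)
print(q_shape, y_shape)  # (16, 1024) (16, 512)
```

The output projection restores the residual-stream width, which is why the forward pass stays consistent even when the inner attention width differs from `dim`.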