feat: enable head_dim=256 for attention kernels (#132) #35
