
Difference between attention contrib ops #15325

Closed · Answered by hariharans29
asyncth asked this question in Other Q&A

CC: @tianleiwu

The Attention op only supports self-attention, whereas MultiHeadAttention supports both self-attention and cross-attention.

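As a rough sketch of what that difference means in practice (assuming the com.microsoft contrib-op schemas, with optional inputs such as masks and past key/value state omitted and tensor names invented for illustration): Attention takes a single packed input plus fused QKV weights, so query, key, and value are always projected from the same sequence, while MultiHeadAttention takes separate query, key, and value inputs, so the key/value can come from a different sequence (e.g. an encoder output) for cross-attention.

```python
# Minimal sketch, not the full op signatures: the input lists below are
# trimmed and the tensor names are hypothetical.
from onnx import helper

# Attention: one packed input and fused QKV weight/bias, so Q, K, and V
# all come from the same sequence -> self-attention only.
self_attention = helper.make_node(
    "Attention",
    inputs=["hidden_states", "qkv_weight", "qkv_bias"],
    outputs=["attn_out"],
    domain="com.microsoft",
    num_heads=8,
)

# MultiHeadAttention: separate query/key/value inputs, so key and value
# may come from another sequence (e.g. encoder output) -> cross-attention,
# or from the same sequence -> self-attention.
cross_attention = helper.make_node(
    "MultiHeadAttention",
    inputs=["decoder_query", "encoder_key", "encoder_value"],
    outputs=["cross_attn_out"],
    domain="com.microsoft",
    num_heads=8,
)

print(self_attention.op_type, cross_attention.op_type)
```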
Answer selected by asyncth