remove GPTJ dma before mha #468
Conversation
Before this PR: [benchmark screenshot]
After this PR: [benchmark screenshot]
@regisss This PR is still WIP; there's another optimization pending, so we could do two things in one PR :)
@BaihuiJin Nice! No problem, let's wait a bit for the 2nd optimization and then we'll merge. |
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. |
Latest perf result (configs are the same as above): Throughput (including tokenization) = 4275.093059788448 tokens/second
@regisss @ZhaiFeiyue Please help review~
@BaihuiJin Nice perf improvement! A few comments added.
rotary_dim = self.config.rotary_dim
embed_dim = self.config.hidden_size
pos_embd_dim = rotary_dim or embed_dim
max_positions = self.config.max_position_embeddings
embed_positions = create_sinusoidal_positions(max_positions, pos_embd_dim).to(torch.bfloat16)
embed_positions = embed_positions.repeat(position_ids.shape[0], 1, 1)
if embed_positions.device != position_ids.device:
    embed_positions = embed_positions.to(position_ids.device)
repeated_position_ids = position_ids.unsqueeze(-1).repeat(1, 1, embed_positions.shape[-1])
sincos = torch.gather(embed_positions, 1, repeated_position_ids)
sin, cos = torch.split(sincos, sincos.shape[-1] // 2, dim=-1)
sin = sin.contiguous()
cos = cos.contiguous()
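For context, the gather-based sin/cos lookup above can be exercised standalone. The sketch below is a hypothetical, self-contained reconstruction (the real `create_sinusoidal_positions` lives in `transformers`; this local stand-in only mirrors its shape behavior) showing how the gather selects per-position rows and the split yields matching sin and cos halves:

```python
import torch

# Hypothetical stand-in mirroring the transformers GPT-J helper's output
# shape: rows are positions, columns are [sin | cos] halves.
def create_sinusoidal_positions(num_pos: int, dim: int) -> torch.Tensor:
    inv_freq = 1.0 / (10000 ** (torch.arange(0, dim, 2, dtype=torch.float32) / dim))
    angles = torch.einsum("i,j->ij", torch.arange(num_pos, dtype=torch.float32), inv_freq)
    return torch.cat((torch.sin(angles), torch.cos(angles)), dim=-1)

max_positions, pos_embd_dim = 16, 8
embed_positions = create_sinusoidal_positions(max_positions, pos_embd_dim)

position_ids = torch.tensor([[0, 1, 2]])  # (batch=1, seq=3)
# Same steps as the PR's block: broadcast the table to the batch, then
# gather one (sin|cos) row per position id.
embed_positions = embed_positions.repeat(position_ids.shape[0], 1, 1)
repeated_ids = position_ids.unsqueeze(-1).repeat(1, 1, embed_positions.shape[-1])
sincos = torch.gather(embed_positions, 1, repeated_ids)
sin, cos = torch.split(sincos, sincos.shape[-1] // 2, dim=-1)
print(sin.shape, cos.shape)  # torch.Size([1, 3, 4]) torch.Size([1, 3, 4])
```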
I understand that this piece of code comes from the code blocks that were removed above. Could this be moved to a dedicated method that would be called here please?
Theoretically it can be done, but testing shows an additional memcpy occurs; the perf drop details are as follows:
Throughput (including tokenization) = 3885.6094019038055 tokens/second
Memory allocated = 27.33 GB
Max memory allocated = 28.73 GB
Total memory available = 94.46 GB
Graph compilation duration = 8.958231755999805 seconds
FYI, the change looks like this:
def get_embed_positions(embed_positions, position_ids):
    embed_positions = embed_positions.repeat(position_ids.shape[0], 1, 1)
    if embed_positions.device != position_ids.device:
        embed_positions = embed_positions.to(position_ids.device)
    return embed_positions
That's surprising, as objects are passed to functions by reference if I'm not mistaken.
Okay, in that case, could you just add a comment above this block saying which methods it replaces, and also add a blank line right above and below please?
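A minimal sketch of the semantics under discussion, using a hypothetical copy of the extracted helper: Python does pass object references, so the tensor object itself is shared, but `repeat()` allocates a new tensor and rebinding the parameter name never changes the caller's variable. This alone doesn't prove where the extra memcpy came from; it only shows that call-by-reference is not violated by the refactor.

```python
import torch

# Hypothetical copy of the helper proposed in this thread.
def get_embed_positions(embed_positions, position_ids):
    # repeat() allocates a NEW tensor; rebinding the local name
    # does not mutate the caller's tensor.
    embed_positions = embed_positions.repeat(position_ids.shape[0], 1, 1)
    if embed_positions.device != position_ids.device:
        embed_positions = embed_positions.to(position_ids.device)
    return embed_positions

base = torch.zeros(1, 4, 8)                # (1, max_positions, pos_embd_dim)
ids = torch.zeros(2, 3, dtype=torch.long)  # (batch, seq)
out = get_embed_positions(base, ids)
print(base.shape)  # caller's tensor unchanged: torch.Size([1, 4, 8])
print(out.shape)   # new broadcasted tensor:   torch.Size([2, 4, 8])
```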
> That's surprising, as objects are passed to functions by reference if I'm not mistaken. Okay, in that case, could you just add a comment above this block saying which methods it replaces, and also add a blank line right above and below please?
Surprising indeed. Anyway, changed accordingly.
> That's surprising, as objects are passed to functions by reference if I'm not mistaken. Okay, in that case, could you just add a comment above this block saying which methods it replaces, and also add a blank line right above and below please?

By the way, I think `make style` removed the blank line I added below this block :)
LGTM!
What does this PR do?
Reduces max memory usage and increases performance.
Fixes # (issue)
Before submitting