
Add GroupNorm and LayerNorm onnx parsing #2242

Merged
7 commits merged into ROCm:develop from onnx_missing_norms on Oct 17, 2023

Conversation

@attila-dusnoki-htec (Contributor) commented Sep 26, 2023

This PR extends the normalization support with GroupNorm and LayerNorm.
Since they are very similar in implementation and usage, they are grouped into one PR.

I saw that there was already support for layernorm. This PR extends the parsing of the ONNX node, but still uses built-in primitives for the calculation.
Similar support for groupnorm is not added yet. Is it needed? DNNL supports it. If so, I would recommend a follow-up PR instead of extending this one.
Also, MIOpen just introduced layernorm as a primitive, but it is in beta.

Resolves migraphx-benchmark#94 and migraphx-benchmark#100
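As a rough illustration of what "uses built-in primitives for calculation" means, LayerNormalization can be decomposed into elementwise mean/variance/normalize steps rather than a single fused op. The sketch below is a hypothetical standalone C++ version of that decomposition (it is not the MIGraphX parser code; function name, signature, and the default `eps` are assumptions for illustration):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch: LayerNorm over the last axis of a [rows x cols]
// matrix, built from the same primitive steps a parser could lower to:
// y = (x - mean(x)) / sqrt(var(x) + eps) * scale + bias
std::vector<double> layer_norm(const std::vector<double>& x,
                               std::size_t rows, std::size_t cols,
                               const std::vector<double>& scale,
                               const std::vector<double>& bias,
                               double eps = 1e-5)
{
    std::vector<double> y(x.size());
    for(std::size_t r = 0; r < rows; ++r)
    {
        // reduce_mean over the normalized axis
        double mean = 0.0;
        for(std::size_t c = 0; c < cols; ++c)
            mean += x[r * cols + c];
        mean /= cols;
        // variance via squared differences from the mean
        double var = 0.0;
        for(std::size_t c = 0; c < cols; ++c)
        {
            double d = x[r * cols + c] - mean;
            var += d * d;
        }
        var /= cols;
        // normalize, then apply elementwise scale and bias
        double inv_std = 1.0 / std::sqrt(var + eps);
        for(std::size_t c = 0; c < cols; ++c)
            y[r * cols + c] =
                (x[r * cols + c] - mean) * inv_std * scale[c] + bias[c];
    }
    return y;
}
```

With `scale = 1` and `bias = 0`, a row `{1, 2, 3}` normalizes to roughly `{-1.2247, 0, 1.2247}`.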

codecov bot commented Sep 26, 2023

Codecov Report

Merging #2242 (18fa2fa) into develop (f25606f) will increase coverage by 0.03%.
The diff coverage is 97.77%.

❗ Current head 18fa2fa differs from the pull request's most recent head f349a53. Consider uploading reports for the commit f349a53 to get more accurate results.

@@             Coverage Diff             @@
##           develop    #2242      +/-   ##
===========================================
+ Coverage    91.30%   91.33%   +0.03%     
===========================================
  Files          436      438       +2     
  Lines        16345    16425      +80     
===========================================
+ Hits         14923    15001      +78     
- Misses        1422     1424       +2     
Files                           Coverage            Δ
src/onnx/parse_groupnorm.cpp    100.00% <100.00%>   (ø)
src/onnx/parse_layernorm.cpp     95.55% <95.55%>    (ø)

... and 2 files with indirect coverage changes

@umangyadav (Member) commented Oct 10, 2023

> Similar support for groupnorm is not added yet. Is it needed?

It would be good to add that, but I am not sure about the priority. We already have a matcher that pattern-matches primitives to LayerNorm. Perhaps it can be tweaked to match group norm as well.
cc: @pfultz2, what do you think?
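The relationship the matcher could exploit: GroupNorm over an NCHW tensor is equivalent to viewing the data as (N, G, C/G·H·W) and applying a LayerNorm-style normalization to each group. The sketch below is a hypothetical standalone C++ illustration of that equivalence (not the MIGraphX matcher; the function name, layout assumptions, and `eps` are illustrative only):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch: GroupNorm on a contiguous NCHW tensor, treated as
// (N, G, C/G * H * W) with each group normalized like LayerNorm's last axis.
std::vector<double> group_norm(const std::vector<double>& x,
                               std::size_t n, std::size_t c,
                               std::size_t h, std::size_t w,
                               std::size_t groups, double eps = 1e-5)
{
    std::vector<double> y(x.size());
    std::size_t group_size = (c / groups) * h * w; // elements per group
    for(std::size_t g = 0; g < n * groups; ++g)
    {
        std::size_t off = g * group_size;
        // mean and variance over the whole group
        double mean = 0.0;
        for(std::size_t j = 0; j < group_size; ++j)
            mean += x[off + j];
        mean /= group_size;
        double var = 0.0;
        for(std::size_t j = 0; j < group_size; ++j)
        {
            double d = x[off + j] - mean;
            var += d * d;
        }
        var /= group_size;
        // normalize each element within its group
        double inv_std = 1.0 / std::sqrt(var + eps);
        for(std::size_t j = 0; j < group_size; ++j)
            y[off + j] = (x[off + j] - mean) * inv_std;
    }
    return y;
}
```

For example, with N=1, C=2, H=1, W=2 and 2 groups, each channel forms its own group and is normalized independently.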

@CharlieL7 (Collaborator) left a comment

Looks good overall. Doesn't support dynamic shapes; will have to make updates for that later on. Add some fp16 verify_onnx tests.

Comment on lines +110 to +122

    if(skipped_axes > 0)
    {
        auto x_dims = x_shape.lens();
        scale_bcast = info.add_instruction(
            make_op("broadcast", {{"axis", skipped_axes}, {"out_lens", x_dims}}), scale);
        if(not skip_bias)
        {
            bias_bcast = info.add_instruction(
                make_op("broadcast", {{"axis", skipped_axes}, {"out_lens", x_dims}}), bias);
        }
    }
Collaborator:

Note that this will not work if x_shape is dynamic. That is probably outside of what we want to get done with this PR, however, so it should be fine.
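To make the broadcast in the snippet above concrete: a 1-D scale (or bias) of length `out_lens[axis]` is replicated across every other dimension of the output shape, with `skipped_axes` playing the role of the target axis. Below is a hypothetical standalone C++ sketch of that behavior, assuming standard broadcast-along-axis semantics (it is not the MIGraphX `broadcast` implementation, and the function name is made up):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch: replicate a 1-D vector v (length out_lens[axis])
// along every other dimension of a static output shape out_lens.
std::vector<double> broadcast_1d(const std::vector<double>& v,
                                 const std::vector<std::size_t>& out_lens,
                                 std::size_t axis)
{
    std::size_t total = 1;
    for(auto l : out_lens)
        total *= l;
    // number of elements "inside" the broadcast axis (dims after it)
    std::size_t inner = 1;
    for(std::size_t d = axis + 1; d < out_lens.size(); ++d)
        inner *= out_lens[d];
    // each flat output index maps back to its coordinate along `axis`
    std::vector<double> out(total);
    for(std::size_t i = 0; i < total; ++i)
        out[i] = v[(i / inner) % out_lens[axis]];
    return out;
}
```

For a shape `{2, 3, 2}` and `axis = 1`, the value `v[k]` fills every position whose middle coordinate is `k`, which is exactly why the code needs the static `x_dims` up front and fails for dynamic shapes.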

test/onnx/verify_onnx.cpp: outdated review thread (resolved)
@causten causten merged commit 52c74f0 into ROCm:develop Oct 17, 2023
14 of 15 checks passed
@causten causten deleted the onnx_missing_norms branch October 17, 2023 18:13
Development

Successfully merging this pull request may close these issues.

GroupNormalization operator is unsupported
4 participants