
Ensure unique module name for MLIR standalone ops #2360

Merged (2 commits into develop on Oct 24, 2023)

Conversation

@pfultz2 (Collaborator) commented Oct 24, 2023

No description provided.
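The PR has no description, but its title names a common technique: give each MLIR standalone-op module a name that cannot collide with an already-registered module. A rough sketch of that idea is below; this is a generic illustration only, not the code merged in MIGraphX — `unique_module_name` and the `:N` suffix scheme are invented for this example.

```cpp
#include <cstddef>
#include <string>
#include <unordered_set>

// Hypothetical helper: derive a unique module name by appending an
// increasing numeric suffix until the candidate is not already taken.
// The chosen name is recorded in `used` so later calls see it.
std::string unique_module_name(const std::string& base,
                               std::unordered_set<std::string>& used)
{
    std::string name = base;
    std::size_t n   = 0;
    while(used.count(name) > 0)
        name = base + ":" + std::to_string(n++);
    used.insert(name);
    return name;
}
```

Repeated requests for the same base name then yield `mlir_dot`, `mlir_dot:0`, `mlir_dot:1`, and so on, so two standalone ops compiled from the same operator never share a module symbol.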

@codecov codecov bot commented Oct 24, 2023

Codecov Report

Merging #2360 (82a9837) into develop (7604ecf) will not change coverage.
The diff coverage is n/a.

❗ Current head 82a9837 differs from pull request most recent head 5db8108. Consider uploading reports for the commit 5db8108 to get more accurate results.

@@           Coverage Diff            @@
##           develop    #2360   +/-   ##
========================================
  Coverage    91.36%   91.36%           
========================================
  Files          440      440           
  Lines        16530    16530           
========================================
  Hits         15101    15101           
  Misses        1429     1429           

@migraphx-bot (Collaborator)

| Test | Batch | Rate new (5db810) | Rate old (8f9ccb) | Diff |
|---|---|---|---|---|
| torchvision-resnet50 | 64 | 2,851.53 | 2,854.63 | -0.11% |
| torchvision-resnet50_fp16 | 64 | 6,478.03 | 6,489.92 | -0.18% |
| torchvision-densenet121 | 32 | 2,091.02 | 2,106.61 | -0.74% |
| torchvision-densenet121_fp16 | 32 | 3,667.28 | 3,680.45 | -0.36% |
| torchvision-inceptionv3 | 32 | 1,596.64 | 1,595.87 | 0.05% |
| torchvision-inceptionv3_fp16 | 32 | 2,590.76 | 2,592.37 | -0.06% |
| cadene-inceptionv4 | 16 | 707.19 | 707.68 | -0.07% |
| cadene-resnext64x4 | 16 | 697.34 | 698.44 | -0.16% |
| slim-mobilenet | 64 | 8,359.45 | 8,358.57 | 0.01% |
| slim-nasnetalarge | 64 | 226.84 | 226.81 | 0.01% |
| slim-resnet50v2 | 64 | 2,674.31 | 2,676.63 | -0.09% |
| bert-mrpc-onnx | 8 | 825.25 | 824.55 | 0.09% |
| bert-mrpc-tf | 1 | 389.97 | 388.46 | 0.39% |
| pytorch-examples-wlang-gru | 1 | 300.16 | 296.63 | 1.19% |
| pytorch-examples-wlang-lstm | 1 | 311.17 | 310.39 | 0.25% |
| torchvision-resnet50_1 | 1 | 600.30 | 595.23 | 0.85% |
| torchvision-inceptionv3_1 | 1 | 338.75 | 339.04 | -0.08% |
| cadene-dpn92_1 | 1 | 395.05 | 395.45 | -0.10% |
| cadene-resnext101_1 | 1 | 329.71 | 330.25 | -0.17% |
| slim-vgg16_1 | 1 | 464.63 | 462.27 | 0.51% |
| slim-mobilenet_1 | 1 | 2,037.36 | 2,049.06 | -0.57% |
| slim-inceptionv4_1 | 1 | 216.25 | 216.99 | -0.34% |
| onnx-taau-downsample | 1 | 306.58 | 306.09 | 0.16% |
| dlrm-criteoterabyte | 1 | 21.68 | 21.69 | -0.05% |
| dlrm-criteoterabyte_fp16 | 1 | 40.72 | 40.69 | 0.08% |
| agentmodel | 1 | 5,847.95 | 5,766.41 | 1.41% |
| unet_fp16 | 2 | 55.98 | 55.98 | 0.00% |
| resnet50v1_fp16 | 1 | 951.39 | 936.62 | 1.58% |
| bert_base_cased_fp16 | 64 | 970.55 | 971.27 | -0.07% |
| bert_large_uncased_fp16 | 32 | 304.96 | 305.19 | -0.08% |
| bert_large_fp16 | 1 | 166.88 | 167.06 | -0.11% |
| distilgpt2_fp16 | 16 | 1,279.12 | 1,279.62 | -0.04% |

This build is OK for merge ✅

@migraphx-bot (Collaborator)
✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance
✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance
✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance
🔴 torchvision-inceptionv3_1: FAILED: MIGraphX is not within tolerance - check verbose output
✅ cadene-dpn92_1: PASSED: MIGraphX meets tolerance
✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance
✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance
✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance
🔴 slim-inceptionv4_1: FAILED: MIGraphX is not within tolerance - check verbose output
✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance
✅ agentmodel: PASSED: MIGraphX meets tolerance
✅ unet: PASSED: MIGraphX meets tolerance
✅ resnet50v1: PASSED: MIGraphX meets tolerance
🔴 bert_base_cased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
🔴 bert_large_uncased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
✅ bert_large: PASSED: MIGraphX meets tolerance
🔴 distilgpt2_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

@causten causten self-requested a review October 24, 2023 04:22
@causten causten merged commit d1abf06 into develop Oct 24, 2023
15 checks passed
@causten causten deleted the mlir-standalone-unique-module-name branch October 24, 2023 13:15
causten pushed a commit that referenced this pull request Oct 27, 2023
Cherry Pick ASAN build excluding additional bin files -1839 #2370
Use an older version of numpy for compatibility with Python3.6 #2369
Add space after rate message
Fix wrong size check when axes not present for slice (#2270)
Updated Changelog to document latest release work (#2363)
causten added a commit that referenced this pull request Nov 1, 2023
Cherry Pick ASAN build excluding additional bin files -1839 #2370
Use an older version of numpy for compatibility with Python3.6 #2369
Add space after rate message
Fix wrong size check when axes not present for slice (#2270)
Updated Changelog to document latest release work (#2363)
causten added a commit that referenced this pull request Nov 10, 2023
…2393)

Ensure unique module name for MLIR standalone ops (#2360)
Cherry Pick ASAN build excluding additional bin files -1839 #2370
Use an older version of numpy for compatibility with Python3.6 #2369
Add space after rate message
Fix wrong size check when axes not present for slice (#2270)
Updated Changelog to document latest release work (#2363)
Run simplify_qdq first before optimize_module (#2387)
fix unit tests and verification issues on navi
Set numpy to 1.21.6 and disable py3.6 test
Release notes & changelog updates (#2395)
3 participants