
Split onnx parse tests into separate files #2550

Merged — 23 commits merged into develop on Dec 19, 2023
Conversation

@pfultz2 (Collaborator) commented Dec 13, 2023

This puts the parse tests under test/onnx/parse. I plan to split the verify tests as well and put them into the test/onnx/verify directory. There is a cpp file for each onnx file (except the rnn tests, which I am not splitting).

In the future, we could also move the onnx files to the test/onnx/models directory, but for now I have just split the tests.
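As a rough sketch of the layout described above (the directory names come from the PR description, but the individual test file name below is hypothetical, not taken from the repository):

```shell
# Layout described in this PR: parse tests under test/onnx/parse,
# verify tests planned for test/onnx/verify.
mkdir -p test/onnx/parse test/onnx/verify

# One .cpp parse test per .onnx model; "add_test.cpp" is an
# illustrative placeholder name, not an actual file from the repo.
touch test/onnx/parse/add_test.cpp

ls test/onnx/parse
```

With this structure, each ONNX model's parse test lives in its own translation unit, so a failing test maps directly to one file.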

@codecov-commenter commented Dec 14, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison: base (362c5fe) 91.39% vs. head (97f474f) 91.39%.
Report is 4 commits behind head on develop.

❗ Current head 97f474f differs from the pull request's most recent head ec5d195. Consider uploading reports for commit ec5d195 to get more accurate results.


Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #2550      +/-   ##
===========================================
- Coverage    91.39%   91.39%   -0.01%     
===========================================
  Files          454      454              
  Lines        17193    17191       -2     
===========================================
- Hits         15713    15711       -2     
  Misses        1480     1480              


@migraphx-bot (Collaborator) commented Dec 14, 2023

| Test | Batch | Rate new (ec5d19) | Rate old (a11f9f) | Diff |
|---|---|---|---|---|
| torchvision-resnet50 | 64 | 2,834.49 | 2,833.53 | 0.03% |
| torchvision-resnet50_fp16 | 64 | 6,500.89 | 6,501.06 | -0.00% |
| torchvision-densenet121 | 32 | 2,084.73 | 2,089.91 | -0.25% |
| torchvision-densenet121_fp16 | 32 | 3,664.02 | 3,664.45 | -0.01% |
| torchvision-inceptionv3 | 32 | 1,595.46 | 1,597.13 | -0.10% |
| torchvision-inceptionv3_fp16 | 32 | 2,559.34 | 2,561.89 | -0.10% |
| cadene-inceptionv4 | 16 | 722.04 | 722.08 | -0.01% |
| cadene-resnext64x4 | 16 | 691.87 | 692.01 | -0.02% |
| slim-mobilenet | 64 | 8,331.81 | 8,336.61 | -0.06% |
| slim-nasnetalarge | 64 | 230.55 | 230.57 | -0.01% |
| slim-resnet50v2 | 64 | 2,661.83 | 2,662.60 | -0.03% |
| bert-mrpc-onnx | 8 | 824.03 | 824.12 | -0.01% |
| bert-mrpc-tf | 1 | 388.62 | 387.62 | 0.26% |
| pytorch-examples-wlang-gru | 1 | 304.43 | 298.79 | 1.89% |
| pytorch-examples-wlang-lstm | 1 | 306.22 | 314.63 | -2.67% |
| torchvision-resnet50_1 | 1 | 601.28 | 607.64 | -1.05% |
| torchvision-inceptionv3_1 | 1 | 344.21 | 345.23 | -0.30% |
| cadene-dpn92_1 | 1 | 404.32 | 404.54 | -0.05% |
| cadene-resnext101_1 | 1 | 328.08 | 328.67 | -0.18% |
| slim-vgg16_1 | 1 | 458.44 | 460.36 | -0.42% |
| slim-mobilenet_1 | 1 | 2,127.62 | 2,132.36 | -0.22% |
| slim-inceptionv4_1 | 1 | 214.68 | 214.32 | 0.17% |
| onnx-taau-downsample | 1 | 306.56 | 306.25 | 0.10% |
| dlrm-criteoterabyte | 1 | 21.60 | 21.62 | -0.08% |
| dlrm-criteoterabyte_fp16 | 1 | 40.62 | 40.65 | -0.06% |
| agentmodel | 1 | 6,006.19 | 5,921.64 | 1.43% |
| unet_fp16 | 2 | 54.72 | 54.77 | -0.09% |
| resnet50v1_fp16 | 1 | 942.13 | 920.20 | 2.38% |
| bert_base_cased_fp16 | 64 | 903.20 | 903.39 | -0.02% |
| bert_large_uncased_fp16 | 32 | 285.70 | 285.66 | 0.02% |
| bert_large_fp16 | 1 | 166.96 | 166.76 | 0.12% |
| distilgpt2_fp16 | 16 | 1,280.34 | 1,281.29 | -0.07% |

This build is OK for merge ✅

@migraphx-bot (Collaborator) commented:
     ✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance

     ✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance

     ✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance

     ✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance

     ✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance

     ✅ torchvision-inceptionv3_1: PASSED: MIGraphX meets tolerance

     ✅ cadene-dpn92_1: PASSED: MIGraphX meets tolerance

     ✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance

     ✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance

     ✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance

     ✅ slim-inceptionv4_1: PASSED: MIGraphX meets tolerance

     ✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance

     ✅ agentmodel: PASSED: MIGraphX meets tolerance

     ✅ unet: PASSED: MIGraphX meets tolerance

     ✅ resnet50v1: PASSED: MIGraphX meets tolerance

     ✅ bert_base_cased_fp16: PASSED: MIGraphX meets tolerance

     ✅ bert_large_uncased_fp16: PASSED: MIGraphX meets tolerance

     ✅ bert_large: PASSED: MIGraphX meets tolerance

🔴 distilgpt2_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

@lakhinderwalia (Contributor) left a comment:

Approving it, with some suggested cosmetic changes, below:

  1. Please spell-check the PR title.
  2. Please update the copyright timestamps, not just for the new files.

@pfultz2 changed the title from "Split onnx parse tests into seperate files" to "Split onnx parse tests into separate files" on Dec 15, 2023
@TedThemistokleous added the labels "Continous Integration" (Pull request updates parts of continous integration pipeline) and "Cleanup" (Cleans up code from stale bits/warnings/previous changes for a previous feature PR) on Dec 18, 2023
@TedThemistokleous (Collaborator) commented:
LGTM. Fix your CI errors (format/tidy) but looks good

@causten merged commit 1f2792a into develop on Dec 19, 2023 (14 of 16 checks passed).
@causten deleted the onnx-split-tests2 branch on December 19, 2023 at 16:25.
6 participants