Print Ref compiled program in verify tests #2530

Merged
1 commit merged into develop from print_ref on Dec 7, 2023

Conversation

umangyadav
Member

Solving #2525 and #2520 would mean changing the compilation pipeline for the ref program.
For debugging, it is easier to be able to print the compiled ref program for comparison.

This PR allows printing it. Previously, the verify tests printed the uncompiled program for "ref".
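
A minimal sketch of the idea, assuming the internal C++ API (migraphx::program::compile, migraphx::make_target, and the program's stream output operator); the helper name print_compiled_ref is illustrative and not part of this PR's diff:

```cpp
#include <iostream>
#include <migraphx/program.hpp>
#include <migraphx/register_target.hpp>

// Illustrative helper (hypothetical name): compile a copy of the program for
// the reference target and print the compiled IR, so a failing verify test
// shows what "ref" actually executed rather than the uncompiled graph.
void print_compiled_ref(migraphx::program p) // by value: leave the caller's program untouched
{
    p.compile(migraphx::make_target("ref")); // "ref" is the baseline interpreter-style target
    std::cout << p << std::endl;             // migraphx::program supports stream printing
}
```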

@umangyadav requested a review from CharlieL7 on December 7, 2023 14:26
@umangyadav self-assigned this on Dec 7, 2023
@umangyadav added the simple label (small or simple changes) on Dec 7, 2023
@umangyadav added the skip bot checks label (Skips the Performance and Accuracy CI tests) on Dec 7, 2023
Collaborator

@CharlieL7 left a comment


LGTM. This doesn't solve the issue from #2525, however, because the problem is that when ref runs with fp16 the accuracy is inherently lower than when running with higher floating-point precision. Comparing other targets to ref in the verify tests then becomes a wild goose chase to figure out where the error is coming from.

@umangyadav
Member Author

> LGTM. This doesn't solve the issue from #2525, however, because the problem is that when ref runs with fp16 the accuracy is inherently lower than when running with higher floating-point precision. Comparing other targets to ref in the verify tests then becomes a wild goose chase to figure out where the error is coming from.

Yes, this doesn't solve anything; it is just useful for debugging purposes.
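
To make the precision point concrete, here is a small, generic illustration in plain standard C++ (not MIGraphX code): a lower-precision accumulator drifts away from a higher-precision one as the sum grows, and fp16 drifts far more than float, so a mismatch against an fp16 ref baseline does not necessarily point at a bug in the other target.

```cpp
#include <cstdio>

int main()
{
    const int n = 10'000'000;
    float  sum_lo = 0.0f; // stand-in for a low-precision accumulator
    double sum_hi = 0.0;  // higher-precision reference
    for (int i = 0; i < n; ++i)
    {
        sum_lo += 0.1f; // rounding error accumulates as the running sum grows
        sum_hi += 0.1;
    }
    std::printf("float  accumulation: %.2f\n", sum_lo); // noticeably off from 1,000,000
    std::printf("double accumulation: %.2f\n", sum_hi); // very close to 1,000,000
}
```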

@migraphx-bot
Collaborator

Test | Batch | Rate new (d11208) | Rate old (a09dc5) | Diff
--- | --- | --- | --- | ---
torchvision-resnet50 | 64 | 2,834.63 | 2,833.31 | 0.05%
torchvision-resnet50_fp16 | 64 | 6,499.61 | 6,501.92 | -0.04%
torchvision-densenet121 | 32 | 2,093.02 | 2,094.20 | -0.06%
torchvision-densenet121_fp16 | 32 | 3,660.75 | 3,665.08 | -0.12%
torchvision-inceptionv3 | 32 | 1,598.67 | 1,594.40 | 0.27%
torchvision-inceptionv3_fp16 | 32 | 2,561.17 | 2,557.61 | 0.14%
cadene-inceptionv4 | 16 | 722.63 | 721.79 | 0.12%
cadene-resnext64x4 | 16 | 692.81 | 692.75 | 0.01%
slim-mobilenet | 64 | 8,324.54 | 8,332.19 | -0.09%
slim-nasnetalarge | 64 | 230.67 | 230.63 | 0.02%
slim-resnet50v2 | 64 | 2,665.19 | 2,664.91 | 0.01%
bert-mrpc-onnx | 8 | 824.34 | 823.07 | 0.15%
bert-mrpc-tf | 1 | 390.56 | 389.27 | 0.33%
pytorch-examples-wlang-gru | 1 | 301.15 | 299.82 | 0.44%
pytorch-examples-wlang-lstm | 1 | 316.51 | 308.52 | 2.59%
torchvision-resnet50_1 | 1 | 607.67 | 602.66 | 0.83%
torchvision-inceptionv3_1 | 1 | 341.01 | 343.14 | -0.62%
cadene-dpn92_1 | 1 | 402.83 | 404.59 | -0.44%
cadene-resnext101_1 | 1 | 328.93 | 328.32 | 0.19%
slim-vgg16_1 | 1 | 459.43 | 460.67 | -0.27%
slim-mobilenet_1 | 1 | 2,131.89 | 2,099.38 | 1.55%
slim-inceptionv4_1 | 1 | 213.71 | 213.62 | 0.05%
onnx-taau-downsample | 1 | 305.89 | 304.86 | 0.34%
dlrm-criteoterabyte | 1 | 21.59 | 21.62 | -0.15%
dlrm-criteoterabyte_fp16 | 1 | 40.62 | 40.65 | -0.08%
agentmodel | 1 | 6,021.05 | 5,935.95 | 1.43%
unet_fp16 | 2 | 54.76 | 54.68 | 0.14%
resnet50v1_fp16 | 1 | 948.01 | 932.98 | 1.61%
bert_base_cased_fp16 | 64 | 903.62 | 903.11 | 0.06%
bert_large_uncased_fp16 | 32 | 285.79 | 285.63 | 0.06%
bert_large_fp16 | 1 | 166.57 | 166.70 | -0.08%
distilgpt2_fp16 | 16 | 1,281.94 | 1,281.25 | 0.05%

This build is OK for merge ✅

@migraphx-bot
Collaborator


✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance
✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance
✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance
✅ torchvision-inceptionv3_1: PASSED: MIGraphX meets tolerance
✅ cadene-dpn92_1: PASSED: MIGraphX meets tolerance
✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance
✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance
✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance
✅ slim-inceptionv4_1: PASSED: MIGraphX meets tolerance
✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance
✅ agentmodel: PASSED: MIGraphX meets tolerance
✅ unet: PASSED: MIGraphX meets tolerance
✅ resnet50v1: PASSED: MIGraphX meets tolerance
🔴 bert_base_cased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
✅ bert_large_uncased_fp16: PASSED: MIGraphX meets tolerance
✅ bert_large: PASSED: MIGraphX meets tolerance
🔴 distilgpt2_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

@causten merged commit fe61d94 into develop on Dec 7, 2023
37 of 41 checks passed
@causten deleted the print_ref branch on December 7, 2023 19:49
Labels: simple (small or simple changes), skip bot checks (Skips the Performance and Accuracy CI tests)
Projects: None yet

6 participants