Fix remove rocblas tests #3100

Merged: 6 commits merged into develop on May 22, 2024
Conversation

tvukovic-amd (Collaborator) commented:

  • Added the unsupported fp8 type for dot and quant_dot when MIGRAPHX_USE_ROCBLAS is OFF, since the MLIR dot does not support fp8 (sketched below).
  • Execute the gemm tune tests only when MIGRAPHX_USE_ROCBLAS is ON, since they use rocBLAS.
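A rough sketch of that guard, assembled from the diff fragments quoted later in this conversation; the exact #if/#else structure and the surrounding target code may differ from the real change:

#ifndef MIGRAPHX_USE_ROCBLAS
// Without rocBLAS, gemms are lowered through rocMLIR, and the MLIR dot has no fp8
// support, so the fp8 variants of dot/quant_dot are marked as unsupported.
unsupported_fp8_ops.insert("dot");
unsupported_fp8_ops.insert("quant_dot");
#endif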

@apwojcik

@tvukovic-amd requested a review from causten as a code owner on May 17, 2024 07:59

codecov bot commented May 17, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 91.82%. Comparing base (1f07af9) to head (c97572a).
Report is 150 commits behind head on develop.

Additional details and impacted files
@@           Coverage Diff            @@
##           develop    #3100   +/-   ##
========================================
  Coverage    91.82%   91.82%           
========================================
  Files          486      486           
  Lines        18991    18991           
========================================
  Hits         17438    17438           
  Misses        1553     1553           


migraphx-bot (Collaborator) commented May 17, 2024

| Test | Batch | Rate new (c97572) | Rate old (93d77e) | Diff |
| --- | --- | --- | --- | --- |
| torchvision-resnet50 | 64 | 2,790.70 | 2,789.01 | 0.06% |
| torchvision-resnet50_fp16 | 64 | 6,204.32 | 6,208.17 | -0.06% |
| torchvision-densenet121 | 32 | 2,094.36 | 2,090.07 | 0.21% |
| torchvision-densenet121_fp16 | 32 | 3,615.08 | 3,614.88 | 0.01% |
| torchvision-inceptionv3 | 32 | 1,595.53 | 1,601.73 | -0.39% |
| torchvision-inceptionv3_fp16 | 32 | 2,557.93 | 2,559.67 | -0.07% |
| cadene-inceptionv4 | 16 | 716.58 | 716.67 | -0.01% |
| cadene-resnext64x4 | 16 | 678.32 | 678.24 | 0.01% |
| slim-mobilenet | 64 | 5,814.53 | 5,817.12 | -0.04% |
| slim-nasnetalarge | 64 | 154.28 | 154.27 | 0.01% |
| slim-resnet50v2 | 64 | 2,579.22 | 2,577.22 | 0.08% |
| bert-mrpc-onnx | 8 | 968.90 | 969.57 | -0.07% |
| bert-mrpc-tf | 1 | 405.88 | 408.71 | -0.69% |
| pytorch-examples-wlang-gru | 1 | 391.77 | 390.29 | 0.38% |
| pytorch-examples-wlang-lstm | 1 | 368.35 | 374.69 | -1.69% |
| torchvision-resnet50_1 | 1 | 603.21 | 601.08 | 0.35% |
| cadene-dpn92_1 | 1 | 383.69 | 384.30 | -0.16% |
| cadene-resnext101_1 | 1 | 325.01 | 323.16 | 0.57% |
| onnx-taau-downsample | 1 | 306.96 | 307.14 | -0.06% |
| dlrm-criteoterabyte | 1 | 28.54 | 28.50 | 0.12% |
| dlrm-criteoterabyte_fp16 | 1 | 47.17 | 47.20 | -0.06% |
| agentmodel | 1 | 7,584.43 | 7,635.69 | -0.67% |
| unet_fp16 | 2 | 56.50 | 56.51 | -0.02% |
| resnet50v1_fp16 | 1 | 891.98 | 911.93 | -2.19% |
| resnet50v1_int8 | 1 | 794.69 | 783.20 | 1.47% |
| bert_base_cased_fp16 | 64 | 1,021.04 | 1,021.20 | -0.01% |
| bert_large_uncased_fp16 | 32 | 299.00 | 298.94 | 0.02% |
| bert_large_fp16 | 1 | 157.81 | 156.34 | 0.94% |
| distilgpt2_fp16 | 16 | 1,830.77 | 1,831.72 | -0.05% |
| yolov5s | 1 | 478.01 | 476.00 | 0.42% |
| tinyllama | 1 | 33.01 | 33.00 | 0.03% |
| vicuna-fastchat | 1 | 158.69 | 157.90 | 0.51% |
| whisper-tiny-encoder | 1 | 349.66 | 350.12 | -0.13% |
| whisper-tiny-decoder | 1 | 401.76 | 401.76 | 0.00% |

This build is OK for merge ✅

migraphx-bot (Collaborator) commented May 17, 2024


✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance
✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance
✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance
✅ cadene-dpn92_1: PASSED: MIGraphX meets tolerance
✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance
✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance
✅ agentmodel: PASSED: MIGraphX meets tolerance
✅ unet: PASSED: MIGraphX meets tolerance
✅ resnet50v1: PASSED: MIGraphX meets tolerance
✅ bert_base_cased_fp16: PASSED: MIGraphX meets tolerance
🔴 bert_large_uncased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
✅ bert_large: PASSED: MIGraphX meets tolerance
✅ yolov5s: PASSED: MIGraphX meets tolerance
✅ tinyllama: PASSED: MIGraphX meets tolerance
✅ vicuna-fastchat: PASSED: MIGraphX meets tolerance
✅ whisper-tiny-encoder: PASSED: MIGraphX meets tolerance
✅ whisper-tiny-decoder: PASSED: MIGraphX meets tolerance
✅ distilgpt2_fp16: PASSED: MIGraphX meets tolerance

Comment on lines 107 to 109
// mlir doesn't support fp8 dot
unsupported_fp8_ops.insert("dot");
unsupported_fp8_ops.insert("quant_dot");
umangyadav (Member) commented May 17, 2024

This shouldn't be necessary.

MIGraphX doesn't pick rocMLIR for fp8 dot/quant_dot; it would use rocBLAS for fp8 dots and quant_dots.

Do you have a case where rocBLAS is disabled in MIGraphX?

A collaborator replied:

Yes, that is required when rocBLAS is not used (see MIGRAPHX_USE_ROCBLAS=OFF).

tvukovic-amd (author) replied:

Yes, this is added for the case where rocBLAS is disabled in MIGraphX. There is a MIGRAPHX_USE_ROCBLAS variable in the develop branch, so when it is set to OFF we need to add unsupported fp8 ops for dot and quant_dot.

#else
// mlir doesn't support fp8 dot
unsupported_fp8_ops.insert("dot");
unsupported_fp8_ops.insert("quant_dot");
A collaborator commented:

This is just duplicating code across both branches. It would be better to have a rocblas_fp8_available function that simply always returns false when rocBLAS is disabled (see the sketch below).
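A hedged sketch of that suggestion; the helper name rocblas_fp8_available comes from the comment above, while its signature, the std::set container, and the wrapper function are illustrative assumptions rather than the actual MIGraphX code:

#include <set>
#include <string>

// Single query for fp8 gemm support: reports whether rocBLAS can handle fp8,
// and simply returns false when rocBLAS is compiled out.
static bool rocblas_fp8_available()
{
#ifndef MIGRAPHX_USE_ROCBLAS
    return false;
#else
    return true; // placeholder for the real rocBLAS fp8 capability check
#endif
}

// The caller then collapses to one branch instead of duplicating the inserts.
static void mark_unsupported_fp8_ops(std::set<std::string>& unsupported_fp8_ops)
{
    if(not rocblas_fp8_available())
    {
        // mlir doesn't support fp8 dot
        unsupported_fp8_ops.insert("dot");
        unsupported_fp8_ops.insert("quant_dot");
    }
}

With this shape the build-flag difference is confined to the helper, which is one way the duplication across the two branches above could be removed.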

@tvukovic-amd force-pushed the fix_remove_rocblas_tests branch from c57faf5 to eb15631 on May 20, 2024 09:13
@causten merged commit fea6dd6 into develop on May 22, 2024
43 of 44 checks passed
@causten deleted the fix_remove_rocblas_tests branch on May 22, 2024 05:03