
Set the SO version correctly #2355

Merged 3 commits into develop on Oct 25, 2023
Conversation

pfultz2 (Collaborator) commented Oct 20, 2023

Different MIGraphX versions are not binary compatible, so we need to make sure a different SO major version is used for every release of MIGraphX.

@pfultz2 pfultz2 requested review from umangyadav and causten October 20, 2023 19:47
@@ -76,7 +76,8 @@ include(ROCMSetupVersion)
option(BUILD_DEV "Build for development purpose only" OFF)

rocm_setup_version(VERSION 2.8.0)
-set(MIGRAPHX_SO_VERSION ${PROJECT_VERSION_MAJOR}.${PROJECT_VERSION_MINOR})
+math(EXPR MIGRAPHX_SO_MAJOR_VERSION "(${PROJECT_VERSION_MAJOR} * 1000 * 1000) + (${PROJECT_VERSION_MINOR} * 1000) + ${PROJECT_VERSION_PATCH}")
+set(MIGRAPHX_SO_VERSION ${MIGRAPHX_SO_MAJOR_VERSION}.0)
Member

#2267 (comment)

Isn't rocm_set_soversion already doing this?

pfultz2 (Collaborator, Author) Oct 20, 2023

No it's not. For 2.7 it will set the SO version to 2.7.{rocm_version}, but the minor version should be folded into the major version (with no separate minor version), so it should be 2007000.0.{rocm_version}.

That's because between 2.6 and 2.7 we want different major versions for the SO. So we want the versions to be 2007000.0.{rocm_version1} and 2006000.0.{rocm_version2}, not 2.7.{rocm_version1} and 2.6.{rocm_version2} (which incorrectly implies they are compatible).
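The encoding can be sketched outside CMake as well; the helper below is illustrative (not part of MIGraphX) and mirrors the `math(EXPR ...)` expression in the diff:

```python
def so_major_version(major: int, minor: int, patch: int) -> int:
    """Fold major/minor/patch into a single SO major version,
    mirroring the CMake expression:
    (MAJOR * 1000 * 1000) + (MINOR * 1000) + PATCH."""
    return (major * 1000 * 1000) + (minor * 1000) + patch

# MIGraphX 2.7.0 and 2.6.0 get distinct SO major versions, so the
# dynamic linker treats the two libraries as incompatible.
print(so_major_version(2, 7, 0))  # 2007000
print(so_major_version(2, 6, 0))  # 2006000
```

Because the full SO major version differs between releases, the SONAME differs, which is what prevents a binary built against 2.6 from silently loading a 2.7 library.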

Member

@frepaul can you approve this? Would it result in any incompatibilities?

codecov bot commented Oct 20, 2023

Codecov Report

Merging #2355 (d3188f7) into develop (d1abf06) will not change coverage.
The report is 1 commit behind head on develop.
The diff coverage is n/a.

❗ Current head d3188f7 differs from pull request most recent head a4b2521. Consider uploading reports for the commit a4b2521 to get more accurate results

@@           Coverage Diff            @@
##           develop    #2355   +/-   ##
========================================
  Coverage    91.36%   91.36%           
========================================
  Files          440      440           
  Lines        16530    16530           
========================================
  Hits         15101    15101           
  Misses        1429     1429           

@umangyadav umangyadav requested a review from frepaul October 20, 2023 21:43
@migraphx-bot
Collaborator

migraphx-bot commented Oct 20, 2023

Test | Batch | Rate new (ac6a4d) | Rate old (7604ec) | Diff Compare
torchvision-resnet50 64 2,853.40 2,855.38 -0.07%
torchvision-resnet50_fp16 64 6,484.45 6,483.72 0.01%
torchvision-densenet121 32 2,099.50 2,107.65 -0.39%
torchvision-densenet121_fp16 32 3,680.61 3,679.81 0.02%
torchvision-inceptionv3 32 1,600.17 1,596.38 0.24%
torchvision-inceptionv3_fp16 32 2,594.10 2,590.76 0.13%
cadene-inceptionv4 16 707.96 708.30 -0.05%
cadene-resnext64x4 16 699.07 698.55 0.08%
slim-mobilenet 64 8,347.25 8,354.88 -0.09%
slim-nasnetalarge 64 227.13 226.97 0.07%
slim-resnet50v2 64 2,680.53 2,678.80 0.06%
bert-mrpc-onnx 8 825.11 825.26 -0.02%
bert-mrpc-tf 1 388.05 389.50 -0.37%
pytorch-examples-wlang-gru 1 293.21 297.44 -1.42%
pytorch-examples-wlang-lstm 1 310.22 310.76 -0.17%
torchvision-resnet50_1 1 599.19 603.77 -0.76%
torchvision-inceptionv3_1 1 339.83 337.26 0.76%
cadene-dpn92_1 1 394.01 393.84 0.04%
cadene-resnext101_1 1 329.92 329.00 0.28%
slim-vgg16_1 1 464.51 465.41 -0.19%
slim-mobilenet_1 1 2,032.76 2,060.76 -1.36%
slim-inceptionv4_1 1 215.69 217.78 -0.96%
onnx-taau-downsample 1 306.93 306.72 0.07%
dlrm-criteoterabyte 1 21.70 21.70 -0.01%
dlrm-criteoterabyte_fp16 1 40.72 40.73 -0.03%
agentmodel 1 5,909.78 5,827.27 1.42%
unet_fp16 2 56.01 56.00 0.01%
resnet50v1_fp16 1 947.02 946.70 0.03%
bert_base_cased_fp16 64 970.97 971.03 -0.01%
bert_large_uncased_fp16 32 305.31 305.19 0.04%
bert_large_fp16 1 167.27 166.63 0.38%
distilgpt2_fp16 16 1,280.65 1,279.92 0.06%
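The Diff column appears to be the percent change of the new rate relative to the old one; a minimal sketch under that assumption (the helper name is hypothetical, not part of migraphx-bot):

```python
def rate_diff_percent(rate_new: float, rate_old: float) -> float:
    """Percent change of the new benchmark rate relative to the old rate.
    Positive means the new build is faster."""
    return (rate_new - rate_old) / rate_old * 100

# torchvision-resnet50 row from the table above:
print(round(rate_diff_percent(2853.40, 2855.38), 2))  # -0.07
```

Reproducing a couple of rows (e.g. -0.07% for torchvision-resnet50, 0.01% for torchvision-resnet50_fp16) suggests the table is comparing against the old rate as the baseline.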

This build is OK for merge ✅

@migraphx-bot
Collaborator


✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance

✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance

✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance

✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance

✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance

🔴 torchvision-inceptionv3_1: FAILED: MIGraphX is not within tolerance - check verbose output

✅ cadene-dpn92_1: PASSED: MIGraphX meets tolerance

✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance

✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance

✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance

🔴 slim-inceptionv4_1: FAILED: MIGraphX is not within tolerance - check verbose output

✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance

✅ agentmodel: PASSED: MIGraphX meets tolerance

✅ unet: PASSED: MIGraphX meets tolerance

✅ resnet50v1: PASSED: MIGraphX meets tolerance

🔴 bert_base_cased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

🔴 bert_large_uncased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output

✅ bert_large: PASSED: MIGraphX meets tolerance

🔴 distilgpt2_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
umangyadav (Member) left a comment

Approving, as this change doesn't affect the API library. The API library follows whatever scheme rocm_set_soversion defines here:
(https://github.com/ROCmSoftwarePlatform/AMDMIGraphX/blob/d1abf06fa61a2833c7b66feae4a02696f8684db7/src/api/CMakeLists.txt#L33)

@causten causten merged commit 685168f into develop Oct 25, 2023
8 of 9 checks passed
@causten causten deleted the so-version branch October 25, 2023 02:02
4 participants