
[WebNN EP] Use opSupportLimits to dynamically check data type support #22025

Merged
5 commits merged into microsoft:main on Sep 14, 2024

Conversation

@Honry (Contributor) commented Sep 9, 2024:

  • Remove hard-coded data type checks and use WebNN's opSupportLimits instead (a rough sketch of the lookup is shown after this list)
  • Add HasSupportedOutputsImpl for output data type validation
  • Get preferred layout info from opSupportLimits
  • Move the Not op to logical_op_builder.cc, where it belongs. This avoids the inconsistent input names in unary_op_builder.cc.
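For the first and third bullets, here is a minimal sketch (not the code in this PR) of what the dynamic lookup can look like from the C++ side of the WebNN EP, assuming the opSupportLimits object has already been retrieved from the MLContext on the JavaScript side and is available as an emscripten::val. The helper name, parameter names, and the example op/operand strings are illustrative only; per my reading of the WebNN spec, the object contains per-op entries whose operands carry a dataTypes list, plus top-level fields such as preferredInputLayout.

```cpp
// Sketch only: illustrative helper, not the actual helper.cc implementation.
#include <string>
#include <emscripten/val.h>

// Returns true if `webnn_data_type` (e.g. "float16") is listed as supported
// for `operand_name` (e.g. "input") of `webnn_op_name` (e.g. "conv2d") in the
// opSupportLimits object obtained from the WebNN MLContext.
bool IsDataTypeSupportedByOp(const emscripten::val& op_support_limits,
                             const std::string& webnn_op_name,
                             const std::string& operand_name,
                             const std::string& webnn_data_type) {
  emscripten::val op_limits = op_support_limits[webnn_op_name];
  if (op_limits.isUndefined()) return false;
  emscripten::val operand_limits = op_limits[operand_name];
  if (operand_limits.isUndefined()) return false;
  emscripten::val data_types = operand_limits["dataTypes"];
  if (data_types.isUndefined()) return false;
  const unsigned length = data_types["length"].as<unsigned>();
  for (unsigned i = 0; i < length; ++i) {
    if (data_types[i].as<std::string>() == webnn_data_type) return true;
  }
  return false;
}

// The preferred layout bullet maps to the same object, e.g.:
//   std::string layout = op_support_limits["preferredInputLayout"].as<std::string>();
//   // expected to be "nchw" or "nhwc" per the WebNN spec.
```

The point of the change is that checks like this are driven entirely by what the backend reports at runtime, so the EP no longer needs hard-coded per-op data type tables.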

@Honry changed the title from "[WebNN EP] Use opSupportLimits to dynamicially check data type support" to "[WebNN EP] Use opSupportLimits to dynamically check data type support" on Sep 9, 2024.
@Honry (Contributor, Author) commented Sep 9, 2024:

@fdwr, @guschmue, PTAL, thanks!

cc/ @huningxin, @shiyi9801, @miaobin

- Remove hard code data type checks and use WebNN's opSupportLimits instead
- Add HasSupportedOutputsImpl for output data type validation
- Get preferred layout info from opSupportLimits
- Move Not op to logical_op_builder.cc because it should be there
@fdwr (Contributor) previously approved these changes on Sep 13, 2024 and left a comment:


😎 Cool. I have some polish thoughts, but else LGTM.

@fdwr added the ep:WebNN (WebNN execution provider) label on Sep 13, 2024.
@fdwr commented Sep 13, 2024:

/azp run Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline,Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux OpenVINO CI Pipeline,Linux QNN CI Pipeline,MacOS CI Pipeline,Windows ARM64 QNN CI Pipeline,Windows CPU CI Pipeline

@fdwr commented Sep 13, 2024:

/azp run Windows GPU TensorRT CI Pipeline,onnxruntime-binary-size-checks-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed,Windows x64 QNN CI Pipeline,Big Models

@fdwr commented Sep 13, 2024:

/azp run ONNX Runtime Web CI Pipeline,Windows GPU CI Pipeline


Azure Pipelines successfully started running 1 pipeline(s).


Azure Pipelines successfully started running 6 pipeline(s).


Azure Pipelines successfully started running 9 pipeline(s).

@guschmue (Contributor) commented:

/azp run ONNX Runtime Web CI Pipeline,Windows GPU CI Pipeline,Linux Android Emulator QNN CI Pipeline

@guschmue (Contributor) commented:

/azp run Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline,Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux OpenVINO CI Pipeline,Linux QNN CI Pipeline,MacOS CI Pipeline,Windows ARM64 QNN CI Pipeline,Windows CPU CI Pipeline


Azure Pipelines successfully started running 2 pipeline(s).

@guschmue (Contributor) commented:

/azp run Windows GPU TensorRT CI Pipeline,onnxruntime-binary-size-checks-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed,Windows x64 QNN CI Pipeline,Big Models

@guschmue (Contributor) commented:

/azp run Windows GPU CUDA CI Pipeline,Windows GPU DML CI Pipeline,Windows GPU Doc Gen CI Pipeline


Azure Pipelines successfully started running 3 pipeline(s).


Azure Pipelines successfully started running 6 pipeline(s).


Azure Pipelines successfully started running 9 pipeline(s).

@fdwr (Contributor) previously approved these changes on Sep 14, 2024 and left a comment:


Looking nice, notably shorter than before. I have some minor perf requests that hopefully won't touch too many lines.

Two review comments on onnxruntime/core/providers/webnn/builders/helper.cc (outdated, resolved).
@fdwr (Contributor) left a comment:


👍 Looks nice. TY.

@fdwr commented Sep 14, 2024:

/azp run ONNX Runtime Web CI Pipeline,Windows GPU CI Pipeline,Linux Android Emulator QNN CI Pipeline

@fdwr commented Sep 14, 2024:

/azp run Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline,Linux GPU CI Pipeline,Linux GPU TensorRT CI Pipeline,Linux OpenVINO CI Pipeline,Linux QNN CI Pipeline,MacOS CI Pipeline,Windows ARM64 QNN CI Pipeline,Windows CPU CI Pipeline


Azure Pipelines successfully started running 2 pipeline(s).

@fdwr commented Sep 14, 2024:

/azp run Windows GPU TensorRT CI Pipeline,onnxruntime-binary-size-checks-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed,Windows x64 QNN CI Pipeline,Big Models

@fdwr commented Sep 14, 2024:

/azp run Windows GPU CUDA CI Pipeline,Windows GPU DML CI Pipeline,Windows GPU Doc Gen CI Pipeline


Azure Pipelines successfully started running 3 pipeline(s).


Azure Pipelines successfully started running 9 pipeline(s).


Azure Pipelines successfully started running 6 pipeline(s).

@fdwr merged commit c63dd02 into microsoft:main on Sep 14, 2024.
71 checks passed
Labels: ep:WebNN (WebNN execution provider)
Projects: None yet
3 participants