
Turbine Camp #828

Open
3 of 25 tasks
pdhirajkumarprasad opened this issue Sep 12, 2024 · 2 comments
pdhirajkumarprasad commented Sep 12, 2024

Introduction

This page provides information on implementing complete support for ONNX operators in the Shark/IREE front-end. This effort is part of the overall ONNX quality improvement tracked in #812.

Pre-requisites

Development Workspace

Create a fork of https://github.com/llvm/torch-mlir, follow the build instructions to build your clone of the fork, and push changes using pull requests.
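A minimal sketch of that fork-and-branch workflow (the username, branch name, and docs path below are placeholders; follow the build documentation in the torch-mlir repo for the actual cmake/Python setup):

```shell
# Replace YOUR_USERNAME with your GitHub ID.
git clone https://github.com/YOUR_USERNAME/torch-mlir.git
cd torch-mlir
git remote add upstream https://github.com/llvm/torch-mlir.git

# Build per the repo's build documentation, then work on a topic branch:
git checkout -b fix-onnx-op        # hypothetical branch name
# ...make changes, commit...
git push origin fix-onnx-op        # then open a Pull Request from this branch
```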

Guidelines

  • Assign an ONNX op from the list below to your GitHub ID by hovering over it and clicking the bullseye that appears to the right; this creates a new issue. Assign that issue to yourself and use it to track your work.
  • Study the specification of the operator you picked thoroughly on onnx-ops.
  • Use alt_e2eshark to create multiple tests that comprehensively cover all features and attributes of the operator; this also helps you understand how the operator is expected to behave.
  • Test torch-mlir/IREE using the tests you created, and make code changes in torch-mlir until inference numerics are correct for the llvm-cpu target. If you hit an IREE codegen issue, file an issue on IREE and note it in the issue you created and assigned to yourself for the op.
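To illustrate what "comprehensively cover all features and attributes" means, here is a minimal sketch (not the alt_e2eshark harness; the helper name is hypothetical) of checking a NumPy reference implementation of ONNX LeakyRelu across several values of its `alpha` attribute, since a single default-attribute test would miss most of the operator's behavior:

```python
import numpy as np

def leaky_relu_ref(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # ONNX LeakyRelu spec: y = x for x >= 0, y = alpha * x for x < 0
    return np.where(x >= 0, x, alpha * x).astype(x.dtype)

# Cover the attribute space, not just the default: each alpha is one test case.
x = np.array([-2.0, -0.5, 0.0, 3.0], dtype=np.float32)
for alpha in (0.01, 0.2, 1.0):
    y = leaky_relu_ref(x, alpha)
    expected = np.array([-2.0 * alpha, -0.5 * alpha, 0.0, 3.0], dtype=np.float32)
    assert np.allclose(y, expected), f"mismatch for alpha={alpha}"
print("all attribute variants match")
```

The same pattern extends to shape variants, dtypes, and optional inputs; each combination becomes its own test case whose output is compared against the reference numerics.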

Contact:

ONNX Ops (Known Missing Features)

ONNX Ops (Known Quality Issues)

Other tasks:

llvm/torch-mlir#3796


knwng commented Oct 3, 2024

Seems LSTM also has a quality issue, but the test cases themselves seem buggy (wrong number of inputs and outputs). Besides, RNN only has one test case right now; I'm afraid it can't cover all the corner cases.

@vivekkhandelwal1 (Contributor)

> Seems LSTM also has a quality issue, but the test cases themselves seem buggy (wrong number of inputs and outputs). Besides, RNN only has one test case right now; I'm afraid it can't cover all the corner cases.

Yeah, the issue for LSTM is already open: #315.

@vivekkhandelwal1 vivekkhandelwal1 unpinned this issue Oct 8, 2024
@marbre marbre pinned this issue Oct 9, 2024