[tracking] OnnxToLinalg Op Support #450
@vivekkhandelwal1 and @rsuderman, the VAI ML team ran VISION CNN designs through the SHARK front end. The list of failures is:
I will make these tests available, but it may be more productive to first look for failing operator-level tests among the three operator suites and fix those.
@vivekkhandelwal1 According to the triage of the IREE ONNX test failures, below is the list of additional ops for which an OnnxToTorch lowering exists but fails during the TorchToLinalg lowering on the following tests. Please add them to the above list. To reproduce the issue:
This fixes some onnx lit tests not lowering to linalg in nod-ai/SHARK-ModelDev#450
This issue has become stale since we use #797 for tracking the ops that fail during the Torch->Linalg lowering. This issue was created to add support for the failing lit tests, so I am closing it now.
Below is the list of ops for which an OnnxToTorch lowering exists but fails during the TorchToLinalg lowering. Out of the 281 tests, only 25 are failing.
These ops do lower to Torch but fail during the TorchToLinalg lowering. To reproduce the error, take the lit test for the corresponding op from the test file and try to lower it to linalg separately; you will see the error.
To fix the issue, either modify the OnnxToTorch lowering of the corresponding op or add the missing support to the TorchToLinalg lowering.
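The repro flow described above can be sketched as a minimal lit-test-style fragment. This is a hypothetical example, not one of the failing tests: the op (`torch.aten.relu`), the tensor types, and the pass name `--convert-torch-to-linalg` are assumptions for illustration; in practice, copy the actual failing op's lit test from the OnnxToTorch test file and run it through the Torch->Linalg conversion on its own.

```mlir
// RUN: torch-mlir-opt %s --convert-torch-to-linalg | FileCheck %s
// Hypothetical fragment: the output of an OnnxToTorch lowering,
// fed separately into the TorchToLinalg conversion to surface the error.
func.func @test_op(%arg0: !torch.vtensor<[3,4],f32>) -> !torch.vtensor<[3,4],f32> {
  %0 = torch.aten.relu %arg0 : !torch.vtensor<[3,4],f32> -> !torch.vtensor<[3,4],f32>
  return %0 : !torch.vtensor<[3,4],f32>
}
```

If the lowering is missing or buggy, the `torch-mlir-opt` invocation above fails with a legalization or pattern-match error instead of producing linalg ops, which is the failure mode being tracked in this issue.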
Op | Original Author
High-priority Ops:
Low-priority Ops: