
[pull] main from llvm:main #19

Closed · wants to merge 10 commits

Conversation

pull[bot] commented May 6, 2024

See Commits and Changes for more details.


Created by pull[bot]

Can you help keep this open source service alive? 💖 Please sponsor : )

This fixes some ONNX lit tests that were not lowering to linalg in
nod-ai/SHARK-ModelDev#450.
@pull pull bot added the ⤵️ pull label May 6, 2024
vivekkhandelwal1 and others added 9 commits May 6, 2024 17:26
Adds OnnxToTorch lowering for the `onnx.HammingWindow` op.
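For reference, the Hamming window being lowered is the classic raised-cosine formula. A minimal sketch of it in plain Python, assuming the textbook 25/46 coefficients and a `periodic` flag analogous to the op's attribute (the ONNX spec pins the exact constants and attribute semantics, so treat this as illustrative, not authoritative):

```python
import math

def hamming_window(size: int, periodic: bool = True) -> list:
    # Classic Hamming coefficients: alpha = 25/46, beta = 1 - alpha.
    # These are assumptions for illustration; consult the ONNX
    # HammingWindow spec for the normative definition.
    alpha = 25.0 / 46.0
    beta = 1.0 - alpha
    # A periodic window divides by N, a symmetric one by N - 1.
    n = size if periodic else size - 1
    return [alpha - beta * math.cos(2.0 * math.pi * i / n)
            for i in range(size)]
```

For example, a symmetric window of odd length peaks at exactly 1.0 in the middle, and both forms start at `alpha - beta` at index 0.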
While waiting for the full resolution of feature request
pytorch/pytorch#117188
(which will propagate sparsity the right way in upstream PyTorch for all
FX graphs), this minor change allows us to start testing sparsity
"within" a network, rather than just in the parameters. Feel free to add
your own rules for testing (but keep them within reason for what will be
done upstream).

Note: two TODOs need to be addressed to work around some pending issues
before JIT execution works.
Decompose index_put-like ops to aten.index_put.hacked_twin for stricter semantics (#3071)

This PR decomposes all index_put-like ops to aten.index_put.hacked_twin for stricter semantics, i.e., no None index in the indices argument.
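The stricter semantics amount to materializing every implicit "all elements" (`None`) index as an explicit, broadcastable index tensor. A minimal sketch of that equivalence using NumPy's analogous advanced indexing (NumPy is used here only as a stand-in for the aten indexing semantics):

```python
import numpy as np

x1 = np.zeros((3, 4))
x2 = np.zeros((3, 4))
col = np.array([1, 3])

# "None index" form: dimension 0 is left implicit, meaning all rows.
x1[:, col] = 7.0

# Strict (hacked_twin-style) form: dimension 0 is materialized as an
# arange reshaped to broadcast against the column index tensor, so
# every dimension now carries an explicit index tensor.
rows = np.arange(3).reshape(3, 1)   # shape (3, 1) broadcasts with (2,)
x2[rows, col] = 7.0

assert np.array_equal(x1, x2)
```

The decomposition in the PR does the analogous rewrite on the torch side, so downstream lowerings never have to handle an optional (`None`) index.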
This small change seems to dramatically improve shape inference for
complex models and, consequently, improves ONNX importer reliability.
@vinayakdsci vinayakdsci closed this May 8, 2024
7 participants