
Proxy torch.nn.Parameter for PyTorch optimizers #3

Open
ifsheldon opened this issue Dec 30, 2021 · 2 comments

ifsheldon (Owner) commented Dec 30, 2021

Tin and Tube are now subclasses of torch.nn.Module and can have learnable parameters in the form of values in Taichi fields. However, these values cannot be optimized by PyTorch optimizers, since they are not PyTorch-compatible. One way to make them PyTorch-compatible is to use a proxy torch.nn.Parameter and sync its values with those in the corresponding Taichi field.

If anyone comes up with a better solution, discussions and PRs are always welcome.
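A minimal sketch of the proxy idea could look like the following. `ProxiedTin`, `push_to_taichi`, and `pull_grad_from_taichi` are hypothetical names used only for illustration; this is not the stannum API.

```python
# Sketch only: a proxy nn.Parameter kept in sync with a Taichi field by hand.
import taichi as ti
import torch
from torch import nn

ti.init(arch=ti.cpu)


class ProxiedTin(nn.Module):
    def __init__(self, shape):
        super().__init__()
        # Taichi field that the kernels actually read from and differentiate
        self.field = ti.field(dtype=ti.f32, shape=shape, needs_grad=True)
        # Proxy parameter that PyTorch optimizers can see and update
        self.weight = nn.Parameter(torch.zeros(shape))

    def push_to_taichi(self):
        # Copy the latest proxy values into the Taichi field before running kernels
        self.field.from_torch(self.weight.detach())

    def pull_grad_from_taichi(self):
        # Expose the gradients accumulated in the Taichi field to the optimizer
        self.weight.grad = self.field.grad.to_torch()


tin = ProxiedTin(shape=(16,))
optimizer = torch.optim.Adam(tin.parameters(), lr=1e-3)

tin.push_to_taichi()
# ... run the Taichi kernels and their backward pass here ...
tin.pull_grad_from_taichi()
optimizer.step()  # updates only the proxy; the next push_to_taichi() propagates it
```

The optimizer stays entirely unaware of Taichi in this sketch; the cost is two extra copies per step, which may be acceptable for small parameter fields.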

ifsheldon added the PyTorch-related and welcome_contribution (contributions are welcome) labels on Dec 30, 2021
ifsheldon moved this to Todo in stannum 1.0 on Sep 27, 2022
ifsheldon added this to the v1.0 milestone on Sep 27, 2022
ifsheldon changed the title from "Proxy torch.nn.Parameter in Tin for PyTorch optimizers" to "Proxy torch.nn.Parameter for PyTorch optimizers" on Jan 3, 2023
rdesc commented Oct 24, 2023

Hi! I'm hoping to try doing RL with difftaichi and stannum. Any update on this issue?

ifsheldon (Owner, Author) commented
Hi @rdesc! I'm sorry, I have little time to work on this. If you can make a PR, I will be happy to review it.
