add TV operator, merge ProximalGradient and AcceleratedProximalGradient, add GeneralizedProximalGradient #86
Conversation
Additional comment: I did not consider any backtracking for the GeneralizedProximalGradient.
Thanks! These are nice additions :) A few questions/comments:
Finally, since this PR introduces a few breaking changes (no problem since we are still in version 0), I want to make a release first so that people can still use their existing code, and then merge your PR as soon as the release is out.
@reivilo3 I am going through your new [...] where I just want to find an example to show where this new TV operator is useful to add as part of this PR.
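For readers unfamiliar with what the operator computes, a generic sketch of an anisotropic total-variation functional is below. This is only an illustration of the TV concept (sum of absolute forward differences along each axis), not the PR's actual implementation; the function name `tv_norm` is hypothetical.

```python
import numpy as np

def tv_norm(x):
    """Anisotropic total variation of an n-d array: the sum of absolute
    forward differences along each axis. Illustrative sketch only."""
    return sum(np.abs(np.diff(x, axis=ax)).sum() for ax in range(x.ndim))

# A piecewise-constant image has a small TV proportional to its edge length
img = np.zeros((4, 4))
img[1:3, 1:3] = 1.0
print(tv_norm(img))  # 8.0: four unit jumps per axis
```

TV is useful as a regularizer precisely because piecewise-constant signals like `img` score low, while noisy ones score high.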
[...]
I made a new commit with a tutorial-like Python notebook that uses ProximalGradient with this TV operator.
Thanks! I see now :) I think I got confused because the way all proximal operators are implemented is slightly different from your TV. Basically, to allow easy interoperability with pylops, we always force the input and output of a proximal operator to be a vector, and internally we do the needed reshaping (I will adapt this to be consistent). But now it's clear: also for the 2D case, the input and output of your TV will be a 2D array with the same dimensions :)

Regarding the notebook, can I move it to https://github.com/PyLops/pylops_notebooks/tree/master/pyproximal, or even better, you could make a PR. We don't want to have notebooks in the main library repo, as they don't play well with version control and actually add quite a lot to the size of the repo.

In the coming days, I will take some parts of your notebook and include them in an example in the documentation, like all the others we have, and do a bit of cleanup of the docstrings (I see some parameters are not used while still present in the docstring). Then we can merge this PR :)
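The "vector in, vector out, reshape internally" convention mentioned above can be sketched as follows. This is a minimal illustration of the idea, not PyProximal's actual class hierarchy; the class name `ReshapingProx` and its toy soft-thresholding prox are assumptions for the example.

```python
import numpy as np

class ReshapingProx:
    """Sketch of the convention described above: the prox accepts and
    returns flattened vectors, while the n-dimensional reshaping is
    handled internally. Not PyProximal's actual API."""

    def __init__(self, dims):
        self.dims = dims  # e.g. (ny, nx) for a 2D problem

    def prox(self, x, tau):
        X = x.reshape(self.dims)                          # internal reshaping
        # toy proximal operator: elementwise soft-thresholding
        Xp = np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)
        return Xp.ravel()                                 # back to a vector

# usage: the caller only ever sees 1D arrays, so any solver that works
# on vectors can drive this operator regardless of the problem's shape
p = ReshapingProx(dims=(3, 4))
x = np.arange(12, dtype=float)
y = p.prox(x, tau=1.0)
print(y.shape)  # (12,)
```

This is what makes operators interoperable with generic solvers: the solver never needs to know the underlying dimensions.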
Sure, you can. Note that notebooks actually do play well with version control in VS Code, the application I'm using to code. I suggest you give it a try.
Perfect!
Great :) I have some experience with VS Code and notebooks, and I agree it is much better in terms of integration with git. However, for code packages we prefer to keep notebooks out and have them in separate repositories.

I finalized your PR with some minor cleaning of the code and, more importantly, I introduced a change such that the input and output of prox is always an array, like for every other operator (which is needed to work nicely with solvers when dealing with multiple dimensions). I wanted to see if we could reduce code by using the adjoint of Gradient instead of making it by hand (but for now it works, so I will leave it). Finally, you left a todo (
Thank you for the quick merge!
I copied this comment from PyUnLocBox, so it is not mine. I guess it may be an inequality condition where [...]
I added the TV operator and its proximal operator, inspired by the code in PyUnLocBox.

I propose to merge ProximalGradient and AcceleratedProximalGradient, because the code is the same; it only requires choosing between `None`, `fista`, and `vandenberghe`. If `None`, the non-accelerated version is used. (Note I intend to merge ADMM and LinearizedADMM the same way later.)

I also added an implementation of the GeneralizedProximalGradient solver, handling multiple differentiable functions gathered in `proxfs` and non-differentiable but convex functions with a prox gathered in `proxgs`.

I think I added a relevant test for the 1D case as well as the associated docstrings. I'm not adding a tutorial using these new features due to time constraints, but I can certainly guide anyone interested in doing so.

The TV operator still works up to 4D tensors, though.
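The merged-solver idea described above (one proximal gradient routine with an acceleration switch) can be sketched as follows. This is a hedged illustration, not the actual PyProximal code: the function name and signature are hypothetical, but the FISTA and Vandenberghe momentum rules are the standard ones.

```python
import numpy as np

def proximal_gradient(grad_f, prox_g, x0, tau, niter=100, acceleration=None):
    """Sketch of a merged (accelerated) proximal gradient solver.
    acceleration is None (plain), 'fista', or 'vandenberghe'."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for k in range(niter):
        xold = x.copy()
        x = prox_g(y - tau * grad_f(y), tau)   # forward-backward step
        if acceleration == 'fista':
            told = t
            t = (1.0 + np.sqrt(1.0 + 4.0 * told ** 2)) / 2.0
            omega = (told - 1.0) / t
        elif acceleration == 'vandenberghe':
            omega = k / (k + 3.0)
        else:                                   # non-accelerated version
            omega = 0.0
        y = x + omega * (x - xold)
    return x

# usage: minimize 0.5||x - b||^2 + ||x||_1, whose solution is soft(b, 1)
b = np.array([3.0, -0.5, 2.0])
grad_f = lambda y: y - b                                  # gradient of the smooth term
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
xhat = proximal_gradient(grad_f, soft, np.zeros(3), tau=1.0,
                         niter=50, acceleration='fista')
print(xhat)  # close to [2, 0, 1]
```

Merging the solvers this way keeps a single iteration loop, with the acceleration choice reduced to how the momentum weight `omega` is computed.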
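The `proxfs`/`proxgs` split of the GeneralizedProximalGradient can be illustrated with a generalized forward-backward iteration: gradients of the smooth terms are summed, and each non-smooth term gets its own auxiliary variable updated through its prox. This is a sketch under that interpretation, not the PR's actual code; the function name and signature are assumptions.

```python
import numpy as np

def generalized_proximal_gradient(gradfs, proxgs, x0, tau, niter=200, eta=1.0):
    """Sketch of a generalized forward-backward iteration with several
    smooth terms (gradfs) and several prox-capable terms (proxgs)."""
    n = len(proxgs)
    x = x0.copy()
    zs = [x0.copy() for _ in range(n)]
    for _ in range(niter):
        grad = sum(g(x) for g in gradfs)        # gradients are summed
        for i in range(n):
            # each auxiliary variable takes a relaxed prox step
            zs[i] = zs[i] + eta * (
                proxgs[i](2 * x - zs[i] - tau * grad, n * tau) - x)
        x = sum(zs) / n                          # average of the auxiliaries
    return x

# usage: with a single smooth and single non-smooth term this reduces
# to plain proximal gradient on 0.5||x - b||^2 + ||x||_1
b = np.array([3.0, -0.5, 2.0])
gradfs = [lambda x: x - b]
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
xhat = generalized_proximal_gradient(gradfs, [soft], np.zeros(3), tau=1.0, niter=50)
print(xhat)  # close to [2, 0, 1]
```

Note this sketch has no backtracking, consistent with the remark at the top of the conversation; `tau` must be chosen small enough for the combined smooth term.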