
Add exponential fitting for ZNE #5972

Merged
27 commits merged into PennyLaneAI:master from nts-exponential-extrap on Jul 23, 2024

Conversation

natestemen
Contributor

@natestemen natestemen commented Jul 9, 2024

Context: As part of increasing the utility of ZNE within Catalyst, we'd like to add the option to use exponential fitting with ZNE (see the related issue linked below).

Description of the Change: A function, written using JAX, is added that performs an exponential fit of the data and evaluates that fit at zero.

Benefits: Increased capabilities for users to play with. This is especially important given the benefits of tuning ZNE to the problem and backend at hand.

Possible Drawbacks: The current implementation transforms the data using the natural logarithm $\ln$ to linearize it so that a linear fit can be applied to learn the necessary parameters (see the section below on the exponential model). Once the transformed parameters are learned, they can be converted back into those of the exponential model quite simply. The major drawback of this approach is that the expectation values have to be shifted to be in the domain of $\ln$ (i.e. $> 0$), or thrown away if small enough (see the $\varepsilon$ (`eps`) parameter introduced in exponential_extrapolate).

Related GitHub Issues: PennyLaneAI/catalyst#754

Exponential model: Assuming an exponential model with asymptote $C$, $y(x) = A\mathrm{e}^{Bx} + C$, we must first turn this into a linear equation in order to apply a linear fit.

$$y(x) = A\mathrm{e}^{Bx} + C$$

$$y(x) - C = A\mathrm{e}^{Bx}$$

$$\ln(y(x) - C) = \ln(A) + Bx$$

$$\tilde{y}(x) = \tilde{A} + Bx,$$

where $\tilde{y}(x) := \ln(y(x) - C)$ and $\tilde{A} := \ln(A)$.

Once $\tilde{A}$ is learned, the hard part is over: since $A = \mathrm{e}^{\tilde{A}}$, the ZNE value is "simply" $y(0) = A + C$.
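
A minimal NumPy sketch of this linearized fit, assuming the asymptote $C$ is known and clamping small values rather than discarding them (illustrative only, not the implementation in `pennylane/transforms/mitigate.py`):

```py
import numpy as np

def exp_fit_at_zero(x, y, C, eps=1e-6):
    # Shift the data above zero so it stays inside the domain of ln.
    x = np.asarray(x, dtype=float)
    shifted = np.asarray(y, dtype=float) - C
    shifted = np.where(shifted < eps, eps, shifted)
    # Linear fit in log space: ln(y - C) = ln(A) + B*x.
    B, A_tilde = np.polyfit(x, np.log(shifted), 1)
    A = np.exp(A_tilde)  # undo the log transform
    return A + C         # y(0) = A*e^(B*0) + C
```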

@josh146
Member

josh146 commented Jul 11, 2024

Thanks for this PR @natestemen!

> ⚠️ draft PR note: I'm currently having trouble getting tests to pass on this change. I am aware and actively working on it. Putting this PR up for visibility and in case anyone has clues where I'm messing things up.

Feel free to tag us directly on the PR, or let us know if you'd like anyone to go through it.

@natestemen
Contributor Author

Guidance would be appreciated on resolving the CI error raised here.

@natestemen natestemen marked this pull request as ready for review July 12, 2024 06:18
@natestemen
Contributor Author

@josh146 ready for the first look!

@rmoyard rmoyard self-requested a review July 15, 2024 14:22
Contributor

@rmoyard rmoyard left a comment


Thanks @natestemen, it looks great! I suggest you use the math module from PennyLane so the function is compatible with all frameworks, and add some tests for the different ML frameworks. Also make sure to parametrize the unit test for the fit.
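
For instance (a hedged illustration of that suggestion, not the PR's actual code; `log_shifted` is a hypothetical helper), `qml.math` dispatches operations to whichever framework the inputs come from:

```py
import pennylane as qml

def log_shifted(y, asymptote, eps=1e-6):
    # qml.math dispatches to the framework of ``y`` (NumPy, autograd,
    # Torch, TensorFlow, JAX), so the transformation stays
    # differentiable in every supported interface.
    return qml.math.log(y - asymptote + eps)
```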

@natestemen natestemen requested a review from rmoyard July 17, 2024 00:55
Contributor

@rmoyard rmoyard left a comment


It looks good to me! The failure seems unrelated. We need two approvals, so I will tag someone from the core team.


codecov bot commented Jul 17, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.65%. Comparing base (f454f59) to head (155e090).
Report is 294 commits behind head on master.

Additional details and impacted files
```
@@            Coverage Diff             @@
##           master    #5972      +/-   ##
==========================================
- Coverage   99.66%   99.65%   -0.01%
==========================================
  Files         427      427
  Lines       41059    40776     -283
==========================================
- Hits        40920    40636     -284
- Misses        139      140       +1
```


@rmoyard rmoyard requested a review from a team July 18, 2024 13:45
@Alex-Preciado Alex-Preciado requested review from EmilianoG-byte and removed request for a team July 18, 2024 16:02
Contributor

@EmilianoG-byte EmilianoG-byte left a comment


Great addition 🤩! Just left some comments regarding typos/cosmetics.

@EmilianoG-byte
Contributor

EmilianoG-byte commented Jul 18, 2024

Is there a reason why we are not checking differentiability for other backends besides JAX? For instance, here:

```py
@pytest.mark.parametrize("interface", ["auto", "torch"])
def test_diffability_torch(self, interface):
```

why didn't we add

```py
@pytest.mark.parametrize("extrapolate", [richardson_extrapolate, exponential_extrapolate])
```

like we did with JAX and JAX-JIT?

@natestemen
Contributor Author

Thanks for the docstring suggestions! Addressed in e0815da.

> Is there a reason why we are not checking differentiability for other backends besides JAX?

The goal of this PR was to get an exponential fit function in that would be compatible with pennylaneai/catalyst. Originally I thought the exponential_extrapolate function was required to be written in JAX and hence only ensured it was tested as such. I wasn't aware of the testing requirements of the other backends[^1] until getting some review on this PR.

Happy to add your suggestion. Let me know.

Footnotes

[^1]: I'm still unclear if the other backends (torch, tf, autograd) are required in order for this functionality to work with Catalyst. Are they?

@EmilianoG-byte
Contributor

> I'm still unclear if the other backends (torch, tf, autograd) are required in order for this functionality to work with Catalyst. Are they?

Good question @natestemen 🤔

Strictly speaking, the other interfaces are not needed for this functionality to work with Catalyst. However, since this will be part of PennyLane's feature set (even without jitting our workflows), it should be compatible with the interfaces supported by PennyLane, which includes being able to differentiate it.

The good thing is that since you are using qml.math in exponential_extrapolate, adding this support should be rather straightforward by parametrizing the differentiability tests with

```py
@pytest.mark.parametrize("extrapolate", [richardson_extrapolate, exponential_extrapolate])
```

wherever only richardson_extrapolate was used previously. Let me know if there is any unforeseen difficulty 😃.
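
A minimal sketch of what such a parametrized test could look like (the test name and data here are hypothetical, not the actual tests in `tests/transforms/test_mitigate.py`):

```py
import numpy as np
import pytest

from pennylane.transforms import exponential_extrapolate, richardson_extrapolate

@pytest.mark.parametrize("extrapolate", [richardson_extrapolate, exponential_extrapolate])
def test_extrapolate_returns_finite_value(extrapolate):
    # Hypothetical smoke test: both fit functions share the same (x, y)
    # interface, so one parametrized test can cover both.
    scale_factors = [1.0, 2.0, 3.0]
    noisy_values = [0.9, 0.8, 0.72]
    assert np.isfinite(extrapolate(scale_factors, noisy_values))
```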

@rmoyard
Contributor

rmoyard commented Jul 19, 2024

@natestemen The requirement for it to work with Catalyst is to be JAX-jittable. I have asked for the other interfaces because in PL we always make things compatible with the different frameworks.
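
As an illustration of that requirement (a hedged sketch, assuming `exponential_extrapolate` traces cleanly under `jax.jit`, as this PR intends):

```py
import jax
import jax.numpy as jnp

from pennylane.transforms import exponential_extrapolate

# Catalyst requires the fit function to be JAX-jittable; tracing it
# under jax.jit is a quick way to check that property.
jitted_fit = jax.jit(exponential_extrapolate)
zne_value = jitted_fit(jnp.array([1.0, 2.0, 3.0]), jnp.array([0.9, 0.8, 0.72]))
print(zne_value)
```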

Contributor

@EmilianoG-byte EmilianoG-byte left a comment


Looks good to me! 😄

@dime10 dime10 enabled auto-merge (squash) July 23, 2024 20:09
@dime10 dime10 disabled auto-merge July 23, 2024 20:11
@dime10 dime10 merged commit 46fa7ff into PennyLaneAI:master Jul 23, 2024
38 checks passed
@natestemen natestemen deleted the nts-exponential-extrap branch July 23, 2024 20:20
rmoyard pushed a commit to PennyLaneAI/catalyst that referenced this pull request Jul 30, 2024
**Context:** The last PR in a series ([first](#806), [second](PennyLaneAI/pennylane#5972)) to add exponential extrapolation capabilities to Catalyst.

**Description of the Change:** This PR ensures that `pennylane.transforms.exponential_extrapolate` works inside Catalyst's `mitigate_with_zne` function.

**Benefits:** ZNE has increased functionality.

**Possible Drawbacks:** As part of the testing, I've removed 0.1 and 0.2 from the list of values being used to create different circuits. This is because exponential extrapolation is not necessarily as stable as polynomial fitting. For example, when extrapolating near-constant values, exponential extrapolation can struggle because a linear fit is first performed to determine the "direction" (or `sign`) of the exponential. If the slope is positive, the data is flipped to fit something that looks like $A\mathrm{e}^{-Bx} + C$ where $B\in\mathbb{R}_{> 0}$.

An example of this happening can be seen here:
```py
>>> exponential_extrapolate([1,2,3], [0.3894, 0.3894183, 0.38941])
-1.0000000000000059e-06
>>> richardson_extrapolate([1,2,3], [0.3894, 0.3894183, 0.38941])
0.3893551000000072
```

If we want to ensure we continue testing values 0.1 and 0.2 for polynomial extrapolation, we can create separate tests. Let me know what would be best.
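
A hedged sketch of that sign-detection step (the negation convention here is an assumption for illustration, not Catalyst's or PennyLane's exact code):

```py
import numpy as np

def flip_if_growing(x, y):
    # A linear pre-fit determines the "direction" of the data: if the
    # slope is positive, the values are negated so the subsequent fit
    # can always target a decaying model A*exp(-B*x) + C with B > 0.
    slope = np.polyfit(x, y, 1)[0]
    sign = -1.0 if slope > 0 else 1.0
    return sign * np.asarray(y, dtype=float), sign
```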

**Related GitHub Issues:**

fixes #754