
Easy distinction between PyBOP parameters and PyBaMM parameters #396

Open
MarkBlyth opened this issue Jul 8, 2024 · 5 comments
Labels
enhancement New feature or request

Comments

@MarkBlyth
Contributor

MarkBlyth commented Jul 8, 2024

Feature description

Okay, slightly strange one here, but bear with me... It would be useful to take the parameters proposed by PyBOP and post-process them into the parameter set that PyBaMM sees. So, PyBOP optimises over a set of parameters, a post-processor does something to them, and the results are fed into a PyBaMM model.

There are ways around this already, e.g. using a custom/standalone model, or subclassing a PyBOP model and overriding its simulate function to post-process the parameters before they go off to PyBaMM. Neither of these is particularly convenient though, so it would be good for PyBOP to have the structure to natively handle this sort of parameter processing.

Motivation

Three cases where I've found myself trying to do this, all in the context of fitting Thevenin models:

  1. Functional parameters: the idea would be for PyBOP to optimise over a set of 'dummy parameters', e.g. r0_at_soc_1, r0_at_soc_0d9, ..., and have a post-processor build these into a pybamm.Interpolant from the pairs (SOC=1, R0=r0_at_soc_1), (SOC=0.9, R0=r0_at_soc_0d9), etc. This is useful for fitting linear-interpolation-flavoured functional parameters, since PyBOP can't optimise over a pybamm.Interpolant, and the PyBaMM model doesn't natively take the series resistance R0 as a list of scalars. Instead, the post-processor can act as a layer between PyBOP and PyBaMM to close this gap.
  2. Least squares regression of ECMs: consider fitting a 1-RC model with a fixed time constant, using the GaussianLogLikelihood method. PyBOP can be used to identify the RC resistance $R$, but to build a complete PyBaMM model we also need to set the RC capacitance as $C=\tau/R$. This again needs some sort of intermediate step between the parameter PyBOP suggests (in this case, $R$) and the parameters PyBaMM sees (here, $R$ and $C$).
  3. Unsure how useful this will be in practice, but it would be interesting to see whether fitting $1/R$ is more numerically stable than fitting $R$ directly. For a constrained optimisation where the timescale satisfies $RC \leq \tau_{\max}$, the search space for the parameter $R$ can be particularly small (think of the narrow strip close to the $y$ axis of a $y=1/x$ curve). In that case, optimising over the conductance $1/R$ gives a much larger search space, and could therefore alleviate some of the difficulties the optimisers face. Again, this needs a post-processor / translation layer between the $1/R$ that PyBOP sees and the $R$ that gets fed into the PyBaMM model.
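For case 1, the post-processing step can be sketched in plain Python. The `r0_at_soc_*` naming follows the example above; the helper name is hypothetical, and a plain callable stands in for the `pybamm.Interpolant` that a real implementation would build:

```python
from bisect import bisect_left

def build_r0_interpolant(pybop_params):
    """Collect dummy parameters named r0_at_soc_* into sorted (SOC, R0)
    pairs and return a callable R0(soc), standing in for an Interpolant."""
    pairs = sorted(
        # 'r0_at_soc_0d9' encodes SOC = 0.9 ('d' replaces the decimal point)
        (float(name[len("r0_at_soc_"):].replace("d", ".")), value)
        for name, value in pybop_params.items()
        if name.startswith("r0_at_soc_")
    )
    socs = [soc for soc, _ in pairs]
    r0s = [r0 for _, r0 in pairs]

    def r0(soc):
        # clamp outside the fitted range, interpolate linearly inside
        if soc <= socs[0]:
            return r0s[0]
        if soc >= socs[-1]:
            return r0s[-1]
        i = bisect_left(socs, soc)
        frac = (soc - socs[i - 1]) / (socs[i] - socs[i - 1])
        return r0s[i - 1] + frac * (r0s[i] - r0s[i - 1])

    return r0

r0 = build_r0_interpolant({"r0_at_soc_1": 0.01, "r0_at_soc_0d9": 0.02})
r0(0.95)  # linear interpolation between the two fitted values
```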

Possible implementation

Something like an optional callback function in the optimiser class: it takes a dict of named PyBOP parameters and outputs a dict of named PyBaMM parameters.
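A rough sketch of such a callback for case 2 ($C = \tau/R$); the dict keys and the fixed time constant below are illustrative choices, not existing PyBOP API:

```python
TAU = 2.0  # fixed RC time constant [s], chosen purely for illustration

def postprocess(pybop_params):
    """Map the parameters PyBOP optimises over to the parameters
    PyBaMM sees. PyBOP proposes only R; C is derived as tau / R."""
    r = pybop_params["R1 [Ohm]"]
    return {
        "R1 [Ohm]": r,
        "C1 [F]": TAU / r,  # C = tau / R, from the fixed time constant
    }

pybamm_params = postprocess({"R1 [Ohm]": 0.5})
# pybamm_params["C1 [F]"] == 4.0
```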

Additional context

No response

@MarkBlyth MarkBlyth added the enhancement New feature or request label Jul 8, 2024
@martinjrobins
Contributor

martinjrobins commented Sep 19, 2024

I think what you are proposing is effectively re-parameterising some base model to create a new model. I would suggest the best way to do this is to allow users to easily create and use a custom pybamm model with pybop. Taking your second case, this might look something like:

```python
custom_model = pybamm.SPM()
custom_model.substitute(pybamm.Parameter("C"), pybamm.Parameter("tau") / pybamm.Parameter("R"))
pybop_model = pybop.create_model(custom_model)
# ... use pybop_model to do the fitting
```

(Note that pybamm.BaseModel.substitute and pybop.create_model don't exist yet, but could easily be created.)

@NicolaCourtier
Member

Thanks Martin, this seems like an intuitive solution to the problem! Two questions:

  1. How would the substitution method cope with cancellation? Take the example of transforming from (R, C) to (R, tau=RC). Do you think this implementation would substitute RC for tau (desired behaviour) or R*tau/R (more literal)?

  2. I think we should continue to support common PyBaMM parameter sets to promote data sharing. But it should be fairly straightforward for the user to convert between parameter sets, given that they are in control of the substitution. Would it be possible to generate parameter set conversion functions as well?

@martinjrobins
Contributor

> 1. How would the substitution method cope with cancellation? Take the example of transforming from (R, C) to (R, tau=RC). Do you think this implementation would substitute RC for tau (desired behaviour) or R*tau/R (more literal)?

There is a bunch of simplification code in pybamm; I'm not sure it would work (i.e. cancel the $R$'s) in this particular example. Something to look at and test, I think.

> 2. I think we should continue to support common PyBaMM parameter sets to promote data sharing. But it should be fairly straightforward for the user to convert between parameter sets, given that they are in control of the substitution. Would it be possible to generate parameter set conversion functions as well?

Yeah, that would be nice. You could do it, but it would require storing the arguments of substitute in the pybop model, so you would have to rearrange the code above to:

```python
custom_model = pybamm.SPM()
pybop_model = pybop.create_model(custom_model)
pybop_model.substitute(pybamm.Parameter("C"), pybamm.Parameter("tau") / pybamm.Parameter("R"))
# ... use pybop_model to do the fitting
```
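Recording the substitution arguments like this could then drive generated conversion functions. A pure-Python sketch of the idea, where `SubstitutableModel` and its methods are hypothetical stand-ins rather than existing PyBOP/PyBaMM code:

```python
class SubstitutableModel:
    """Hypothetical wrapper that records substitutions so that
    parameter-set conversion functions can be generated from them."""

    def __init__(self, base_model):
        self.base_model = base_model
        # each entry: (base-model parameter name, function of the new params)
        self.substitutions = []

    def substitute(self, pybamm_name, fn):
        self.substitutions.append((pybamm_name, fn))

    def to_pybamm_parameters(self, new_params):
        """Convert a dict of new-style parameters into the set the base
        PyBaMM model expects (intermediate params like tau are kept here
        for simplicity; a real implementation would drop unknown keys)."""
        pybamm_params = dict(new_params)
        for name, fn in self.substitutions:
            pybamm_params[name] = fn(new_params)
        return pybamm_params

model = SubstitutableModel(base_model=None)  # base model omitted in this sketch
model.substitute("C", lambda p: p["tau"] / p["R"])
pybamm_params = model.to_pybamm_parameters({"tau": 2.0, "R": 0.5})
# pybamm_params["C"] == 4.0
```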

@NicolaCourtier
Member

Sounds good! Thanks

@NicolaCourtier
Member

Hi,

I've written an example (building on #533) that replaces the parameter "C1 [F]" with "tau1 [s]" before tackling the same parameter estimation problem as described in @MarkBlyth's example ecm_tau_constraints.py.

I don't think the timescale is identifiable from this constant-discharge data, however, so to compare the numerical performance we would need to use more dynamic data (e.g. the pulse in this notebook).

@MarkBlyth, I believe this example resolves case 2 above, and that the method can be extended to the other cases as well. What do you think?
