Add multi-start functionality to optimisers #438

Open
BradyPlanden opened this issue Aug 5, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

@BradyPlanden
Member

Feature description

Add a multi-start method to the optimiser classes, with a random initial value drawn for each start. A few open questions remain:

  • Should multi-start be applied to all optimisers, or only to local optimisation algorithms?
  • Can multi-start be multiprocessed to run the optimisations in a parallel pool? (See the sketch after this list.)
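
On the second question, a parallel pool looks feasible because each start is fully independent. Below is a minimal sketch, assuming a picklable cost function and using scipy.optimize.minimize as a stand-in local optimiser (the pybop optimiser internals are not shown here):

import multiprocessing as mp

import numpy as np
from scipy.optimize import minimize


def cost(x):
    # Toy multimodal cost with several local minima.
    return float(np.sum(x**2) + np.sum(np.sin(5 * x)))


def run_single_start(x0):
    # Each start is an independent local optimisation from x0.
    return minimize(cost, x0, method="Nelder-Mead")


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    starts = rng.uniform(-1.0, 1.0, size=(10, 2))  # 10 starts, 2 parameters
    with mp.Pool() as pool:
        results = pool.map(run_single_start, list(starts))
    best = min(results, key=lambda r: r.fun)
    print(best.x, best.fun)

The same pattern should carry over whether the per-start worker wraps a pybop optimiser or any other local method, provided the objects handed to the pool can be pickled.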

Motivation

Multi-starting local optimisers provides a defence against converging to a local optimum while missing a nearby global optimum.

Possible implementation

A potential interface could look like:

optim = pybop.AdamW(cost, multistarts=10, starting_distribution=pybop.Uniform(0, 1.0))
results = optim.run()

where results contains a sub-dictionary of results for each multi-start optimisation, and starting_distribution is the distribution from which initial values are drawn. If starting_distribution is not provided, the parameter.prior distribution is used.
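
A rough sketch of how such a wrapper could draw initial values and nest the results per start; the names here (MultiStartOptimiser, base.run(initial_values=...), parameter.prior.rvs()) are hypothetical illustrations, not the pybop API:

import numpy as np


class MultiStartOptimiser:
    """Illustrative wrapper only; not the pybop implementation."""

    def __init__(self, base_optimiser, multistarts=10, starting_distribution=None):
        self.base = base_optimiser
        self.multistarts = multistarts
        self.starting_distribution = starting_distribution

    def _draw_initial_values(self):
        if self.starting_distribution is not None:
            # Draw the full initial vector from the supplied distribution.
            return np.atleast_1d(self.starting_distribution.rvs())
        # Otherwise fall back to each parameter's prior (assumed to expose rvs()).
        return np.array([p.prior.rvs() for p in self.base.parameters])

    def run(self):
        results = {}
        for i in range(self.multistarts):
            x0 = self._draw_initial_values()
            # One independent optimisation per start, keyed by start index.
            results[f"start_{i}"] = self.base.run(initial_values=x0)
        return results

results would then be a dictionary of per-start result objects, from which the best run can be selected (e.g. by final cost), and the loop body is the natural place to hand work to a parallel pool as sketched above.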

Additional context

No response
