Quick Start Guide
You are looking for a parameter setting that minimizes some performance metric of your algorithm (such as runtime, error, or cost). To use HyperMapper for this purpose, you need to tell it about your parameters and how to evaluate your algorithm's performance. Here, we show how to do this on a running example using a simple algorithm called the Branin function: we look for the values of the two parameters x1 and x2 that minimize the value of this function. Note that HyperMapper always minimizes the objective function; to maximize instead, users can optimize the negation of the function (-1*f(x)).
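For instance, if your metric is something you want to maximize, a minimal sketch of this negation trick looks as follows (the metric and wrapper names here are our own illustration, not part of HyperMapper):

```python
def accuracy(X):
    # Hypothetical metric we would like to MAXIMIZE; peaks at x = 3.
    return 1.0 - (X['x'] - 3.0) ** 2

def negated_accuracy(X):
    # HyperMapper always minimizes, so we hand it -1 * f(x):
    # minimizing the negation is the same as maximizing f.
    return -1.0 * accuracy(X)
```

You would then pass `negated_accuracy` to HyperMapper as the black-box function.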
Tested on Linux and macOS. HyperMapper is written entirely in Python, so it should run on other systems too.
pip install hypermapper
For a more detailed install look here.
Consider the Branin black-box function evaluation (which depends on the input variables x1 and x2), this is the objective we want to minimize:
```python
import math

def branin_function(X):
    x1 = X['x1']
    x2 = X['x2']
    a = 1.0
    b = 5.1 / (4.0 * math.pi * math.pi)
    c = 5.0 / math.pi
    r = 6.0
    s = 10.0
    t = 1.0 / (8.0 * math.pi)
    y_value = a * (x2 - b * x1 * x1 + c * x1 - r) ** 2 + s * (1 - t) * math.cos(x1) + s
    return y_value
```
Note that HyperMapper will pass a dictionary with the inputs to the black-box function. The keys of the dictionary will be the names of the variables as defined by the user (see next section).
An example of this code can be found in branin.py.
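As a quick sanity check, you can call the function directly with such a dictionary; one of Branin's known global minima is at (x1, x2) ≈ (π, 2.275), where the value is about 0.398 (the function is repeated here so the snippet runs standalone):

```python
import math

def branin_function(X):
    # Same Branin definition as above.
    x1 = X['x1']
    x2 = X['x2']
    a = 1.0
    b = 5.1 / (4.0 * math.pi * math.pi)
    c = 5.0 / math.pi
    r = 6.0
    s = 10.0
    t = 1.0 / (8.0 * math.pi)
    return a * (x2 - b * x1 * x1 + c * x1 - r) ** 2 + s * (1 - t) * math.cos(x1) + s

# HyperMapper passes inputs as a dictionary keyed by parameter name.
y = branin_function({'x1': math.pi, 'x2': 2.275})
print(round(y, 3))  # close to the global minimum of about 0.398
```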
The inputs to HyperMapper specify an instance of the software configuration problem. In this quick start guide, we show how to optimize the Branin function value. The following is what needs to be specified, in JSON syntax, to run Branin:
```json
{
    "application_name": "branin",
    "optimization_objectives": ["Value"],
    "optimization_iterations": 20,
    "input_parameters" : {
        "x1": {
            "parameter_type" : "real",
            "values" : [-5, 10],
            "parameter_default" : 0
        },
        "x2": {
            "parameter_type" : "real",
            "values" : [0, 15],
            "parameter_default" : 0
        }
    }
}
```
This is the meaning of the json fields:
- "application_name" is the name of the application at hand, "branin" in our case; any user-defined string can be used instead.
- "optimization_objectives" informs HyperMapper about the number and names of the quality metrics (objectives) we aim to minimize, in our case the value of Branin. Each name has to match exactly the name of the corresponding output returned by the black-box function providing the evaluation.
- "optimization_iterations" specifies how many optimization iterations HyperMapper will perform.
- "input_parameters" specifies the parameters that define our search space. In Branin, we have two parameters: x1 and x2. The parameter type here is "real", i.e., a parameter that can assume any value between the two bounds specified. HyperMapper will use the names of the input parameters defined here as keys for the dictionary passed to the black-box function.
For more information on how to define a search space look here.
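As a sketch of what other parameter types look like, an integer parameter could be declared as follows (the parameter name x3 is made up here; see the search-space documentation linked above for the authoritative list of types and their exact syntax):

```json
"x3": {
    "parameter_type" : "integer",
    "values" : [0, 10],
    "parameter_default" : 5
}
```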
It is convenient to save this instance in a file and then use it as an input to HyperMapper; you can find this json in branin_scenario.json.
Several convenient options are available in the HyperMapper configuration json. For more information look here.
You are all set to run Branin and HyperMapper together!
To optimize the branin function, call HyperMapper's optimize
method with the json file and the branin function as parameters:
```python
from hypermapper import optimizer
parameters_file = "example_scenarios/quick_start/branin_scenario.json"
optimizer.optimize(parameters_file, branin_function)
```
An example of stdout output can be found in output_branin.txt. You can also have a look at the log file hypermapper_logfile.log generated by HyperMapper in $HYPERMAPPER_HOME; however, this is more technical.
The result of this script is a csv file called branin_output_dse_samples.csv. You can find all the samples explored by HyperMapper during the design space exploration (DSE) in this file.
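To post-process the results programmatically, you can parse that csv file with Python's standard csv module. A small sketch follows; the column names x1, x2 and Value match the scenario above, but the inline data is made up for illustration:

```python
import csv
import io

def best_sample(csv_text, objective='Value'):
    # Return the explored sample with the smallest objective value.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return min(rows, key=lambda row: float(row[objective]))

# In practice you would read the contents of branin_output_dse_samples.csv
# from disk; here we use a tiny made-up sample for illustration.
example = """x1,x2,Value
0.0,0.0,55.60
3.14,2.27,0.40
-3.0,10.0,12.30
"""
print(best_sample(example))
```

With the real file, you would replace `example` with the contents of branin_output_dse_samples.csv, e.g. via `open("branin_output_dse_samples.csv").read()`.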
See the other synthetic examples provided for more advanced usages of HyperMapper, including different parameter types, multi-objective optimization, constrained search spaces, and different HyperMapper modes.