Parallel Function Evaluations
HyperMapper supports parallel function evaluations, in case users want to perform multiple evaluations of an expensive black-box function in parallel. This is controlled by the evaluations_per_optimization_iteration field in the json:

```json
{
    "application_name": "batch_branin",
    "optimization_objectives": ["Value"],
    "optimization_iterations": 10,
    "evaluations_per_optimization_iteration": 2,
    "input_parameters": {
        "x1": {
            "parameter_type": "real",
            "values": [-5, 10]
        },
        "x2": {
            "parameter_type": "real",
            "values": [0, 15]
        }
    }
}
```
This field dictates how many evaluations HyperMapper requests at each optimization iteration. Users can set this field so that HyperMapper requests multiple configurations at once, and then implement a method that evaluates the requested configurations in parallel. Both the Default and Client-Server HyperMapper modes support parallel function evaluations, and each mode has its own interface for them.
To run batch optimization in default mode, users must provide a Python method to HyperMapper with the following interface:
- Input. A single dictionary. The keys of the dictionary will be the names of the input parameters defined in the json. Each key will hold a list of values containing the value of the parameter for each configuration.
- Output. A dictionary with the names of the optimization objectives as keys. Each key holds a list of values containing the value of the objective for each configuration. If the application has feasibility constraints, the output dictionary should also contain a key and list of values for the feasibility indicator. If the application contains a single objective and no feasibility constraints, then the output can be simply a list with the function value for each configuration.
See this page for an example using the Branin function.
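As a sketch of this interface, the function below evaluates a batch of Branin configurations, taking a dictionary of parameter lists and returning a dictionary of objective lists. This is an illustration written against the interface described above, not HyperMapper's own example code:

```python
import math

def batch_branin(X):
    """Evaluate the Branin function on a batch of configurations.

    X maps each input parameter name (here "x1" and "x2", as defined
    in the json) to a list of values, one entry per requested
    configuration. Returns a dict mapping each objective name to the
    list of corresponding objective values.
    """
    a = 1.0
    b = 5.1 / (4 * math.pi ** 2)
    c = 5 / math.pi
    r = 6.0
    s = 10.0
    t = 1 / (8 * math.pi)
    values = []
    # In a real batch setting, this loop is where users would dispatch
    # each evaluation to a process pool or a cluster.
    for x1, x2 in zip(X["x1"], X["x2"]):
        y = a * (x2 - b * x1 ** 2 + c * x1 - r) ** 2 \
            + s * (1 - t) * math.cos(x1) + s
        values.append(y)
    return {"Value": values}
```

Since this application has a single objective and no feasibility constraints, returning the bare list `values` instead of the dictionary would also be accepted.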
The Client-Server mode protocol remains the same. At each iteration, HyperMapper sends a message stating how many evaluations it is requesting (equal to evaluations_per_optimization_iteration), followed by one configuration per line. For example:
```
Request 3
x1,x2
-10,12
1,6
-8,20
```
Note that this is identical to how HyperMapper requests evaluations during the initialization (Design of Experiment) phase. Likewise, the client replies with the configurations and value of the function for each configuration:
```
x1,x2,Value
-10,12,267
1,6,28
-8,20,463
```
See this page for details on the client-server mode and implementation examples. All of the client examples support batch optimization.
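To make the exchange concrete, here is a minimal sketch of the client-side handling of one batch request, assuming the message format shown above. `handle_request` is a hypothetical helper name, not part of HyperMapper; the real clients read and write these lines over the HyperMapper process's stdin/stdout:

```python
def handle_request(request_lines, evaluate, objective="Value"):
    """Parse one batch request and build the CSV reply.

    request_lines: the lines HyperMapper sent -- a "Request N" line,
    a CSV header naming the input parameters, then N configuration
    lines. evaluate: maps a {parameter: value} dict for a single
    configuration to its objective value. Returns the reply text.
    """
    # First line, e.g. "Request 3" -> number of configurations.
    n = int(request_lines[0].split()[1])
    params = request_lines[1].split(",")          # e.g. ["x1", "x2"]
    reply = [",".join(params + [objective])]      # reply header row
    for row in request_lines[2:2 + n]:
        config = dict(zip(params, (float(v) for v in row.split(","))))
        reply.append(row + "," + str(evaluate(config)))
    return "\n".join(reply)
```

With evaluations_per_optimization_iteration set to 2 or more, the loop over the N configuration rows is the natural place to evaluate the batch in parallel.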