Tuning of graph parameters

All tuners work with the internal graph representation (also called the optimization graph, see Adaptation of Graphs). To optimise a graph of a custom domain class, pass an adapter. If your graph class inherits from OptGraph, no adapter is needed. Tuners optimise the parameters stored in OptNode.parameters.

Multi-objective optimisation is supported only by OptunaTuner.

To specify the parameter search space, use the SearchSpace class. Initialize SearchSpace with a dictionary of the form {'operation_name': {'param_name': {'hyperopt-dist': <hyperopt distribution function>, 'sampling-scope': [sampling scope], 'type': <type of parameter>}, ...}, ...}. Three types of parameters are available: continuous, discrete and categorical.

from hyperopt import hp
from golem.core.tuning.search_space import SearchSpace


params_per_operation = {
    'operation_name_1': {
        'parameter_name_1': {
            'hyperopt-dist': hp.uniformint,
            'sampling-scope': [2, 21],
            'type': 'discrete'},
        'parameter_name_2': {
            'hyperopt-dist': hp.loguniform,
            'sampling-scope': [1e-3, 1],
            'type': 'continuous'}
    },
    'operation_name_2': {
        'parameter_name_1': {
            'hyperopt-dist': hp.choice,
            'sampling-scope': [["first", "second", "third"]],
            'type': 'categorical'},
        'parameter_name_2': {
            'hyperopt-dist': hp.uniform,
            'sampling-scope': [0.05, 1.0],
            'type': 'continuous'}
    }}

search_space = SearchSpace(params_per_operation)
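Note the categorical parameter above: hp.choice receives all its options as a single scope entry, so the sampling scope is a list wrapping the list of options, whereas hp.uniformint and hp.uniform take two scope entries (the lower and upper bounds).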

Simultaneous

You can tune all parameters of graph nodes simultaneously using SimultaneousTuner, OptunaTuner or IOptTuner.

Note

IOptTuner implements a deterministic algorithm.

IOptTuner is implemented using the IOpt library. See its documentation (in Russian) to learn more about the optimisation algorithm.
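
Below is a minimal usage sketch, not a definitive recipe. It assumes that objective_evaluate (the objective to minimise) and graph (an OptGraph instance, so no adapter is needed) are defined elsewhere, that search_space is the SearchSpace built above, and that tuners are run through the public tune method.

from datetime import timedelta

from golem.core.tuning.simultaneous import SimultaneousTuner

# Assumed to exist: `objective_evaluate` wrapping the metric to minimize and
# `graph`, an OptGraph (pass `adapter=...` as well for a custom graph class).
tuner = SimultaneousTuner(objective_evaluate=objective_evaluate,
                          search_space=search_space,
                          iterations=50,
                          timeout=timedelta(minutes=5))

# Returns the graph with the best hyperparameters found.
tuned_graph = tuner.tune(graph)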

class golem.core.tuning.simultaneous.SimultaneousTuner(objective_evaluate: golem.core.optimisers.objective.objective.GraphFunction[golem.core.optimisers.objective.objective.G, golem.core.optimisers.fitness.fitness.Fitness], search_space: golem.core.tuning.search_space.SearchSpace, adapter: Optional[golem.core.adapter.adapter.BaseOptimizationAdapter] = None, iterations: int = 100, early_stopping_rounds: Optional[int] = None, timeout: datetime.timedelta = datetime.timedelta(seconds=300), n_jobs: int = -1, deviation: float = 0.05, algo: Callable = <function suggest>, **kwargs)[source]

Bases: Generic[golem.core.tuning.tuner_interface.DomainGraphForTune]

Class for tuning the hyperparameters of all graph nodes simultaneously

_tune(graph: golem.core.tuning.tuner_interface.DomainGraphForTune, show_progress: bool = True) golem.core.tuning.tuner_interface.DomainGraphForTune[source]

Method for tuning hyperparameters of the entire graph

Parameters
  • graph – graph whose hyperparameters will be tuned

  • show_progress – shows progress of tuning if True

Returns

Graph with tuned hyperparameters

_search_near_initial_parameters(graph: golem.core.dag.graph_delegate.GraphDelegate, search_space: dict, initial_parameters: dict, trials: hyperopt.base.Trials, remaining_time: float, show_progress: bool = True) Tuple[hyperopt.base.Trials, int][source]

Method to search using a search space in which the parameters initially set for the graph are fixed. This prevents losing the results obtained during the composition process.

Parameters
  • graph – graph to be tuned

  • search_space – dict with parameters to be optimized and their search spaces

  • initial_parameters – dict with initial parameters of the graph

  • trials – Trials object to store all the search iterations

  • show_progress – shows progress of tuning if True

Returns

  • trials – Trials object storing all the search trials

  • init_trials_num – number of iterations made using the search space with fixed initial parameters

Return type

Tuple[Trials, int]

_get_parameters_for_tune(graph: golem.core.dag.graph_delegate.GraphDelegate) Tuple[dict, dict][source]

Method for defining the search space

Parameters

graph – graph to be tuned

Returns

  • parameters_dict – dict with operation names and parameters

  • initial_parameters – dict with initial parameters of the graph

Return type

Tuple[dict, dict]

_objective(parameters_dict: dict, graph: golem.core.dag.graph_delegate.GraphDelegate, unchangeable_parameters: Optional[dict] = None) float[source]

Objective function for the minimization problem

Parameters
  • parameters_dict – dict which contains new graph hyperparameters

  • graph – graph to optimize

  • unchangeable_parameters – dict with parameters that should not be changed

Returns

value of objective function

Return type

metric_value

class golem.core.tuning.iopt_tuner.IOptTuner(objective_evaluate: golem.core.optimisers.objective.objective_eval.ObjectiveEvaluate, search_space: golem.core.tuning.search_space.SearchSpace, adapter: Optional[golem.core.adapter.adapter.BaseOptimizationAdapter] = None, iterations: int = 100, timeout: datetime.timedelta = datetime.timedelta(seconds=300), n_jobs: int = -1, eps: float = 0.001, r: float = 2.0, evolvent_density: int = 10, eps_r: float = 0.001, refine_solution: bool = False, deviation: float = 0.05, **kwargs)[source]

Bases: Generic[golem.core.tuning.tuner_interface.DomainGraphForTune]

Class for hyperparameters optimization based on the IOpt library

Parameters
  • objective_evaluate – objective to optimize

  • adapter – adapter for processing the external graph object to be optimized

  • iterations – max number of iterations

  • search_space – SearchSpace instance

  • n_jobs – number of jobs for parallelization (-1 to use all CPUs)

  • eps – accuracy of the solution. Lower values give higher search accuracy and make premature stopping less likely.

  • r – reliability parameter. Higher values give slower convergence but a higher probability of finding the global minimum.

  • evolvent_density – density of the evolvent. The default of 10 gives a maximum search accuracy of \(2^{-10}\) on the hypercube \([0,1]^N\).

  • eps_r – parameter that affects the solving speed. With eps_r = 0 the method converges slowly to the exact solution; with eps_r > 0 it converges quickly to a neighbourhood of the solution.

  • refine_solution – if True, the solution will be refined with local search.

  • deviation – required improvement (in percent) of the metric for the tuned graph to be returned. By default deviation=0.05, meaning the tuned graph is returned only if its metric is at least 0.05% better than the initial one.
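
A hedged sketch showing how these parameters might be set, with the same assumed objective_evaluate, search_space and graph as in the example above:

from golem.core.tuning.iopt_tuner import IOptTuner

tuner = IOptTuner(objective_evaluate=objective_evaluate,
                  search_space=search_space,
                  iterations=100,
                  eps=0.001,             # solution accuracy: lower is more precise
                  r=2.0,                 # reliability: higher is slower but more global
                  refine_solution=True)  # polish the result with a local search
tuned_graph = tuner.tune(graph)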

_get_parameters_for_tune(graph: golem.core.dag.graph_delegate.GraphDelegate) Tuple[golem.core.tuning.iopt_tuner.IOptProblemParameters, dict][source]

Method for defining the search space

Parameters

graph – graph to be tuned

Returns

  • parameters – IOptProblemParameters object defining the search space

  • initial_parameters – dict with initial parameters of the graph

Return type

Tuple[IOptProblemParameters, dict]

class golem.core.tuning.optuna_tuner.OptunaTuner(objective_evaluate: golem.core.optimisers.objective.objective.GraphFunction[golem.core.optimisers.objective.objective.G, golem.core.optimisers.fitness.fitness.Fitness], search_space: golem.core.tuning.search_space.SearchSpace, adapter: Optional[golem.core.adapter.adapter.BaseOptimizationAdapter] = None, iterations: int = 100, early_stopping_rounds: Optional[int] = None, timeout: datetime.timedelta = datetime.timedelta(seconds=300), n_jobs: int = -1, deviation: float = 0.05, **kwargs)[source]

Bases: Generic[golem.core.tuning.tuner_interface.DomainGraphForTune]

objective(trial: optuna.trial._trial.Trial, graph: golem.core.dag.graph_delegate.GraphDelegate) Union[float, Sequence[float]][source]

early_stopping_callback(study: optuna.study.study.Study, trial: optuna.trial._frozen.FrozenTrial)[source]
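
OptunaTuner is the only tuner supporting multi-objective optimisation: as the objective signature above shows, a trial may return either a single float or a sequence of floats. A sketch under the same assumptions as the previous examples; for the multi-objective case, objective_evaluate is assumed to yield several metric values per evaluation:

from golem.core.tuning.optuna_tuner import OptunaTuner

# For multi-objective tuning, `objective_evaluate` is assumed to produce a
# Fitness with several values, so each trial yields a sequence of floats.
tuner = OptunaTuner(objective_evaluate=objective_evaluate,
                    search_space=search_space,
                    iterations=100)
tuned_graph = tuner.tune(graph)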

Sequential

SequentialTuner allows you to tune graph parameters sequentially node by node.
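A sketch under the same assumptions as the examples above; inverse_node_order (see the constructor below) reverses the order in which nodes are visited, and tune_node tunes a single node:

from golem.core.tuning.sequential import SequentialTuner

tuner = SequentialTuner(objective_evaluate=objective_evaluate,
                        search_space=search_space,
                        inverse_node_order=True)  # reverse the node traversal order
tuned_graph = tuner.tune(graph)

# Alternatively, tune only the node with the given index:
tuned_graph = tuner.tune_node(graph, node_index=0)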

class golem.core.tuning.sequential.SequentialTuner(objective_evaluate: golem.core.optimisers.objective.objective.GraphFunction[golem.core.optimisers.objective.objective.G, golem.core.optimisers.fitness.fitness.Fitness], search_space: golem.core.tuning.search_space.SearchSpace, adapter: Optional[golem.core.adapter.adapter.BaseOptimizationAdapter] = None, iterations: int = 100, early_stopping_rounds: Optional[int] = None, timeout: datetime.timedelta = datetime.timedelta(seconds=300), n_jobs: int = -1, deviation: float = 0.05, algo: Callable = <function suggest>, inverse_node_order: bool = False, **kwargs)[source]

Bases: Generic[golem.core.tuning.tuner_interface.DomainGraphForTune]

Class for tuning the hyperparameters of graph nodes sequentially

_tune(graph: golem.core.tuning.tuner_interface.DomainGraphForTune, **kwargs) golem.core.tuning.tuner_interface.DomainGraphForTune[source]

Method for tuning hyperparameters of the entire graph, node by node

Parameters

graph – graph whose hyperparameters will be tuned

get_nodes_order(nodes_number: int) range[source]

Returns the indices of the graph nodes in the order in which they will be tuned

Parameters

nodes_number – number of nodes to get

tune_node(graph: golem.core.tuning.tuner_interface.DomainGraphForTune, node_index: int) golem.core.tuning.tuner_interface.DomainGraphForTune[source]

Method for hyperparameters tuning for particular node

Parameters
  • graph – graph which contains a node to be tuned

  • node_index – Index of the node to tune

Returns

Graph with tuned parameters in node with specified index

_optimize_node(graph: golem.core.dag.graph_delegate.GraphDelegate, node_id: int, node_params: dict, iterations_per_node: int, seconds_per_node: float) golem.core.dag.graph_delegate.GraphDelegate[source]

Method for node optimization

Parameters
  • graph – graph whose node is being optimized

  • node_id – id of the current node in the graph

  • node_params – dictionary with parameters for node

  • iterations_per_node – number of tuning iterations allotted to the node

  • seconds_per_node – time limit in seconds allotted to the node

Returns

updated graph with tuned parameters in particular node

_objective(node_params: dict, graph: golem.core.dag.graph_delegate.GraphDelegate, node_id: int) float[source]

Objective function for the minimization problem

Parameters
  • node_params – dictionary with parameters for node

  • graph – graph to evaluate

  • node_id – id of the node to which parameters should be assigned

Returns

value of objective function