Search near init params and some mini fixes #985
Conversation
examples/simple/pipeline_tune.py
pipeline_copy = deepcopy(pipeline)
tuned_pipeline = tuner.tune(pipeline_copy)
Why does this matter here? The pipeline could just as well be modified in-place.
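For context, a minimal sketch of what the copy buys, assuming tuner.tune mutates the pipeline it receives (that mutation is an assumption here, not confirmed by the excerpt):

from copy import deepcopy

# Copying first keeps the untuned pipeline available for a before/after
# comparison; if tune() worked purely in-place, the original would be lost.
pipeline_copy = deepcopy(pipeline)
tuned_pipeline = tuner.tune(pipeline_copy)
# `pipeline` still holds the original (untuned) structure and parameters.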
init_trials_num = min(int(self.iterations * 0.1), 10) \
    if (self.iterations >= 10 and not is_init_parameters_full) else 1
For better readability, this would be best split across several lines.
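One way the suggested split could look, as a sketch with the same semantics as the conditional expression above:

# Run several warm-up trials near the initial parameters only when there is
# enough iteration budget and the initial parameters do not already fill
# the search space; otherwise a single trial suffices.
if self.iterations >= 10 and not is_init_parameters_full:
    init_trials_num = min(int(self.iterations * 0.1), 10)
else:
    init_trials_num = 1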
fmin(partial(self._objective, pipeline=pipeline),
     initial_parameters,
     trials=trials,
     algo=self.algo,
     max_evals=init_trials_num,
     show_progressbar=show_progress,
     early_stop_fn=self.early_stop_fn,
     timeout=self.max_seconds)
Is it intentional that the return value of fmin is discarded? If so, it's worth explaining.
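For reference, hyperopt's fmin does return the best point found, but it also records every evaluation in the Trials object passed to it, so discarding the return value can be intentional when only the accumulated trials are needed. A minimal self-contained illustration with a toy objective (assumes hyperopt is installed):

from hyperopt import fmin, tpe, hp, Trials

trials = Trials()
best = fmin(fn=lambda x: (x - 0.5) ** 2,   # toy 1-D objective
            space=hp.uniform('x', 0, 1),
            algo=tpe.suggest,
            max_evals=10,
            trials=trials,
            show_progressbar=False)
print(best)                # best point found, e.g. {'x': 0.49...}
print(len(trials.trials))  # all 10 evaluations are recorded in `trials`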
try_initial_parameters = init_parameters and self.iterations > 1

if try_initial_parameters:
    trials, init_trials_num = self._search_near_initial_parameters(pipeline, init_parameters,
                                                                   is_init_params_full, trials,
This needs a comment explaining what is going on; otherwise it will be hard to remember later why this is here.
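A sketch of the kind of comment the reviewer asks for, grounded in the PR description at the bottom (search first with the initial parameters fixed, then over the whole space):

# Warm-start the search: when initial parameters are given and there is more
# than one iteration, run the first few trials in a space where the tunable
# parameters are pinned to their initial values, so the optimizer records
# the initial point before exploring the full search space.
try_initial_parameters = init_parameters and self.iterations > 1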
for key in parameters_dict:
    if key in initial_parameters:
        value = initial_parameters[key]
        init_params_space[key] = hp.pchoice(key, [(1, value)])
The use of hp.pchoice here is not obvious.
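What hp.pchoice does here, for reference: it draws an option with the given probability, so [(1, value)] puts all probability mass on the initial value and effectively pins that parameter during the warm-up stage. A small illustration (assumes hyperopt; sample comes from hyperopt.pyll.stochastic):

from hyperopt import hp
from hyperopt.pyll.stochastic import sample

space = {'n_estimators': hp.pchoice('n_estimators', [(1.0, 100)])}
print(sample(space))  # always {'n_estimators': 100}: the value is pinned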
tunable_initial_params = {f'{node_id} || {operation_name} | {p}':
                          node.parameters[p] for p in node.parameters if p in tunable_node_params}
Maybe extract the generation of this delimiter-separated string into a function with a descriptive name?
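One possible extraction, as a sketch (the helper's name is hypothetical, not taken from the source):

def get_node_parameter_label(node_id: str, operation_name: str, parameter_name: str) -> str:
    """Build the 'node_id || operation_name | parameter' key used in the search space."""
    return f'{node_id} || {operation_name} | {parameter_name}'

tunable_initial_params = {get_node_parameter_label(node_id, operation_name, p):
                          node.parameters[p] for p in node.parameters if p in tunable_node_params}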
Overall it seems to work; after these fixes it can be merged.
For clarity, you could also add the hyperparameters to the log line
2022-11-22 15:57:16,587 - FEDOT logger - Final pipeline: {'depth': 2, 'length': 4, 'nodes': [rf, scaling, normalization, pca]}
and, at the start of the tuner's run, log the initial approximation (with its hyperparameters as well).
Force-pushed from 9b620d7 to 474b642
Now PipelineTuner takes the initial parameters of a pipeline into account. To achieve that, the search is first conducted in a search space with the initial parameters fixed; after that, the search continues over the whole initial search space.
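A self-contained sketch of that two-stage scheme using hyperopt alone (the names and toy objective are illustrative, not FEDOT's actual API; keeping the same option list in both stages keeps the option indices recorded in Trials consistent):

from hyperopt import fmin, tpe, hp, Trials

def objective(params):
    # Toy objective: the best max_depth is 7.
    return (params['max_depth'] - 7) ** 2

candidates = list(range(1, 15))
initial_value = 5

# Stage 1: all probability mass on the initial value, so the first trials
# evaluate the pipeline as initially configured.
pinned = [(1.0 if v == initial_value else 0.0, v) for v in candidates]
trials = Trials()
fmin(objective, {'max_depth': hp.pchoice('max_depth', pinned)},
     algo=tpe.suggest, max_evals=2, trials=trials, show_progressbar=False)

# Stage 2: uniform probabilities over the same candidates; reusing the same
# Trials object carries the warm-up evaluations into the full search.
# max_evals is cumulative, so this runs 18 further evaluations.
uniform = [(1.0 / len(candidates), v) for v in candidates]
best = fmin(objective, {'max_depth': hp.pchoice('max_depth', uniform)},
            algo=tpe.suggest, max_evals=20, trials=trials, show_progressbar=False)
print(best)  # index into `candidates` of the best option found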