Remove event based logging pattern #557
base: master
Conversation
Codecov Report
Attention: Patch coverage is …

Additional details and impacted files

@@            Coverage Diff             @@
##           master     #557      +/-   ##
==========================================
+ Coverage   96.43%   98.01%   +1.57%
==========================================
  Files          12       10       -2
  Lines        1235     1158      -77
==========================================
- Hits         1191     1135      -56
+ Misses         44       23      -21
I think it's generally great that we're moving on from the observer pattern. However, I think it would be better not to pass the whole BayesianOptimization object into these steps, but rather the specific relevant information. E.g., in log_optimization_step, we should pass the params of the point, the target value, the constraint value (if applicable), and a boolean for is_new_max. Similarly, the log_optimization_start function should accept parameter names, etc.
The only argument I can see against this is if we want to allow people to write their own loggers, but I don't really see a use case for that.
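For illustration, a minimal sketch of what such a signature could look like; the parameter names and types here are assumptions, not the PR's actual code:

```python
# Illustrative only: one possible shape for the reviewer's suggestion.
# All names and types are assumptions, not the PR's implementation.
from collections.abc import Mapping


class Logger:
    def log_optimization_step(
        self,
        params: Mapping[str, float],      # parameters of the probed point
        target: float,                    # observed target value
        constraint: float | None = None,  # constraint value, if the problem is constrained
        is_new_max: bool = False,         # whether this point improved on the best target
    ) -> None:
        """Log one step without needing the whole BayesianOptimization object."""
        ...
```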
bayes_opt/bayesian_optimization.py
Outdated
@@ -105,6 +76,27 @@ class BayesianOptimization(Observable):
This behavior may be desired in high noise situations where repeatedly probing
the same point will give different answers. In other situations, the acquisition
may occasionally generate a duplicate point.

Attributes
@adrianmolzon could you build the docs and check whether this renders correctly?
bayes_opt/logger.py
Outdated
self._verbose = verbose
self._is_constrained = is_constrained
self._params_config = params_config
It's not ideal that we need to copy this whole object over, but maybe there is no other way (?)
bayes_opt/logger.py
Outdated
----------
path : str or os.PathLike
    Path to the file to write to.

def _time_metrics(self) -> tuple[str, float, float]:
What is this used for?
Good question. Right now, nothing. In the old logging functionality, information about the time and the time elapsed was saved to the log file. Do you still want that information? I would have to decide where to save that information and when to present it, since that log file doesn't exist anymore.
I'm happy to remove this function and the associated tests if it's not necessary information
yeah, let's delete that :)
bayes_opt/logger.py
Outdated
is_new_max = self._is_new_max(current_max)
self._update_tracker(current_max)

if self._verbose != 1 or is_new_max:
Why the verbose check here 🤔
Because the old system had the same functionality and I mindlessly copied the code to the new system. I think the idea makes sense: we should only print if we have a new max or if we want more verbosity. The check for "not equals 1" is definitely weird, though; arguably it should be >= 1?
Writing this out for my sanity:
- verbose=0: No printing to console
- verbose=1: Printing maxima only
- verbose=2: Print all steps
Maybe this handles the colouring, and in the verbose=1 case we don't want any colour, as the whole table would then be highlighted, making that kinda pointless?
In any case, could maybe verify the behaviour is what one would expect?
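For reference, a minimal sketch of that intended gating; this is an assumption based on the table above, not the PR's actual code:

```python
# Minimal sketch of the verbosity gating discussed above; an assumption based
# on the three-level table, not the PR's actual implementation.
def should_print(verbose: int, is_new_max: bool) -> bool:
    if verbose == 0:
        return False        # verbose=0: no printing to console
    if verbose == 1:
        return is_new_max   # verbose=1: print new maxima only
    return True             # verbose>=2: print every step
```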
Yeah we have the same expected logic, I'll make sure we're good
@adrianmolzon if you feel like losing your sanity you could also have a look at tackling #515.
Cute, I'll see if I can fix this. Looks like a good time to knock this out.
Yeah, this sucked, but I've made some changes to fix it. To make things easier, any number over a million will be converted into scientific notation; hope this doesn't ruin anybody's day.
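For illustration, a minimal sketch of such a formatting rule; the cutoff, width, and precision here are assumptions rather than the PR's actual values:

```python
# Illustrative sketch of the rule described above: values at or above one million
# are printed in scientific notation. Width and precision are assumptions.
def format_number(value: float, width: int = 10, precision: int = 4) -> str:
    if abs(value) >= 1e6:
        return f"{value:<{width}.{precision}e}"
    return f"{value:<{width}.{precision}g}"
```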
Hey all, I'm back with another pull request! This one removes some of the old functionality for saving logs, since I've created a better way of saving state. In doing so, I've tried to simplify the file structure, mostly by deleting unnecessary files, but also by moving all the logging methods into a Logger class in logger.py. This class is now created when the BayesianOptimization object is initialized, and it exposes the steps previously invoked through the Observable pattern: optimization_start, optimization_step, and optimization_end. I think this unification is cleaner. Changes to the format of the Logger can still be made; they are just stored under optimizer.logger.whateveryouwant.
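For illustration, a hedged usage sketch of the structure described above; the objective function and bounds are placeholders, and only the optimizer.logger attachment is taken from this description:

```python
# Usage sketch based on the PR description: the Logger is created when the
# BayesianOptimization object is initialized and is reachable as optimizer.logger.
# The objective function and bounds below are placeholders.
from bayes_opt import BayesianOptimization


def black_box(x: float, y: float) -> float:
    return -x ** 2 - (y - 1) ** 2 + 1


optimizer = BayesianOptimization(
    f=black_box,
    pbounds={"x": (-2.0, 2.0), "y": (-3.0, 3.0)},
    verbose=2,
)
optimizer.maximize(init_points=2, n_iter=5)

# Logging configuration now lives on the attached Logger instance,
# e.g. optimizer.logger.<attribute>, instead of on separate observer objects.
print(optimizer.logger)
```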