Replace event based logging #557
Codecov Report

```
@@            Coverage Diff             @@
##           master     #557      +/-   ##
==========================================
+ Coverage   96.43%   97.92%   +1.49%
==========================================
  Files          12       10       -2
  Lines        1235     1158      -77
==========================================
- Hits         1191     1134      -57
+ Misses         44       24      -20
```

☔ View full report in Codecov by Sentry.
I think it's generally great that we're moving on from the observer pattern. However, I think it would be better not to pass the whole `BayesianOptimization` object into these steps, but to pass only the specific relevant information. E.g., `log_optimization_step` should take the `params` of the point, the `target` value, the `constraint` value (if applicable), and a boolean `is_new_max`. Similarly, the `log_optimization_start` function should accept the parameter names, etc.

The only argument I can see against this is if we want to allow people to write their own loggers, but I don't really see a use case for that.
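(A hypothetical sketch of the signatures being suggested here; the method names follow the discussion, but the bodies and types are purely illustrative and not the actual `bayes_opt` implementation.)

```python
from typing import Mapping, Optional, Sequence


class Logger:
    def __init__(self) -> None:
        self._iteration = 0

    def log_optimization_start(self, param_names: Sequence[str]) -> None:
        # Only the parameter names are needed to print the table header.
        header = " | ".join(["iter", "target", *param_names])
        print(header)
        print("-" * len(header))

    def log_optimization_step(
        self,
        params: Mapping[str, float],
        target: float,
        constraint: Optional[float] = None,
        is_new_max: bool = False,
    ) -> None:
        # The caller passes the relevant values instead of the whole
        # BayesianOptimization object.
        self._iteration += 1
        cells = [str(self._iteration), f"{target:.4g}"]
        cells += [f"{v:.4g}" for v in params.values()]
        if constraint is not None:
            cells.append(f"{constraint:.4g}")
        print(("* " if is_new_max else "  ") + " | ".join(cells))


# Example of a caller feeding in plain values:
logger = Logger()
logger.log_optimization_start(["x", "y"])
logger.log_optimization_step({"x": 1.0, "y": -2.0}, target=0.5, is_new_max=True)
```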
@adrianmolzon if you feel like losing your sanity, you could also have a look at tackling #515.

Cute, I'll see if I can fix this. Looks like a good time to knock this out.
Yeah, this sucked, but I've made some changes to fix it. To make things easier, any number over a million will be converted to scientific notation; hope this doesn't ruin anybody's day.
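(As a rough illustration of that kind of fallback, not the actual code in this PR: a column formatter can switch to scientific notation when a value is too large to fit its fixed-width cell.)

```python
def format_number(value: float, width: int = 10, precision: int = 4) -> str:
    """Fixed-point by default; scientific notation for very large values."""
    fixed = f"{value:<{width}.{precision}f}"
    if abs(value) >= 1e6 or len(fixed) > width:
        # Scientific notation keeps the column width stable.
        return f"{value:<{width}.{precision}e}"
    return fixed


print(format_number(3.14159))      # '3.1416    '
print(format_number(123456789.0))  # '1.2346e+08'
```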
I think this is good to go! Will merge after Easter :)

Thanks! :)
Hey all, I'm back with another pull request! This one removes some of the old functionality for saving logs, since I've created a better way of saving state. In doing so, I've tried to simplify the file structure, mostly by deleting unnecessary files, but also by moving all the logging methods into a `Logger` class in `logger.py`. This class is now instantiated when the `BayesianOptimization` object is initialized, and it exposes the steps previously invoked through the Observable pattern: `optimization_start`, `optimization_step`, and `optimization_end`. I think this unification is cleaner. Changes to the format of the `Logger` can still be made; they are just stored under `optimizer.logger.whateveryouwant`.
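(A rough usage sketch based on that description: the constructor and `maximize` calls are the existing `bayes_opt` API, while the `optimizer.logger` attribute follows the text above and its exact name may differ in the merged code.)

```python
from bayes_opt import BayesianOptimization


def black_box(x, y):
    # Toy objective with a known maximum at (0, 1).
    return -x ** 2 - (y - 1) ** 2 + 1


optimizer = BayesianOptimization(
    f=black_box,
    pbounds={"x": (-2, 2), "y": (-3, 3)},
    verbose=2,
)

# Per the description, the Logger is created inside __init__ and lives on the
# optimizer, so formatting tweaks no longer go through the Observable/Events
# machinery -- they are attributes of optimizer.logger.
logger = optimizer.logger

optimizer.maximize(init_points=2, n_iter=5)
```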