AirNicco8/CTuRBO

A constrained version of TuRBO, a Bayesian optimization algorithm.
Fork

This is a fork of the code release for the TuRBO algorithm from Scalable Global Optimization via Local Bayesian Optimization, which appeared at NeurIPS 2019. The original implementation targets the noise-free case and may not work well if observations are noisy, since in that case the center of the trust region should be chosen based on the posterior mean.

Note that TuRBO is a minimization algorithm, so please make sure you reformulate potential maximization problems.
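For example, a maximization problem can be handed to TuRBO simply by negating the objective. A minimal sketch (the score function here is hypothetical and not part of this repo):

import numpy as np

def quality(x):
    # Hypothetical score we would like to maximize.
    return -np.sum((x - 0.5) ** 2)

def objective(x):
    # Negated score: minimizing this is equivalent to maximizing quality(x).
    return -quality(x)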

Differences

This repo employs the TuRBO code for the UniBo AITI exam project, which entails using Bayesian optimization for Vertical Matchmaking. In particular, this code makes it possible to launch the algorithm with bounded time and/or space constraints. The implementation is based on Scalable Constrained Bayesian Optimization (SCBO), a constrained version of TuRBO published by the same authors. A rough illustration of this constrained setting is sketched below.
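As a generic SCBO-style illustration (not the exact interface of this repo), a time- or memory-bounded run can be phrased as minimizing an objective subject to inequality constraints c(x) <= 0. A minimal sketch with hypothetical objective and constraint models:

import numpy as np

def objective(x):
    # Hypothetical objective: predicted solution cost for configuration x.
    return float(np.sum(x ** 2))

def time_constraint(x, max_time=150.0):
    # Hypothetical runtime model (seconds); feasible when the value is <= 0.
    predicted_time = 10.0 * float(np.sum(np.abs(x)))
    return predicted_time - max_time

def memory_constraint(x, max_mem=200.0):
    # Hypothetical memory model (megabytes); feasible when the value is <= 0.
    predicted_mem = 50.0 * float(np.max(np.abs(x)))
    return predicted_mem - max_mem

def is_feasible(x):
    # A candidate is feasible only if every constraint is satisfied.
    return time_constraint(x) <= 0 and memory_constraint(x) <= 0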

Usage

Disclaimer: this is an alpha version that uses data from the CSV dataset and does not actually compute the function to be optimized.

Additional files are needed to use pre-trained GPs on different data splits: https://drive.google.com/drive/folders/1EmZI_sUcJigxTJKhmiz4nYyMEtki1BKo?usp=sharing

In particular, the folder dataset_splits is required.

The script main.py launches the TuRBO algorithm on the Anticipate or Contingency dataset; the time, solution quality, and/or memory constraints can be passed as arguments from the terminal, as in this example:

python main.py --max_time 150 --max_mem 200 --trust_regions 2

The time is expressed in seconds and the memory in megabytes. It is also possible to choose the number of trust regions maintained by TuRBO and to run the algorithm with GPs pretrained on dataset splits (frozen or not). To customize other arguments, run:

python main.py -h

Citing the original authors

The final version of the paper is available at: http://papers.nips.cc/paper/8788-scalable-global-optimization-via-local-bayesian-optimization.

@inproceedings{eriksson2019scalable,
  title = {Scalable Global Optimization via Local {Bayesian} Optimization},
  author = {Eriksson, David and Pearce, Michael and Gardner, Jacob and Turner, Ryan D and Poloczek, Matthias},
  booktitle = {Advances in Neural Information Processing Systems},
  pages = {5496--5507},
  year = {2019},
  url = {http://papers.nips.cc/paper/8788-scalable-global-optimization-via-local-bayesian-optimization.pdf},
}
