
DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization


Welcome to DEHB, an algorithm for Hyperparameter Optimization (HPO). DEHB uses Differential Evolution (DE) under the hood as an Evolutionary Algorithm to power the black-box optimization that HPO problems pose.

dehb is a Python package implementing the DEHB algorithm. It offers an intuitive interface for optimizing user-defined problems with DEHB.

Getting Started

Installation

pip install dehb

Using DEHB

DEHB allows users to either utilize the Ask & Tell interface for manual task distribution or leverage the built-in functionality (run) to set up a Dask cluster autonomously. The following snippet gives a brief look at how to use DEHB. For further information, please refer to the getting-started examples in our documentation.
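
The snippet below assumes a ConfigSpace search space and a target function roughly like the following sketch. The return format (a dictionary with "fitness" and "cost") and the toy objective are illustrative assumptions; check the documentation of your installed dehb version for the exact contract.

import ConfigSpace as CS

# Illustrative one-dimensional search space; replace with your own hyperparameters.
config_space = CS.ConfigurationSpace()
config_space.add_hyperparameter(CS.UniformFloatHyperparameter("x", lower=-5, upper=5))
dimensions = len(config_space.get_hyperparameters())
min_fidelity, max_fidelity = 1, 10

def your_target_function(config, fidelity, **kwargs):
    # Toy objective: evaluate the configuration at the given fidelity.
    loss = config["x"] ** 2
    # Assumed return contract: objective value under "fitness", compute cost under "cost".
    return {"fitness": loss, "cost": fidelity}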

from dehb import DEHB

# Set up the optimizer over the search space and fidelity range
optimizer = DEHB(
    f=your_target_function,
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity,
)

##### Using Ask & Tell
# Ask for next configuration to run
job_info = optimizer.ask()

# Run the configuration for the given fidelity. Here you can freely distribute
# the computation to any worker you'd like.
result = your_target_function(config=job_info["config"], fidelity=job_info["fidelity"])

# Once you have the result, feed it back to the optimizer
optimizer.tell(job_info, result)

##### Using run()
# Run optimization for 1 bracket. Output files will be saved to ./logs
traj, runtime, history = optimizer.run(brackets=1, verbose=True)
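
The three return values are, roughly, the incumbent trajectory, the runtimes, and the full evaluation history. A minimal sketch of inspecting them follows; the exact record format depends on the installed dehb version.

# Sketch: inspect the optimization outputs (contents may vary by version).
print(f"Best (incumbent) score seen: {traj[-1]}")
print(f"Number of evaluations recorded: {len(history)}")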

Running DEHB in a parallel setting

For a more in-depth look at how to run DEHB in a parallel setting, please have a look at our documentation.
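
As a minimal sketch (assuming the n_workers constructor argument, which lets DEHB set up and manage a local Dask cluster itself), a parallel run can look like this:

# Sketch: DEHB distributes evaluations over 4 Dask workers it manages itself.
optimizer = DEHB(
    f=your_target_function,
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity,
    n_workers=4,
)
traj, runtime, history = optimizer.run(brackets=1, verbose=True)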

Tutorials/Example notebooks

To run the PyTorch example (note the additional requirements):

python examples/03_pytorch_mnist_hpo.py \
    --min_fidelity 1 \
    --max_fidelity 3 \
    --runtime 60 \
    --verbose

Documentation

For more details and features, please have a look at our documentation.

Contributing

Any contribution is greatly appreciated! Please take the time to check out our contributing guidelines.

DEHB Hyperparameters

We recommend the default settings. They were chosen based on ablation studies over a collection of diverse problems and were found to be generally useful across all cases tested. The parameters remain available for tuning to a specific problem; see the example after the two lists below.

The Hyperband components:

  • min_fidelity: Needs to be specified for every DEHB instantiation and is used in determining the fidelity spacing for the problem at hand.
  • max_fidelity: Needs to be specified for every DEHB instantiation. Represents the full-fidelity evaluation or the actual black-box setting.
  • eta: (default=3) Controls the aggressiveness of Hyperband's early stopping by retaining 1/eta of the configurations every round.

The DE components:

  • strategy: (default=rand1_bin) Chooses the mutation and crossover strategies for DE. rand1 represents the mutation strategy while bin represents the binomial crossover strategy.
    Other mutation strategies include: {rand2, rand2dir, best, best2, currenttobest1, randtobest1}
    Other crossover strategies include: {exp}
    Mutation and crossover strategies can be combined with a _ separator, e.g. rand2dir_exp.
  • mutation_factor: (default=0.5) A fraction within [0, 1] weighing the difference operation in DE.
  • crossover_prob: (default=0.5) A probability within [0, 1] weighing the traits inherited from the parent versus the mutant.
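
As a rough sketch of overriding these defaults (keyword names assumed to match the constructor arguments of the installed dehb release):

optimizer = DEHB(
    f=your_target_function,
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity,
    eta=3,                  # Hyperband: keep 1/eta of configurations per round
    strategy="rand1_bin",   # DE: rand1 mutation + binomial crossover
    mutation_factor=0.5,    # DE: weight of the difference vector
    crossover_prob=0.5,     # DE: probability of inheriting from the mutant
)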

To cite the paper or code

@inproceedings{awad-ijcai21,
  author    = {N. Awad and N. Mallik and F. Hutter},
  title     = {{DEHB}: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization},
  pages     = {2147--2153},
  booktitle = {Proceedings of the Thirtieth International Joint Conference on
               Artificial Intelligence, {IJCAI-21}},
  publisher = {ijcai.org},
  editor    = {Z. Zhou},
  year      = {2021}
}
