
License: MIT License


Hyperparameter Tuning with Microsoft Neural Network Intelligence Toolkit

When you hear the words “automated machine learning”, what comes to your mind first? For me, it’s usually H2O.ai’s Driverless AI or Google’s Cloud AutoML. Microsoft has also released an open-source automated machine learning toolkit on GitHub that helps a user perform neural architecture search and hyperparameter tuning. Microsoft calls the toolkit ‘Neural Network Intelligence (NNI)’.

The diagram below illustrates the high-level architecture of NNI.

Overview

  • Neural Network Intelligence (NNI) is Microsoft’s open-source toolkit for automated machine learning.
  • The tool dispatches and runs trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyperparameters in different environments.
  • It helps you perform ML tasks like hyperparameter tuning and neural architecture search.
  • You need Python 3.6 or later to use NNI.

Installation

Prerequisites

  • Python 3.6 (or above) 64-bit. Anaconda or Miniconda is highly recommended to manage multiple Python environments on Windows.
  • On a newly installed Python environment, Microsoft C++ Build Tools is needed to build NNI dependencies such as scikit-learn. Install the build helpers as well:
pip install cython wheel

Installing NNI

  • From pip package
python -m pip install --upgrade nni
  • From source code
git clone -b v1.9 https://github.com/Microsoft/nni.git
cd nni
powershell -ExecutionPolicy Bypass -file install.ps1

🔵 What is Hyperparameter Tuning?

🔵 Hyperparameters

  • Hyperparameters are parameters that control the model training process (e.g. learning rate, batch size).
  • Hyperparameters are not learned from the model training process itself.

🔵 Hyperparameter Tuning

  • Finding hyperparameter values that optimize one or more evaluation metrics (e.g. accuracy on a test set).
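As a toy illustration of this definition (not NNI code), a brute-force grid search makes the loop concrete; the objective function below is invented for the demo and stands in for training a model and measuring test accuracy:

```python
from itertools import product

def evaluate(lr, batch_size):
    # stand-in for "train a model, return test accuracy";
    # this made-up objective peaks at lr=0.01, batch_size=64
    return 1.0 - abs(lr - 0.01) * 10 - abs(batch_size - 64) / 1000

grid = {"lr": [0.001, 0.01, 0.1], "batch_size": [32, 64, 128]}

# try every combination and keep the one with the best metric
best = max(product(grid["lr"], grid["batch_size"]),
           key=lambda cfg: evaluate(*cfg))
# best == (0.01, 64)
```

NNI automates exactly this search-and-evaluate loop, with smarter strategies (such as TPE) than exhaustive enumeration.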

🔵 Neural Network Intelligence (NNI) Toolkit

Set of tools to manage automated machine learning experiments to perform:

  • Feature Engineering
  • Hyperparameter Tuning
  • Neural Architecture Search
  • Model Compression

🔵 NNI supports:

  • Most popular ML frameworks (PyTorch, TensorFlow, MXNet and more).
  • Local machines, remote servers, Kubernetes clusters or cloud solutions.

🔵 NNI has several appealing properties: ease-of-use, scalability, flexibility, and efficiency.

✔️ Ease-of-use: NNI can be easily installed through Python pip. Only a few lines need to be added to your code in order to use NNI’s power. You can use both the command-line tool and the WebUI to work with your experiments.

✔️ Scalability: Tuning hyperparameters or the neural architecture often demands a large amount of computational resources, while NNI is designed to fully leverage different compute resources, such as remote machines and training platforms (e.g., OpenPAI, Kubernetes). Hundreds of trials can run in parallel, depending on the capacity of your configured training platforms.

✔️ Flexibility: Besides rich built-in algorithms, NNI allows users to customize hyperparameter tuning algorithms, neural architecture search algorithms, early stopping algorithms, etc. Users can also extend NNI with more training platforms, such as virtual machines or Kubernetes services in the cloud. Moreover, NNI can connect to external environments to tune special applications/models on them.

✔️ Efficiency: We are intensively working on more efficient model tuning at both the system and algorithm level. For example, we leverage early feedback to speed up the tuning procedure.
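Concretely, the “few lines” mentioned under ease-of-use are a handful of NNI API calls in the trial script. The sketch below is a minimal example, not the repository’s actual code: the parameter names mirror the search space used later in this README, the training function is a placeholder, and the script falls back to fixed defaults so it also runs outside an NNI experiment:

```python
try:
    import nni          # present when the script runs as an NNI trial
    HAS_NNI = True
except ImportError:
    HAS_NNI = False

# defaults so the script also works standalone
params = {"lr": 0.001, "batch_size": 64, "num_units": 128,
          "dropout_rate": 0.5, "activation": "relu"}
if HAS_NNI:
    # merge in the hyperparameter values suggested by the tuner
    params.update(nni.get_next_parameter() or {})

def train_and_eval(p):
    # placeholder for real training; return the metric the tuner optimizes
    return 0.9

accuracy = train_and_eval(params)
if HAS_NNI:
    nni.report_final_result(accuracy)   # report the final metric to the tuner
```

The tuner repeatedly proposes parameter sets via `nni.get_next_parameter()`, and the reported result drives which values it tries next.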

🔴 Creating the Hyperparameter Search Space

✅ In order to perform hyperparameter tuning, we first need to create the search space that describes the value range of each hyperparameter.

✅ We can use a JSON file to describe the ranges; this is the dictionary of all the hyperparameter values that we want to explore in our experiment.

{
  "dropout_rate": {
    "_type": "uniform",
    "_value": [0.1, 0.9]
  },

  "num_units": {
    "_type": "choice",
    "_value": [32, 64, 128, 256, 512]
  },

  "lr": {
    "_type": "choice",
    "_value": [0.0001, 0.0003, 0.0006, 0.001, 0.003, 0.006, 0.01, 0.03, 0.06]
  },

  "batch_size": {
    "_type": "choice",
    "_value": [32, 64, 128, 256, 512, 1024]
  },

  "activation": {
    "_type": "choice",
    "_value": ["relu", "sigmoid"]
  }
}
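To make the `_type` fields concrete, the snippet below simulates how a random tuner would draw one configuration from such a search space: `uniform` yields a float anywhere in the range, while `choice` picks one listed candidate. This is an illustrative re-implementation, not NNI’s own sampler, and it uses a trimmed copy of the search space above:

```python
import json
import random

search_space = json.loads("""
{
  "dropout_rate": {"_type": "uniform", "_value": [0.1, 0.9]},
  "num_units":    {"_type": "choice",  "_value": [32, 64, 128, 256, 512]},
  "lr":           {"_type": "choice",  "_value": [0.0001, 0.001, 0.01]},
  "activation":   {"_type": "choice",  "_value": ["relu", "sigmoid"]}
}
""")

def sample(space):
    """Draw one trial configuration from an NNI-style search space."""
    params = {}
    for name, spec in space.items():
        if spec["_type"] == "uniform":
            low, high = spec["_value"]
            params[name] = random.uniform(low, high)      # float in [low, high]
        elif spec["_type"] == "choice":
            params[name] = random.choice(spec["_value"])  # one listed candidate
        else:
            raise ValueError(f"unsupported _type: {spec['_type']}")
    return params

config = sample(search_space)
```

Smarter tuners such as TPE draw from the same space but bias later samples toward regions that produced good results.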

🔴 Setting up the configuration

✅ The last thing we need in order to run our experiment is another file called config.yml, which contains all the configuration information for the experiment.

authorName: mohd faizy
experimentName: mnist
trialConcurrency: 1
maxExecDuration: 1h
maxTrialNum: 10
#choice: local, remote, pai
trainingServicePlatform: local
#choice: true, false
useAnnotation: false
searchSpacePath: search_space.json
tuner:
  #choice: TPE, Random, Anneal, Evolution, BatchTuner, MetisTuner
  #SMAC (SMAC should be installed through nnictl)
  builtinTunerName: TPE
  classArgs:
    #choice: maximize, minimize
    optimize_mode: maximize
trial:
  command: python3 "01_[Hyper-parameter Tuning] NNI.py"
  codeDir: .

✔️ To start the experiment, use nnictl and pass config.yml on the command line:

nnictl create --config /content/config.yml

🔳 Launching the NNI Dashboard

✔️ When the experiment starts, nnictl prints the Web UI URL (http://localhost:8080 by default); open it in a browser to monitor the running trials.

🔳 Intermediate Results

🔳 Hyperparameter Visualization

🚩 Top 100 percent

🚩 Top 50 percent

🚩 Top 20 percent

