
Quantized-Nets

This mini-project contains code for building Binary, Ternary and N-bit Quantized Convolutional Neural Networks with Keras or TensorFlow.

Introduction

Low-precision networks have recently gained popularity because they suit devices with limited compute and memory. During the forward pass, Quantized Neural Networks (QNNs) drastically reduce memory size and accesses, and replace most arithmetic operations with bit-wise operations.
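To make the bit-wise claim concrete, here is a small, purely illustrative Python sketch (not code from this repository) of how a dot product between {-1, +1} vectors reduces to XNOR plus popcount:

import numpy as np

def binary_dot(a_bits, b_bits, n):
    # a_bits, b_bits: integers whose lowest n bits encode {-1, +1} vectors
    # (bit 1 encodes +1, bit 0 encodes -1).
    xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)  # 1 wherever the signs agree
    matches = bin(xnor).count("1")              # popcount
    return 2 * matches - n

# Sanity check against ordinary arithmetic on +/-1 vectors.
a = np.array([1, -1, 1, 1])
b = np.array([1, 1, -1, 1])
pack = lambda v: int("".join("1" if x > 0 else "0" for x in v), 2)
assert binary_dot(pack(a), pack(b), len(a)) == int(a @ b)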

Various binarization, ternarization and quantization schemes have been published for weights and activations.

Image Source: Minimum Energy Quantized Neural Networks

The binarization function used in these experiments is a deterministic binary-tanh, defined in binary_ops.py.
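A minimal TensorFlow sketch of a deterministic binary-tanh with a straight-through gradient estimator is shown below; the exact implementation in binary_ops.py may differ in details:

import tensorflow as tf

def round_through(x):
    # Round in the forward pass; pass the gradient through unchanged
    # (straight-through estimator).
    return x + tf.stop_gradient(tf.round(x) - x)

def hard_sigmoid(x):
    return tf.clip_by_value((x + 1.0) / 2.0, 0.0, 1.0)

def binary_tanh(x):
    # Deterministic binary-tanh: outputs lie in {-1, +1}.
    return 2.0 * round_through(hard_sigmoid(x)) - 1.0

print(binary_tanh(tf.constant([-0.7, 0.2, 1.5])))  # [-1.  1.  1.]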

Setup Dependencies

Python 3 is the recommended version for running the experiments.

  1. Follow the installation guide on the TensorFlow homepage to install the GPU or CPU build of TensorFlow.
  2. Follow the instructions on the Keras homepage to install Keras.
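For example, on a typical Python 3 setup a CPU-only environment can usually be prepared with pip (exact package names and versions depend on your platform):

pip3 install tensorflow   # or the GPU build, per the TensorFlow guide
pip3 install keras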

Project Structure

The skeletal overview of the project is as follows:

.
├── binarize/
│   ├── binary_layers.py  # Custom binary layers are defined in Keras 
│   └── binary_ops.py     # Binarization functions for weights and activations
│
├── ternarize/
│   ├── ternary_layers.py  # Custom ternarized layers are defined in Keras
│   └── ternary_ops.py     # Ternarization functions for weights and activations
│
├── quantize/
│   ├── quantized_layers.py  # Custom quantized layers are defined in Keras
│   └── quantized_ops.py     # Quantization functions for weights and activations
│
├── base_ops.py           # Stores generic operations              
├── binary_net.py         # Implementation of Binarized Neural Networks
├── ternary_net.py        # Implementation of Ternarized Neural Networks
└── quantized_net.py      # Implementation of Quantized Neural Networks
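
For intuition, the kind of functions the *_ops.py files implement can be sketched as follows; the thresholds, clipping range and rounding details here are illustrative assumptions rather than the repository's exact code:

import tensorflow as tf

def ternarize(w, threshold=0.5):
    # Map weights to {-1, 0, +1}; values inside the threshold band become 0.
    q = tf.where(w > threshold, tf.ones_like(w),
                 tf.where(w < -threshold, -tf.ones_like(w), tf.zeros_like(w)))
    return w + tf.stop_gradient(q - w)  # straight-through gradient

def quantize(w, nb=4):
    # Uniform nb-bit quantization of weights clipped to [-1, 1].
    scale = 2.0 ** (nb - 1)
    clipped = tf.clip_by_value(w, -1.0, 1.0)
    q = tf.round(clipped * scale) / scale
    return w + tf.stop_gradient(q - w)  # straight-through gradient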

Usage

From the root directory, run the examples with:

python3 {example}_net.py  # for binary and ternary net

python3 quantized_net.py -nb N  # for quantized net; replace N with the number of bits to quantize weights and activations to (default: N = 4)

You can also import the layers directly into your own Keras or TensorFlow code; read this blog to learn how to use Keras layers in TensorFlow.
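
As a rough usage sketch, the custom layers can be dropped into an ordinary Keras model. The names BinaryDense and binary_tanh below are assumptions based on the file layout; check binarize/binary_layers.py and binarize/binary_ops.py for the actual exports:

from keras.models import Sequential
from keras.layers import Flatten

# Hypothetical imports -- verify the real class/function names in this repo.
from binarize.binary_layers import BinaryDense
from binarize.binary_ops import binary_tanh

model = Sequential([
    Flatten(input_shape=(28, 28)),
    BinaryDense(128, activation=binary_tanh),  # binarized weights
    BinaryDense(10),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")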

Thanks to

This work wouldn't have been possible without help from the following repositories:

  1. https://github.com/DingKe/nn_playground/
  2. https://github.com/BertMoons/QuantizedNeuralNetworks-Keras-Tensorflow

Contributors

  • yashkant
