ISTA DAS Lab Optimization Algorithms Package

This repository contains optimization algorithms for Deep Learning developed by the Distributed Algorithms and Systems (DAS) lab at the Institute of Science and Technology Austria (ISTA).

The repository contains code for the following optimizers published by DASLab @ ISTA:

  • MicroAdam (micro_adam)
  • M-FAC, dense variant (dense_mfac)
  • Sparse M-FAC (sparse_mfac)
  • AC/DC (acdc)

Installation

To use the latest stable version, install the package via pip:

pip3 install ista-daslab-optimizers
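
After installation, you can run a quick sanity check. The one-liner below is a sketch; it assumes the installed distribution name matches the pip package name above and prints the installed version:

python3 -c "import ista_daslab_optimizers; from importlib.metadata import version; print(version('ista-daslab-optimizers'))"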

We also provide a script install.sh that creates a new environment, installs the requirements, and then installs the project as a Python package. To use it, follow these steps:

git clone git@github.com:IST-DASLab/ISTA-DASLab-Optimizers.git
cd ISTA-DASLab-Optimizers
source install.sh

How to use the optimizers?

In this repository we provide a minimal working example on CIFAR-10 for the optimizers acdc, dense_mfac, sparse_mfac and micro_adam:

cd examples/cifar10
OPTIMIZER=micro_adam # or any other optimizer listed above
bash run_${OPTIMIZER}.sh

To integrate the optimizers into your own pipeline, you can use the following snippets:

MicroAdam optimizer

from ista_daslab_optimizers import MicroAdam

model = MyCustomModel()

optimizer = MicroAdam(
    model.parameters(), # or some custom parameter groups
    m=10, # sliding window size (number of gradients)
    lr=1e-5, # change accordingly
    quant_block_size=100_000, # 32 or 64 also works
    k_init=0.01, # float between 0 and 1, interpreted as a fraction: 0.01 means 1% density
    alpha=0, # 0 means a fully sparse update; 0 < alpha < 1 integrates the fraction alpha of the error feedback (EF) into the update and then removes it from the EF
)

# from now on, you can use the variable `optimizer` as any other PyTorch optimizer
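
For reference, here is a minimal training-loop sketch showing where the optimizer fits; `num_epochs`, `train_loader` and the cross-entropy loss are illustrative placeholders and should be replaced by your own training setup:

import torch
import torch.nn.functional as F

for epoch in range(num_epochs): # num_epochs is a placeholder
    for inputs, targets in train_loader: # any torch.utils.data.DataLoader
        optimizer.zero_grad()
        loss = F.cross_entropy(model(inputs), targets) # placeholder loss
        loss.backward() # gradients are consumed by optimizer.step()
        optimizer.step()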

Versions summary:


  • 1.1.2 @ August 1st, 2024:

    • [1.1.0]: added support for densifying the final update: introduced the parameter alpha, which controls the fraction of the error feedback (EF) that is integrated into the update to make it dense. This fraction alpha is then discarded from the EF, at the expense of an extra call to Qinv and Q (and, implicitly, of recomputing the quantization statistics).
    • [1.0.2]: added an FSDP-compatible implementation by initializing the parameter states in the update_step method instead of in the MicroAdam constructor
  • 1.0.1 @ June 27th, 2024:

    • removed the version constraints from the dependencies to avoid conflicts with llm-foundry
  • 1.0.0 @ June 20th, 2024:

    • changed the minimum required Python version to 3.8+ and torch to 2.3.0+
  • 0.0.1 @ June 13th, 2024:

    • added the initial version of the package for Python 3.9+ and torch 2.3.1+
