
Training-free Neural Architecture Search for RNNs and Transformers [paper]

Aaron Serianni (Princeton University) and Jugal Kalita (University of Colorado at Colorado Springs)

Neural architecture search (NAS) has allowed for the automatic creation of new and effective neural network architectures, offering an alternative to the laborious process of manually designing complex architectures. However, traditional NAS algorithms are slow and require immense amounts of computing power. Recent research has investigated training-free NAS metrics for image classification architectures, drastically speeding up search algorithms. In this paper, we investigate training-free NAS metrics for recurrent neural network (RNN) and BERT-based transformer architectures, targeted towards language modeling tasks. First, we develop a new training-free metric, named hidden covariance, that predicts the trained performance of an RNN architecture and significantly outperforms existing training-free metrics. We experimentally evaluate the effectiveness of the hidden covariance metric on the NAS-Bench-NLP benchmark. Second, we find that the current search space paradigm for transformer architectures is not optimized for training-free neural architecture search. Instead, a simple qualitative analysis can effectively shrink the search space to the best performing architectures. This conclusion is based on our investigation of existing training-free metrics and new metrics developed from recent transformer pruning literature, evaluated on our own benchmark of trained BERT architectures. Ultimately, our analysis shows that the architecture search space and the training-free metric must be developed together in order to achieve effective results.

This paper will be published as a long paper at ACL 2023.

Code

All dataset files and training-free metric results are in data/. The NAS-Bench-BERT benchmark is described in data/BERT_benchmark.json.
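For illustration, here is a minimal sketch (not the authors' code) of loading the benchmark and rank-correlating a training-free metric against trained performance, as the paper's evaluation does. The JSON field names used below are hypothetical placeholders; check data/BERT_benchmark.json for the real schema.

```python
# Hypothetical sketch: rank-correlate a training-free metric with trained
# performance on NAS-Bench-BERT. The field names "architectures",
# "metric_score", and "val_loss" are assumptions for illustration only;
# see data/BERT_benchmark.json for the actual schema.
import json

from scipy.stats import kendalltau

with open("data/BERT_benchmark.json") as f:
    benchmark = json.load(f)

scores = [arch["metric_score"] for arch in benchmark["architectures"]]  # assumed key
losses = [arch["val_loss"] for arch in benchmark["architectures"]]      # assumed key

tau, _ = kendalltau(scores, losses)
print(f"Kendall tau between metric and trained loss: {tau:.3f}")
```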

Required packages are listed in requirements.txt and can be installed with pip install -r requirements.txt. The Dockerfile defining the container in which the code was run (on a macOS host) is also included.

Run BERT_metrics.ipynb and RNN_metrics.ipynb to reproduce results, and BERT_stats.ipynb and RNN_stats.ipynb to create figures.
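For intuition about the RNN metric, below is a minimal PyTorch sketch of a hidden-covariance-style score. It is not the notebook's exact implementation; it assumes a NASWOT-style log-determinant over the covariance of final hidden states from a single untrained forward pass.

```python
# Sketch of a hidden-covariance-style training-free metric. This is an
# assumption modeled on NASWOT-style log-determinant scoring, not the
# authors' exact code; see RNN_metrics.ipynb for the real implementation.
import torch


def hidden_covariance_score(rnn, inputs, eps=1e-5):
    """Score an untrained RNN via the covariance of its final hidden states.

    rnn    -- a torch.nn.RNN/GRU/LSTM-like module returning (output, hidden)
    inputs -- one minibatch of shape (seq_len, batch, input_size)
    """
    with torch.no_grad():
        _, hidden = rnn(inputs)
        if isinstance(hidden, tuple):        # LSTMs return (h_n, c_n)
            hidden = hidden[0]
        h = hidden[-1]                       # last layer: (batch, hidden_size)
        h = h - h.mean(dim=0, keepdim=True)  # center over the minibatch
        cov = h @ h.t()                      # (batch, batch) covariance kernel
        # Log-determinant of the regularized covariance as the score.
        _, logdet = torch.slogdet(cov + eps * torch.eye(cov.shape[0]))
        return logdet.item()
```

For example, `hidden_covariance_score(torch.nn.LSTM(64, 128), torch.randn(32, 16, 64))` scores a randomly initialized LSTM from one batch of random inputs.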

nas_runner.ipynb reproduces the NAS-Bench-BERT benchmark; it requires Google Colab and Google Cloud Storage for TPU training of the transformer architectures. space_maker.ipynb generates the config file that defines NAS-Bench-BERT.

Citation

If you use our code in your work, please cite:

@misc{serianni2023trainingfree,
      title={Training-free Neural Architecture Search for RNNs and Transformers}, 
      author={Aaron Serianni and Jugal Kalita},
      year={2023},
      eprint={2306.00288},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

The proceedings citation in the ACL Anthology will be added later.
