
Burn


This library aims to be a complete deep learning framework with extreme flexibility, written in Rust. The goal is to satisfy researchers as well as practitioners, making it easier to experiment with, train and deploy your models.


Features

  • Flexible and intuitive custom neural network module 🔥
  • Training with full support for metrics, logging and checkpointing 📈
  • Tensor crate with backends as plugins 🔧
    • Tch backend with CPU/GPU support 🚀
    • NdArray backend with fast compile time 👌
    • Autodiff backend making any backend differentiable 🌟
  • Dataset crate with multiple utilities and sources 📚

Get Started

The best way to get started with burn is to clone the repo and play with the examples. It may also be a good idea to take a look at the main components of burn to get a quick overview of the fundamental building blocks.

Examples

  • MNIST: train a model on CPU/GPU using different backends.
  • Text Classification: train a transformer encoder from scratch on GPU.

Components

Knowing the main components will be of great help when you start playing with burn.

Backend

Almost everything is based on the Backend trait, which allows you to run tensor operations with different implementations without having to change your code. A backend does not necessarily have autodiff capabilities; the ADBackend trait is there to specify when autodiff is required.
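To illustrate the idea, here is a minimal plain-Rust sketch (this is not burn's actual Backend trait, just a toy analogue) of how a backend trait lets the same generic code run on interchangeable implementations:

```rust
// Toy stand-in for burn's Backend trait: generic code is written once
// against the trait and works with any implementation.
trait Backend {
    type Elem;
    fn add(a: &[Self::Elem], b: &[Self::Elem]) -> Vec<Self::Elem>;
}

struct CpuBackend;

impl Backend for CpuBackend {
    type Elem = f32;
    fn add(a: &[f32], b: &[f32]) -> Vec<f32> {
        a.iter().zip(b).map(|(x, y)| x + y).collect()
    }
}

// A second (hypothetical) backend: same math, different execution strategy.
struct OtherBackend;

impl Backend for OtherBackend {
    type Elem = f32;
    fn add(a: &[f32], b: &[f32]) -> Vec<f32> {
        a.iter().zip(b).map(|(x, y)| x + y).collect()
    }
}

// Written once against the trait, usable with any backend.
fn sum_pair<B: Backend>(a: &[B::Elem], b: &[B::Elem]) -> Vec<B::Elem> {
    B::add(a, b)
}

fn main() {
    let out = sum_pair::<CpuBackend>(&[1.0, 2.0], &[3.0, 4.0]);
    println!("{:?}", out); // [4.0, 6.0]
    let out = sum_pair::<OtherBackend>(&[1.0, 2.0], &[3.0, 4.0]);
    println!("{:?}", out); // [4.0, 6.0]
}
```

Swapping the backend is a type-level change only; the calling code stays the same, which is exactly what the real Backend trait buys you with tensors.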

Tensor

The Tensor struct is at the core of the burn framework. It takes two generic parameters: the Backend and the number of dimensions D.

Backpropagation is also supported on any backend by making it auto-differentiable using a simple decorator.

use burn::tensor::backend::{ADBackend, Backend};
use burn::tensor::{Distribution, Tensor};
use burn_autodiff::ADBackendDecorator;
use burn_ndarray::NdArrayBackend;
use burn_tch::TchBackend;

fn simple_function<B: Backend>() -> Tensor<B, 2> {
    let x = Tensor::<B, 2>::random([3, 3], Distribution::Standard);
    let y = Tensor::<B, 2>::random([3, 3], Distribution::Standard);

    x.matmul(&y)
}

fn simple_function_grads<B: ADBackend>() -> B::Gradients {
    let z = simple_function::<B>();

    z.backward()
}

fn main() {
    let _z = simple_function::<NdArrayBackend<f32>>(); // Compiles
    let _z = simple_function::<TchBackend<f32>>(); // Compiles

    let _grads = simple_function_grads::<NdArrayBackend<f32>>(); // Doesn't compile
    let _grads = simple_function_grads::<TchBackend<f32>>(); // Doesn't compile

    type ADNdArrayBackend = ADBackendDecorator<NdArrayBackend<f32>>;
    type ADTchBackend = ADBackendDecorator<TchBackend<f32>>;

    let _grads = simple_function_grads::<ADNdArrayBackend>(); // Compiles
    let _grads = simple_function_grads::<ADTchBackend>(); // Compiles
}

Module

The Module derive lets you create your own neural network modules, similar to PyTorch.

use burn::nn;
use burn::module::{Param, Module};
use burn::tensor::backend::Backend;

#[derive(Module, Debug)]
struct MyModule<B: Backend> {
  my_param: Param<nn::Linear<B>>,
  repeat: usize,
}

Note that only the fields wrapped inside Param are updated during training; the other fields should implement Clone.
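The distinction can be pictured with a toy sketch (plain Rust, not burn's real Param type or update machinery): wrapped fields are the ones an update pass visits, while plain fields such as `repeat` are simply carried along, which is why they need Clone.

```rust
// Toy stand-in for burn's Param wrapper: only wrapped fields are
// touched by the (hypothetical) update pass below.
#[derive(Clone, Debug)]
struct Param<T>(T);

#[derive(Clone, Debug)]
struct MyModule {
    my_param: Param<f32>, // trainable: visited by the update pass
    repeat: usize,        // plain field: just cloned, never updated
}

impl MyModule {
    // Apply an update function to every Param-wrapped field.
    fn update_params(mut self, f: impl Fn(f32) -> f32) -> Self {
        self.my_param = Param(f(self.my_param.0));
        self // `repeat` is untouched
    }
}

fn main() {
    let module = MyModule { my_param: Param(1.0), repeat: 3 };
    let module = module.update_params(|w| w * 0.5);
    println!("{:?}", module); // my_param halved, repeat unchanged
}
```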

Config

The Config derive lets you define serializable and deserializable configurations or hyper-parameters for your modules or any components.

use burn::config::Config;

#[derive(Config)]
struct MyConfig {
    #[config(default = 1.0e-6)]
    pub epsilon: f64,
    pub dim: usize,
}

The derive also adds useful methods to your config.

fn main() {
    let config = MyConfig::new(100);
    println!("{}", config.epsilon); // 1.0e-6
    println!("{}", config.dim); // 100
    let config = MyConfig::new(100).with_epsilon(1.0e-8);
    println!("{}", config.epsilon); // 1.0e-8
}
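A hand-written sketch of what the derive roughly generates (this is an approximation for intuition, not the exact macro expansion): `new` takes the fields without defaults, defaulted fields are filled in, and a `with_*` builder method is emitted per field.

```rust
// Approximate, hand-written equivalent of what #[derive(Config)]
// produces for MyConfig (sketch only, not the real expansion).
#[derive(Debug)]
struct MyConfig {
    pub epsilon: f64,
    pub dim: usize,
}

impl MyConfig {
    // Defaulted fields are omitted from `new` and filled in here.
    fn new(dim: usize) -> Self {
        Self {
            epsilon: 1.0e-6, // value from #[config(default = 1.0e-6)]
            dim,
        }
    }

    // One builder-style setter per field.
    fn with_epsilon(mut self, epsilon: f64) -> Self {
        self.epsilon = epsilon;
        self
    }
}

fn main() {
    let config = MyConfig::new(100).with_epsilon(1.0e-8);
    println!("{:?}", config);
}
```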

Learner

The Learner is the main struct that lets you train a neural network with support for logging, metrics, checkpointing and more. In order to create a learner, you must use the LearnerBuilder.

use burn::train::LearnerBuilder;
use burn::train::metric::{AccuracyMetric, LossMetric};

fn main() {
    let dataloader_train = ...;
    let dataloader_valid = ...;

    let model = ...;
    let optim = ...;

    let learner = LearnerBuilder::new("/tmp/artifact_dir")
        .metric_train_plot(AccuracyMetric::new())
        .metric_valid_plot(AccuracyMetric::new())
        .metric_train(LossMetric::new())
        .metric_valid(LossMetric::new())
        .with_file_checkpointer::<f32>(2)
        .num_epochs(10)
        .build(model, optim);

    let _model_trained = learner.fit(dataloader_train, dataloader_valid);
}

See this example for real-world usage.

License

Burn is distributed under the terms of both the MIT license and the Apache License (Version 2.0). See LICENSE-APACHE and LICENSE-MIT for details. Opening a pull request is assumed to signal agreement with these licensing terms.

