
Prediction-oriented Bayesian active learning

Freddie Bickford Smith*, Andreas Kirsch*, Sebastian Farquhar, Yarin Gal, Adam Foster, Tom Rainforth
International Conference on Artificial Intelligence and Statistics (AISTATS), 2023


Abstract

Information-theoretic approaches to active learning have traditionally focused on maximising the information gathered about the model parameters, most commonly by optimising the BALD score. We highlight that this can be suboptimal from the perspective of predictive performance. For example, BALD lacks a notion of an input distribution and so is prone to prioritise data of limited relevance. To address this we propose the expected predictive information gain (EPIG), an acquisition function that measures information gain in the space of predictions rather than parameters. We find that using EPIG leads to stronger predictive performance compared with BALD across a range of datasets and models, and thus provides an appealing drop-in replacement.
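
For intuition, here is a minimal sketch (not the repo's implementation) of how BALD and EPIG scores can be estimated for classification from Monte-Carlo predictive probabilities. The tensor names, shapes and the simple nested Monte-Carlo estimator are assumptions made for illustration only:

import torch

def entropy(probs: torch.Tensor) -> torch.Tensor:
    # Shannon entropy over the last dimension, in nats; clamp guards against log(0).
    return -(probs * torch.log(probs.clamp(min=1e-12))).sum(dim=-1)

def bald_scores(probs_pool: torch.Tensor) -> torch.Tensor:
    # probs_pool: [K, N, C] predictive probabilities from K parameter samples
    # for N candidate inputs with C classes (shapes assumed for illustration).
    # BALD(x) = H[E_theta p(y|x,theta)] - E_theta H[p(y|x,theta)].
    entropy_of_mean = entropy(probs_pool.mean(dim=0))  # [N]
    mean_of_entropy = entropy(probs_pool).mean(dim=0)  # [N]
    return entropy_of_mean - mean_of_entropy           # [N]

def epig_scores(probs_pool: torch.Tensor, probs_targ: torch.Tensor) -> torch.Tensor:
    # probs_pool: [K, N, C] for candidate inputs.
    # probs_targ: [K, M, C] for M target inputs drawn from the input distribution.
    # EPIG(x) = E_{x_*}[ I(y; y_* | x, x_*) ], estimated by averaging the mutual
    # information between y and y_* over the sampled target inputs.
    K = probs_pool.shape[0]
    # Joint predictive p(y, y_* | x, x_*): average over parameter samples of the
    # product of the two conditionals. Shape [N, M, C, C].
    probs_joint = torch.einsum("knc,kmd->nmcd", probs_pool, probs_targ) / K
    entropy_joint = entropy(probs_joint.flatten(start_dim=2))  # [N, M]
    entropy_pool = entropy(probs_pool.mean(dim=0))             # [N]
    entropy_targ = entropy(probs_targ.mean(dim=0))             # [M]
    mutual_info = entropy_pool[:, None] + entropy_targ[None, :] - entropy_joint
    return mutual_info.mean(dim=1)                             # [N]

Given predictive samples (for example from Monte-Carlo dropout), the candidate input with the highest epig_scores value would be acquired next, whereas BALD ranks candidates without reference to the target-input distribution.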

Running the code

Create a Conda environment:

conda env create --file environment.yaml
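
Then activate it. The environment name is defined in environment.yaml; epig is assumed here:

conda activate epig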

Run active learning with the default config:

python main.py

See the jobs directory for the commands used to run the experiments in the paper.

Contact

Get in touch with Freddie if you have any questions about this research or encounter any problems using the code. This repo is a partial release of a bigger internal repo, so it's possible that errors were introduced in preparing it for release.

Contributors

Andreas Kirsch wrote the original versions of the BALD and EPIG functions in this repo, along with the dropout layers, and advised on the code in general. Adam Foster and Joost van Amersfoort advised on the Gaussian-process implementation. Jannik Kossen provided a repo template and advised on the code in general.

Citation

Please cite our paper if you use our code or ideas in your work.

@article{
    bickfordsmith2023prediction,
    author = {{Bickford Smith}, Freddie and Kirsch, Andreas and Farquhar, Sebastian and Gal, Yarin and Foster, Adam and Rainforth, Tom},
    year = {2023},
    title = {Prediction-oriented {Bayesian} active learning},
    journal = {International Conference on Artificial Intelligence and Statistics},
}
