CTMKD

CTMKD is an inter-VAE knowledge distillation framework in which the teacher is a CombinedTM and the student is a ZeroShotTM. The proposed distillation objective is to minimize the cross-entropy between the soft labels produced by the teacher and the student models, and to minimize the squared 2-Wasserstein distance between the latent distributions learned by the two models.

Architecture of CTMKD: https://github.com/AdhyaSuman/CTMKD/blob/master/misc/KD_Arch_updated_v1.png
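For diagonal Gaussian latents (as in CombinedTM and ZeroShotTM), the squared 2-Wasserstein distance has a closed form, so the whole objective is a few lines of PyTorch. Below is a minimal sketch of the objective; the function name, the temperature, and the weight `alpha` are illustrative assumptions, not the repository's actual API.

```python
# Minimal sketch of the CTMKD distillation objective (illustrative only).
# Assumes both VAEs use diagonal Gaussian latent posteriors, for which
#   W2^2(N(mu_t, sig_t^2), N(mu_s, sig_s^2)) = ||mu_t - mu_s||^2 + ||sig_t - sig_s||^2
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits,
                      mu_s, logvar_s, mu_t, logvar_t,
                      temperature=2.0, alpha=1.0):
    # Cross-entropy between the teacher's softened output distribution
    # (the soft labels) and the student's softened output distribution.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    ce = -(soft_targets * log_probs).sum(dim=-1).mean()

    # Squared 2-Wasserstein distance between the two diagonal Gaussians.
    sigma_s = torch.exp(0.5 * logvar_s)
    sigma_t = torch.exp(0.5 * logvar_t)
    w2_sq = ((mu_t - mu_s) ** 2 + (sigma_t - sigma_s) ** 2).sum(dim=-1)

    return ce + alpha * w2_sq.mean()
```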

Datasets

We used the 20NewsGroup (20NG) and M10 datasets, both available in OCTIS.
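Both ship as preprocessed datasets in OCTIS and can be fetched by name. A minimal loading sketch, assuming a standard `pip install octis` setup:

```python
# Fetch one of the preprocessed OCTIS datasets by name.
from octis.dataset.dataset import Dataset

dataset = Dataset()
dataset.fetch_dataset("20NewsGroup")  # or "M10"
```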

How to cite this work?

This work has been accepted at ECIR 2023!

Read the paper:

  1. Springer
  2. arXiv

If you decide to use this resource, please cite:

@InProceedings{adhya2023improving,
    author="Adhya, Suman and Sanyal, Debarshi Kumar",
    editor="Kamps, Jaap and Goeuriot, Lorraine and Crestani, Fabio and Maistro, Maria and Joho, Hideo and Davis, Brian and Gurrin, Cathal and Kruschwitz, Udo and Caputo, Annalina",
    title="Improving Neural Topic Models with Wasserstein Knowledge Distillation",
    booktitle="Advances in Information Retrieval",
    year="2023",
    publisher="Springer Nature Switzerland",
    address="Cham",
    pages="321--330",
    abstract="Topic modeling is a dominant method for exploring document collections on the web and in digital libraries. Recent approaches to topic modeling use pretrained contextualized language models and variational autoencoders. However, large neural topic models have a considerable memory footprint. In this paper, we propose a knowledge distillation framework to compress a contextualized topic model without loss in topic quality. In particular, the proposed distillation objective is to minimize the cross-entropy of the soft labels produced by the teacher and the student models, as well as to minimize the squared 2-Wasserstein distance between the latent distributions learned by the two models. Experiments on two publicly available datasets show that the student trained with knowledge distillation achieves topic coherence much higher than that of the original student model, and even surpasses the teacher while containing far fewer parameters than the teacher. The distilled model also outperforms several other competitive topic models on topic coherence.",
    isbn="978-3-031-28238-6"}

Acknowledgment

All experiments were conducted using OCTIS, an integrated framework for topic modeling.

OCTIS: Silvia Terragni, Elisabetta Fersini, Bruno Giovanni Galuzzi, Pietro Tropeano, and Antonio Candelieri. 2021. OCTIS: Comparing and Optimizing Topic Models is Simple! EACL (System Demonstrations). https://www.aclweb.org/anthology/2021.eacl-demos.31/
