Topic: knowledge-distillation (Goto Github)
Something interesting about knowledge-distillation.
knowledge-distillation,Infrastructures™ for Machine Learning Training/Inference in Production.
User: 1duo
knowledge-distillation,Pytorch implementation of various Knowledge Distillation (KD) methods.
User: aberhu
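Collections like the one above implement many variants of logit-based KD. As orientation, here is a minimal sketch of the classic Hinton-style distillation loss in PyTorch; the temperature, mixing weight, and toy tensor shapes below are illustrative assumptions, not taken from any of the listed repositories.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T*T factor keeps the soft-loss gradient on the same scale as the hard loss.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: a batch of 8 samples over 10 classes (hypothetical shapes).
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, labels)
```

Most of the frameworks in this listing expose this loss (or a close relative) as one configurable option among many.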
knowledge-distillation,An Extendible (General) Continual Learning Framework based on Pytorch - official codebase of Dark Experience for General Continual Learning
Organization: aimagelab
knowledge-distillation,EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
Organization: alibaba
knowledge-distillation,EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Organization: alibaba
Home Page: https://www.yuque.com/easytransfer/cn/
knowledge-distillation,Code and resources on scalable and efficient Graph Neural Networks
User: chaitjo
Home Page: https://www.chaitjo.com/post/efficient-gnns/
knowledge-distillation,Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Organization: clovaai
knowledge-distillation,SlimSAM: 0.1% Data Makes Segment Anything Slim
User: czg1225
knowledge-distillation,Awesome Knowledge Distillation
User: dkozlov
knowledge-distillation,This repository aims at providing efficient CNNs for Audio Tagging. We provide AudioSet pre-trained models ready for downstream training and extraction of audio embeddings.
User: fschmid56
knowledge-distillation,A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
User: haitongli
knowledge-distillation,Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)
User: hikaritju
knowledge-distillation,A curated list for Efficient Large Language Models
User: horseee
knowledge-distillation,Knowledge distillation in text classification with PyTorch: Chinese text classification with BERT and XLNet teacher models and a biLSTM student model.
User: hoytta0
knowledge-distillation,Efficient computing methods developed by Huawei Noah's Ark Lab
Organization: huawei-noah
knowledge-distillation,Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Organization: huawei-noah
knowledge-distillation,"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
Organization: idea-research
knowledge-distillation,Official pytorch Code for CVPR2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
Organization: ilovepose
knowledge-distillation,Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
User: imirzadeh
knowledge-distillation,SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
Organization: intel
Home Page: https://intel.github.io/neural-compressor/
knowledge-distillation,Collection of AWESOME vision-language models for vision tasks
User: jingyi0000
knowledge-distillation,A large scale study of Knowledge Distillation.
User: karanchahal
knowledge-distillation,Official pytorch Implementation of Relational Knowledge Distillation, CVPR 2019
User: lenscloth
knowledge-distillation,knowledge distillation papers
User: lhyfst
knowledge-distillation,IFRNet: Intermediate Feature Refine Network for Efficient Frame Interpolation (CVPR 2022)
User: ltkong218
knowledge-distillation,A complete PyTorch image-classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction with SVM or random-forest classifiers, and model distillation.
User: lxztju
knowledge-distillation,The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
Organization: megvii-research
knowledge-distillation,This is a collection of our NAS and Vision Transformer work.
Organization: microsoft
knowledge-distillation,NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Organization: microsoft
knowledge-distillation,Collection of recent methods on (deep) neural network compression and acceleration.
User: mingsun-tse
knowledge-distillation,FasterAI: Prune and Distill your models with FastAI and PyTorch
User: nathanhubens
Home Page: https://nathanhubens.github.io/fasterai/
knowledge-distillation,OpenMMLab Model Compression Toolbox and Benchmark.
Organization: open-mmlab
Home Page: https://mmrazor.readthedocs.io/en/latest/
knowledge-distillation,A treasure chest for visual classification and recognition powered by PaddlePaddle
Organization: paddlepaddle
knowledge-distillation,[ICCV 2023] MI-GAN: A Simple Baseline for Image Inpainting on Mobile Devices
Organization: picsart-ai-research
knowledge-distillation,Training & evaluation library for text-based neural re-ranking and dense retrieval models built with PyTorch
User: sebastian-hofstaetter
Home Page: https://neural-ir-explorer.ec.tuwien.ac.at/
knowledge-distillation,Segmind Distilled diffusion
Organization: segmind
Home Page: https://discord.gg/p2MdJqZXnb
knowledge-distillation,A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Organization: sforaidl
Home Page: https://kd-lib.readthedocs.io/
knowledge-distillation,Knowledge distillation methods implemented with TensorFlow (11 (+1) methods so far; more will be added).
User: sseung0703
knowledge-distillation,An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
User: ssisyphustao
knowledge-distillation,[CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation
User: sunshangquan
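The logit-standardization entry above modifies the usual KD loss by z-scoring each logit vector before the temperature softmax, so teacher and student are compared at a shared scale. A rough sketch of that idea (the temperature, epsilon, and toy shapes are my assumptions; consult the repository for the paper's exact formulation):

```python
import torch
import torch.nn.functional as F

def standardize(logits, eps=1e-7):
    # Per-sample z-score: subtract the mean and divide by the standard
    # deviation of each logit vector.
    mean = logits.mean(dim=-1, keepdim=True)
    std = logits.std(dim=-1, keepdim=True)
    return (logits - mean) / (std + eps)

def std_kd_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence on the standardized, temperature-softened logits.
    s = F.log_softmax(standardize(student_logits) / T, dim=1)
    t = F.softmax(standardize(teacher_logits) / T, dim=1)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)

student = torch.randn(4, 100)
loss_zero = std_kd_loss(student, student)        # identical logits -> ~0
loss = std_kd_loss(student, torch.randn(4, 100))
```

Standardization makes the loss invariant to per-sample shifts and rescalings of the logits, which is the property the paper exploits.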
knowledge-distillation,Improving Convolutional Networks via Attention Transfer (ICLR 2017)
User: szagoruyko
Home Page: https://arxiv.org/abs/1612.03928
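Attention transfer, referenced above, distills spatial attention maps rather than logits: channel activations are collapsed into a normalized spatial map, and the student is penalized for deviating from the teacher's map. A minimal sketch under assumed feature shapes (the toy channel counts below are illustrative):

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    # Collapse the channels of a (B, C, H, W) activation into a spatial
    # attention map by summing squared activations, then L2-normalize
    # the flattened map per sample.
    a = feat.pow(2).sum(dim=1).flatten(1)  # (B, H*W)
    return F.normalize(a, dim=1)

def at_loss(student_feat, teacher_feat):
    # Mean squared distance between normalized attention maps.
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()

s = torch.randn(2, 64, 8, 8)   # student feature map
t = torch.randn(2, 256, 8, 8)  # teacher may have more channels; spatial sizes must match
loss = at_loss(s, t)
```

Because the channel dimension is summed out, teacher and student layers only need matching spatial resolutions, not matching widths.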
knowledge-distillation,This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
User: tebmer
knowledge-distillation,[ICLR 2023] "More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51 using Sparsity"; [ICML 2023] "Are Large Kernels Better Teachers than Transformers for ConvNets?"
Organization: vita-group
knowledge-distillation,2DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds (ECCV 2022) :fire:
User: yanx27
knowledge-distillation,A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
User: yoshitomo-matsubara
Home Page: https://yoshitomo-matsubara.net/torchdistill/
knowledge-distillation,Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
User: yuanli2333
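The entry above revisits KD through the lens of label smoothing regularization, which can itself be read as distillation from a fixed "virtual teacher" that puts most mass on the true class and spreads the rest uniformly. A hedged sketch of that equivalence (the smoothing factor and toy shapes are my assumptions; see the repository for the paper's exact teacher-free variants):

```python
import torch
import torch.nn.functional as F

def label_smoothing_ce(logits, labels, eps=0.1):
    # Label smoothing rewritten as cross-entropy against a fixed virtual
    # teacher: (1 - eps) on the true class, eps / (K - 1) on every other class.
    K = logits.size(1)
    log_probs = F.log_softmax(logits, dim=1)
    smooth = torch.full_like(log_probs, eps / (K - 1))
    smooth.scatter_(1, labels.unsqueeze(1), 1.0 - eps)
    return -(smooth * log_probs).sum(dim=1).mean()

logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = label_smoothing_ce(logits, labels)
```

Each row of the virtual-teacher distribution sums to 1, so this is an ordinary soft-target cross-entropy with no trained teacher at all.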
knowledge-distillation,Pytorch Code for CVPR2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
User: yuanyuanli85
knowledge-distillation,Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
User: yzd-v
knowledge-distillation,Masked Generative Distillation (ECCV 2022)
User: yzd-v
knowledge-distillation,(MLSys '21) An acceleration system for large-scale unsupervised heterogeneous outlier detection (anomaly detection)
User: yzhao062
Home Page: https://www.andrew.cmu.edu/user/yuezhao2/papers/20-preprint-suod.pdf