The PyTorch implementation of variants of loss for text classification & text matching.
- Distilling the Knowledge in a Neural Network (NIPS 2014 Deep Learning Workshop) [paper] - Soft Target & Soft Softmax Loss
- FaceNet: A Unified Embedding for Face Recognition and Clustering (CVPR 2015) [paper] - Triplet Loss
- Applying Deep Learning to Answer Selection: A Study and An Open Task (ASRU 2015) [paper]
- Holistically-Nested Edge Detection (ICCV 2015) [paper] - Weighted Softmax Loss
- V-Net: Fully Convolutional Neural Networks for Volumetric Medical Image Segmentation (3DV 2016) [paper] - Dice Loss
- UnitBox: An Advanced Object Detection Network (ACM Multimedia 2016) [paper] - IoU Loss
- Rethinking the Inception Architecture for Computer Vision (CVPR 2016) [paper] - Label Smoothing
- A Discriminative Feature Learning Approach for Deep Face Recognition (ECCV 2016) [paper] - Center Loss
- Large-Margin Softmax Loss for Convolutional Neural Networks (ICML 2016) [paper] - L-softmax Loss
- SphereFace: Deep Hypersphere Embedding for Face Recognition (CVPR 2017) [paper] - A-softmax Loss
- Focal Loss for Dense Object Detection (ICCV 2017) [paper] - Focal Loss
- The Lovász-Softmax Loss: A Tractable Surrogate for The Optimization of The Intersection-over-Union Measure in Neural Networks (CVPR 2018) [paper] [code] - Lovász-Softmax Loss
- Island Loss for Learning Discriminative Features in Facial Expression Recognition (FG 2018) [paper] - Island Loss
- Feature Incay for Representation Regularization (ICLR 2018 Workshop) [paper]
- Additive Margin Softmax for Face Verification (ICLR 2018 Workshop) [paper] [code] - AM-softmax Loss
- AnatomyNet: Deep Learning for Fast and Fully Automated Whole-volume Segmentation of Head and Neck Anatomy (Medical Physics 2018) [paper] - Exponential Logarithmic Loss (Focal Loss + Dice Loss)
- ArcFace: Additive Angular Margin Loss for Deep Face Recognition (CVPR 2019) [paper] - AA-softmax Loss
- Mixtape: Breaking the Softmax Bottleneck Efficiently (NeurIPS 2019) [paper] - Mixtape
- When Does Label Smoothing Help? (NeurIPS 2019) [paper]
- Complement Objective Training (ICLR 2019) [paper] - COT
- Dice Loss for Data-imbalanced NLP Tasks (CoRR 2019) [paper] - Dice Loss
- Tversky Loss Function for Image Segmentation Using 3D Fully Convolutional Deep Networks (MLMI@MICCAI 2017) [paper] - Tversky Loss
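Two of the simplest variants listed above, label smoothing and focal loss, can be sketched in a few lines of PyTorch. This is a minimal illustration, not code from this repo; the function names are ours, and the defaults `gamma=2.0, alpha=0.25` follow the Focal Loss paper:

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, alpha=0.25):
    # Per-sample cross entropy; pt is the predicted probability of the true class.
    ce = F.cross_entropy(logits, targets, reduction="none")
    pt = torch.exp(-ce)
    # Down-weight easy examples by (1 - pt)^gamma.
    return (alpha * (1.0 - pt) ** gamma * ce).mean()

def label_smoothing_ce(logits, targets, eps=0.1):
    # Mix the one-hot target with a uniform distribution over all classes.
    logp = F.log_softmax(logits, dim=-1)
    nll = -logp.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    smooth = -logp.mean(dim=-1)
    return ((1.0 - eps) * nll + eps * smooth).mean()
```

With `gamma=0, alpha=1` the focal loss reduces to plain cross entropy, and with `eps=0` the smoothed loss does as well, which is a quick sanity check for either implementation.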
- How does label smoothing push the correct class and the wrong classes apart in logit space?
- How should the soft target trick be understood? - 知乎
- Which "hacked" loss functions have ever saved your deep learning model? - 知乎
- Losses for image segmentation: handling extremely imbalanced data
- [Loss Function Collection] A detailed roundup of losses for semantic segmentation - 言有三
- [Technical Survey] Softmax loss and its variants in one article - 言有三
- Understanding Softmax: the softmax loss from an optimization perspective - 王峰
- Understanding Softmax: binary vs. multi-class classification - 王峰
- Understanding Softmax: controlling the degree of smoothness - 王峰
- Understanding Softmax: margin - 王峰
- Understanding Softmax: setting the margin automatically - 王峰
- Understanding Softmax: the overlooked Focal Loss variant - 王峰
- Text sentiment classification (Part 4): a better loss function - 科学空间
- From hard truncation and softening of the loss to focal loss - 科学空间
- Defining complex custom loss functions in Keras - 科学空间
- A sentence similarity model based on GRU and AM-softmax - 科学空间
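The AM-softmax loss mentioned in the last reference, which is also listed among the papers above, fits text matching well because it operates on normalized embeddings. A minimal PyTorch sketch follows; the class name, parameter names, and defaults `s=30.0, m=0.35` are illustrative (the defaults follow the AM-softmax paper), not taken from this repo:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AMSoftmaxLoss(nn.Module):
    """Additive Margin Softmax: apply cross entropy to s * (cos(theta) - m)
    on the target class, where theta is the angle between the L2-normalized
    feature and the L2-normalized class weight vector."""

    def __init__(self, feat_dim, num_classes, s=30.0, m=0.35):
        super().__init__()
        self.W = nn.Parameter(torch.randn(feat_dim, num_classes))
        self.s, self.m = s, m

    def forward(self, feats, labels):
        # Cosine similarity between normalized features and class weights.
        cos = F.normalize(feats, dim=1) @ F.normalize(self.W, dim=0)
        # Subtract the margin m from the target-class cosine only.
        margin = torch.zeros_like(cos).scatter_(1, labels.unsqueeze(1), self.m)
        return F.cross_entropy(self.s * (cos - margin), labels)
```

Setting `m=0` recovers a plain normalized-softmax cross entropy; any `m > 0` lowers the target logit and therefore strictly increases the loss for fixed weights, which is the margin effect the paper exploits.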