
Distance Weighted Sampling

This repo is a PyTorch implementation of the ICCV paper Sampling Matters in Deep Embedding Learning. The code is mainly based on the original MXNet version.

Usage

See train.sh and train.py

Optional arguments of train.py:

optional arguments:
  -h, --help            show this help message and exit
  --start-epoch N       manual epoch number (useful on restarts)
  --workers N           number of data loading workers (default: 4)
  --data-path DATA_PATH
                        path of data, which contains train,val subdirectory
  --embed-dim EMBED_DIM
                        dimensionality of image embedding. default is 128.
  --feat-dim FEAT_DIM   dimensionality of base_net output. default is 512.
  --classes CLASSES     number of classes in dataset
  --batch-num BATCH_NUM
                        number of batches in one epoch
  --batch-size BATCH_SIZE
                        total batch_size on all gpus.
  --batch-k BATCH_K     number of images per class in a batch. default is 5.
  --gpus GPUS           list of gpus to use, e.g. 0 or 0,2,5.
  --epochs EPOCHS       number of training epochs. default is 20.
  --lr LR               learning rate. default is 0.0001.
  --lr-beta LR_BETA     learning rate for the beta in margin based loss.
                        default is 0.1.
  --margin MARGIN       margin for the margin based loss. default is 0.2.
  --momentum MOMENTUM   momentum
  --beta BETA           initial value for beta. default is 1.2.
  --nu NU               regularization parameter for beta. default is 0.0.
  --factor FACTOR       learning rate schedule factor. default is 0.5.
  --steps STEPS         epochs to update learning rate. default is 20,40,60.
  --resume RESUME       path to checkpoint
  --wd WD               weight decay rate. default is 0.0001.
  --seed SEED           random seed to use
  --model {alexnet,densenet121,densenet161,densenet169,densenet201,inception_v3,resnet101,resnet152,resnet18,resnet34,resnet50,squeezenet1_0,squeezenet1_1,vgg11,vgg11_bn,vgg13,vgg13_bn,vgg16,vgg16_bn,vgg19,vgg19_bn}
                        type of model to use. see vision_model for options.
  --use-pretrained      enable using pretrained model from gluon.
  --print-freq PRINT_FREQ
                        number of batches to wait before logging.
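For reference, a typical invocation might look like the following (the data path, class count, and batch settings are illustrative placeholders, not values taken from train.sh):

    python train.py --data-path /path/to/dataset --model resnet50 --use-pretrained \
        --embed-dim 128 --batch-size 80 --batch-k 5 --classes 100 \
        --lr 0.0001 --lr-beta 0.1 --margin 0.2 --gpus 0,1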


Issues

Distance function produces NaNs

Hello. I am getting a weird effect where, an epoch or so into training, I get NaNs in the get_distance output (and thus NaNs for all the weights). I'm in the process of debugging, but I'm wondering if you ever encountered this in training? Thanks.

EDIT: Fixed. Not sure why, but occasionally the expression 2 - 2*sim takes small negative values, which produce NaN under the sqrt. Clamping those small negative values to 0 fixes the problem.
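A minimal sketch of the fix described above, assuming L2-normalized embeddings (the function and variable names here are illustrative, not the repo's exact code):

    import torch

    def get_distance(x, eps=1e-12):
        # Pairwise Euclidean distances for L2-normalized embeddings x of shape (n, d).
        # For unit vectors, ||a - b||^2 = 2 - 2 * <a, b>; floating-point error can make
        # this slightly negative, so clamp at 0 before the sqrt to avoid NaNs.
        sim = x @ x.t()                              # cosine similarities, shape (n, n)
        dist_sq = (2.0 - 2.0 * sim).clamp(min=0.0)   # mask out small negative values
        return (dist_sq + eps).sqrt()                # eps keeps the gradient finite at zero distance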

I have a question about Distance Weighted Sampling.

First of all, thank you very much for providing the PyTorch implementation.

I have a question about the expression used to compute log_weights in the Distance Weighted Sampling class.

First, in the paper the distribution of pairwise distances on the n-dimensional unit sphere is given as q(d) ∝ d^(n-2) * [1 - d^2/4]^((n-3)/2), and distance-weighted sampling draws pairs with probability proportional to 1/q(d), i.e. log(1/q(d)) = (2 - n)*log(d) - ((n - 3)/2)*log(1 - d^2/4).

The current code is as follows.
log_weights = ((2.0 - float(d)) * distance.log() - (float(d - 3) / 2) * torch.log(torch.clamp(1.0 - 0.25 * (distance * distance), min=1e-8)))

Is this code a correct implementation of the formula for computing the weights?
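For reference, here is a small self-contained sketch of how that log_weights expression corresponds to the formula above (the function and argument names are illustrative; in the repo's snippet, d is the embedding dimension n and distance is the pairwise distance matrix):

    import torch

    def inverse_density_log_weights(distance, embed_dim, eps=1e-8):
        # log(1/q(d)) for distances between points on the unit (embed_dim - 1)-sphere:
        #   q(d) ∝ d^(n-2) * (1 - d^2/4)^((n-3)/2)   with n = embed_dim, so
        #   log(1/q(d)) = (2 - n)*log(d) - ((n - 3)/2)*log(1 - d^2/4)
        n = float(embed_dim)
        return ((2.0 - n) * distance.log()
                - ((n - 3.0) / 2.0) * torch.log(torch.clamp(1.0 - 0.25 * distance * distance, min=eps)))

Up to the clamp, which only guards against log(0), this is the same expression as the snippet above, so the code appears to match the paper's formula up to the normalization constant that cancels when the weights are renormalized within the batch.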
