pelu.resnet.torch's Issues

Data

Great study! I'm going to try to use this activation type. =)

Just a few questions:

  1. Does "global pixel-wise mean subtraction" mean global mean subtraction, or pixel-wise mean subtraction, i.e. computing a distinct mean for each pixel position over the training data (which should be awful from a theoretical conv-net perspective, but is one of the standard preprocessing steps in Caffe, I believe)? Edit: from your code, I understand that it's the first one, which is good =) (Both variants are sketched right after this list.)

  2. Have you tried applying this activation per channel (feature map)? I saw that you wrote you were "trying to prevent overfitting", but fitting the data is not a bad thing in itself; it's actually very good, and doing it via the parameters of the activation functions may even be "safer", in terms of learning the right thing, than making the network wider or deeper, and is likely more efficient in FLOPs than those measures. (A per-channel sketch follows the preprocessing one below.)
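
For concreteness, here is a minimal Torch sketch of the two preprocessing variants from question 1. trainData is a hypothetical N x C x H x W FloatTensor of training images, not a name taken from this repo:

    -- Global mean subtraction: one scalar mean over the whole training set.
    local globalMean = trainData:mean()
    trainData:add(-globalMean)

    -- Per-pixel mean subtraction: a distinct mean for every (channel, row,
    -- column) position, averaged over the sample dimension.
    local pixelMean = trainData:mean(1)               -- 1 x C x H x W
    trainData:add(-1, pixelMean:expandAs(trainData))  -- x := x - 1 * mean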
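
And a rough sketch of the per-channel variant from question 2, assuming the PELU form from the paper, f(x) = (a/b) * x for x >= 0 and a * (exp(x/b) - 1) for x < 0 with learned a, b > 0, but with one (a, b) pair per feature map rather than one per layer. The function and variable names are mine, not the repo's:

    -- Hypothetical per-channel PELU forward pass (not the repo's module).
    -- input: N x C x H x W; a, b: contiguous C-element tensors, kept positive.
    local function peluPerChannel(input, a, b)
      local C  = input:size(2)
      local aE = a:view(1, C, 1, 1):expandAs(input)
      local bE = b:view(1, C, 1, 1):expandAs(input)
      -- Positive part: (a/b) * max(x, 0); zero wherever x < 0.
      local pos = torch.cmul(torch.cmax(input, 0), torch.cdiv(aE, bE))
      -- Negative part: a * (exp(min(x, 0)/b) - 1); zero wherever x >= 0.
      local neg = torch.cmul(aE, torch.exp(torch.cdiv(torch.cmin(input, 0), bE)):add(-1))
      return pos:add(neg)
    end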

Cannot reproduce the results of ResNet-56 on CIFAR-10

Hi,

I tried to reproduce the ResNet-56 result on CIFAR-10, which the paper reports as a 5.65 error rate.

I ran the following script, but could not achieve the same result; my error rates are higher. Over five runs I got 5.93, 5.93, 5.79, 6.03, and 5.97, for an average of 5.93.

th main.lua -dataset cifar10 -nGPU 4 -batchSize 128 -nEpochs 200 -depth 56 -shortcutType A -weightDecay 0.001 -nThreads 8

Did I miss anything? Thank you for any help.

AMAZING

Results after the first epoch:

PELU 128-batch
test_acc : 58.77
loss : 1.2705653896699
train_acc : 53.553685897436

ReLU 128-batch
test_acc : 37.76
loss : 1.2564023118753
train_acc : 54.284855769231

I retested just to make sure, and got:

PELU 64-batch
test_acc : 55.72
loss : 1.3610394395573
train_acc : 50.140044814341

ReLU 128-batch
test_acc : 31.65
loss : 1.7290771948986
train_acc : 35.326522435897

This is quite amazing! I wonder, though, about the HUGE memory burden. Is it inherent, or is it just because this is an easygoing, for-testing implementation? Maybe the problem is that the activations are not applied "in-place"?
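
For reference, the stock activations in nn support an in-place mode in which the output overwrites the input buffer, and the fb.resnet.torch model code builds its networks with nn.ReLU(true). A hypothetical module skeleton for that pattern (not the repo's actual code) might look like:

    require 'nn'

    -- Hypothetical skeleton showing the in-place pattern of nn.ReLU(true):
    -- in in-place mode the output shares the input's storage, so no extra
    -- activation-sized buffer stays alive for the layer.
    local PELU, parent = torch.class('nn.SketchPELU', 'nn.Module')

    function PELU:__init(inplace)
      parent.__init(self)
      self.inplace = inplace or false
    end

    function PELU:updateOutput(input)
      if self.inplace then
        self.output:set(input)                    -- reuse the input's storage
      else
        self.output:resizeAs(input):copy(input)   -- separate output buffer
      end
      -- ... apply the PELU nonlinearity to self.output here ...
      -- Caveat: in-place only works when backward can be computed from the
      -- output alone (as with ReLU), since the pre-activation is overwritten.
      return self.output
    end

If the current implementation instead allocates fresh output (and gradInput) buffers for every activation layer, that alone could plausibly account for much of the extra memory you are seeing.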
