Comments (4)

jakc4103 commented on July 18, 2024

> (b) After adding the result of the third convolution to the result from the other branch, the final results will be passed to ReLU6 and then quantized after ReLU6

I think there is no ReLU6 appended after the elementwise addition in MobileNetV2, so quantization might be lost there, as @dreambear1234 said.
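
For reference, the MobileNetV2 inverted residual block ends in a linear 1x1 projection, so the skip connection's add has no activation after it. A simplified sketch (stride and channel-change handling omitted):

```python
import torch.nn as nn

class InvertedResidual(nn.Module):
    """Simplified MobileNetV2 block (expansion factor 6, stride 1)."""
    def __init__(self, ch):
        super().__init__()
        hidden = ch * 6
        self.conv = nn.Sequential(
            nn.Conv2d(ch, hidden, 1, bias=False), nn.BatchNorm2d(hidden), nn.ReLU6(),
            nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden), nn.ReLU6(),
            # linear bottleneck: 1x1 projection with BN but no ReLU6
            nn.Conv2d(hidden, ch, 1, bias=False), nn.BatchNorm2d(ch),
        )

    def forward(self, x):
        # the elementwise add is the last op in the block; nothing
        # re-quantizes its output before the next layer consumes it
        return x + self.conv(x)
```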

yaohuicai commented on July 18, 2024

Hi,

Thanks for your interest in our work and for the detailed questions. Note that we are not currently producing a fully quantized model, since a fully quantized model must match specific hardware details that vary significantly across CPUs, GPUs, and FPGAs. As a general quantization method, we believe model size drives the memory bottleneck and multiplications dominate the computation, so we compress all weights in the network and perform all multiplications in low precision. Please see the detailed answers below:

  1. Typically the models we studied do not use bias. Leaving the bias unquantized does not introduce floating-point multiplications, because there are ReLUs after the convolutions and the bias is merged into the activations.
  2. (a) The input images range from 0 to 255, so they are already quantized versions and there is no need to quantize them again. (b) After the result of the third convolution is added to the result from the other branch, the sum is passed through ReLU6 and quantized after it; this therefore incurs only a floating-point addition. (c) Same reason as (b).
  3. Here we did not fuse batch norm into the convolutional layer, because batch normalization is essentially a linear transformation: it can be folded into the scaling factor of the quantization operation without hurting accuracy (see the sketch after this list).
  4. We observed the same phenomenon, which we believe is normal.
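
For concreteness, a minimal sketch of what point 3 means; the helper name and per-channel layout are illustrative, not the exact code in the repo:

```python
import torch

def absorb_bn_into_scale(w_scale, bn):
    """Hypothetical helper: BN computes y = a * x + b per output channel,
    with a = gamma / sqrt(var + eps) and b = beta - a * mean, so the
    multiplier a can be absorbed into the floating-point dequantization
    scale while the integer weights stay untouched."""
    a = bn.weight / torch.sqrt(bn.running_var + bn.eps)
    b = bn.bias - a * bn.running_mean
    return w_scale * a, b  # new per-channel scale, plus a float shift term
```

Since quantized weights are dequantized as w_q * scale, multiplying the scale by a is mathematically identical to applying BN's per-channel scaling after the convolution, which is why accuracy is unaffected.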

liming312 commented on July 18, 2024

Any more comments?

Amin-Azar commented on July 18, 2024

Thanks for the nice paper. I assume this is not the complete code for the paper? All of the points @dreambear1234 raised are still valid and a concern.

@yaohuicai, how are you performing multiplications in low precision? Based on this code ( https://github.com/amirgholami/ZeroQ/blob/master/classification/utils/quantize_model.py#L45 ), only activations after ReLU are quantized, so quantization is lost on the Conv and FC layers (roughly the pattern sketched below). I think your answer to Q3 is simply not correct: we have tried it, and fusing BN affects the accuracy of quantized models considerably.
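
To illustrate the pattern (hypothetical names, not the actual ZeroQ classes): if activation quantizers are attached only behind ReLU/ReLU6 modules, any tensor that reaches a layer without passing through one first, such as MobileNetV2's residual sums, stays in floating point.

```python
import torch.nn as nn

def wrap_relus(model, act_quant_cls):
    # Recursively replace every ReLU/ReLU6 with (ReLU -> activation
    # quantizer); nothing else in the graph gets an activation quantizer,
    # so tensors that bypass the ReLUs are never re-quantized.
    for name, child in model.named_children():
        if isinstance(child, (nn.ReLU, nn.ReLU6)):
            setattr(model, name, nn.Sequential(child, act_quant_cls()))
        else:
            wrap_relus(child, act_quant_cls)
    return model
```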
Please update us on whether updated code is available; otherwise, I believe this code won't give a fair comparison against other methods. We are planning to move only the distillation code to the distiller framework, add proper quantization of activations, do batch norm folding (the standard fold is sketched below), etc.
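
For reference, the standard batch norm fold that would be applied before weight quantization; a sketch under the usual inference-time assumptions, not the distiller API:

```python
import torch

def fold_bn_into_conv(conv, bn):
    """Rewrite conv in place so that conv'(x) == bn(conv(x)); the folded
    weights are what should then be quantized."""
    a = bn.weight / torch.sqrt(bn.running_var + bn.eps)
    conv.weight.data *= a.reshape(-1, 1, 1, 1)
    old_bias = conv.bias.data if conv.bias is not None else torch.zeros_like(bn.running_mean)
    conv.bias = torch.nn.Parameter(bn.bias + a * (old_bias - bn.running_mean))
    return conv
```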
