Comments (8)

naturomics commented on July 28, 2024

Even when K=2, our dynamic linear transformation is different from the affine coupling layer, as discussed in Section 3.1.

We also found that K=4 and K=6 for the inverse dynamic linear transformation are worse than K=2, so we didn't discuss them in our paper due to space constraints.

You can confirm it with the following tests if you're interested:
python main.py --results_dir results/cifar10_noCond_4parts --num_parts 4 --width 308 --decomposition 1
python main.py --results_dir results/cifar10_noCond_6parts --num_parts 6 --width 256 --decomposition 1

xc-7984 commented on July 28, 2024

So the best K is 2? When K=2, Glow uses h(x1) = x1, while yours is h(x1) = s1*x1 + u1. Can changing only this make the results better than Glow on the ImageNet dataset? I am confused about that.

naturomics commented on July 28, 2024

Yeah, it turns out our best results are obtained by changing y1 = x1 in the affine coupling layer to y1 = s1*x1 + u1 (like an actnorm layer). This is reasonable: in the affine coupling layer, half of the variables always pass through unchanged, which can introduce a bias.
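
To make the difference concrete, here is a minimal NumPy sketch of the two coupling variants for K=2 (not the repo's actual TensorFlow code; nn(), log_s1, and u1 are placeholder names, and actnorm-style learned parameters are assumed for the first half):

    import numpy as np

    def nn(x):
        # stand-in for the coupling network; returns (log_scale, shift) for the second half
        return np.tanh(x), 0.1 * np.tanh(x)

    def glow_coupling(x1, x2):
        # Glow's affine coupling: the first half passes through unchanged
        log_s2, u2 = nn(x1)
        return x1, np.exp(log_s2) * x2 + u2

    def dlf_coupling(x1, x2, log_s1, u1):
        # DLF (K=2): the first half also gets an actnorm-like affine map, y1 = s1*x1 + u1
        y1 = np.exp(log_s1) * x1 + u1
        log_s2, u2 = nn(x1)
        return y1, np.exp(log_s2) * x2 + u2

    def dlf_coupling_inverse(y1, y2, log_s1, u1):
        # still cheap to invert: recover x1 first, then re-run the coupling network
        x1 = (y1 - u1) * np.exp(-log_s1)
        log_s2, u2 = nn(x1)
        x2 = (y2 - u2) * np.exp(-log_s2)
        return x1, x2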

xc-7984 commented on July 28, 2024

So if I replace the dynamic linear transform with an affine coupling layer and an actnorm layer, the result should be better. Glow consists of an affine coupling layer and an actnorm layer at each step. I still don't understand why your model is better than Glow on the ImageNet dataset.

lukemelas commented on July 28, 2024

Hello, I just wanted to follow up on this.

I feel as if I'm missing something important here. When K=2, is your model exactly the same as Glow, except for the fact that in the affine coupling layer, you have h(x_1) = s_1*x_1+u_1 instead of h(x_1)=x_1 in Glow?

naturomics commented on July 28, 2024

@lukemelas The changes in our best case (K=2) compared to Glow can be summarized in three points:

  1. in the affine coupling layer, we choose h(x_1) = s_1*x_1 + u_1 instead of h(x_1) = x_1. We found that any simple invertible h() can improve the model very significantly; there are more choices, such as an invertible 1x1 conv or an invertible activation function (we didn't discuss these because we had not yet tested them at the time of publishing our DLF paper; we will discuss them together with other important contributions in our next publication).
  2. we removed the actnorm layer between the 1x1 conv layer and the coupling layer, as it is a special case of the dynamic linear transformation without data-parameterization (see the sketch after this list).
  3. we slightly changed the NN structure for training stability (and some optimization details such as the learning rate).
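
A hypothetical sketch of point 2, with placeholder names (actnorm, param_net), showing why actnorm can be read as the data-independent special case of the dynamic linear transformation, where the scale and shift are plain learned parameters instead of being predicted from another part of the input:

    import numpy as np

    def actnorm(x, log_s, b):
        # actnorm: affine map y = s*x + b with s, b as fixed learned parameters
        return np.exp(log_s) * x + b

    def dynamic_linear_part(x_prev, x_k, param_net):
        # dynamic linear transformation: the scale and shift for part k are
        # data-parameterized, i.e. predicted from the previously seen parts
        log_s, u = param_net(x_prev)
        return np.exp(log_s) * x_k + u

If param_net returns constants independent of x_prev, the second function reduces to the first, which is why keeping a separate actnorm layer next to the dynamic linear transformation would be redundant.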

I think our other novel contributions are also important:

  1. conceptually, we connected the affine coupling layer and the AR/IAR transformations, as they are the two extreme forms of the dynamic linear transformation.
  2. conditional DLF allows us to control the mapping between the latent and the observation space. I can see some excellent applications utilizing this property.

lukemelas commented on July 28, 2024

Thanks for the quick and thorough response!

yuffon commented on July 28, 2024

Your response also helps me.
