
Comments (6)

danijar commented on May 19, 2024

Gradients can go above 65504 quite easily when predicting large images. I already switched to my own implementation of loss scaling that allows values below 1 and that works great. I just posted the issue here in case the Keras team wanted to fix it for others.

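For context, here is a minimal sketch of what dynamic loss scaling without the lower bound of 1 could look like, assuming eager execution; the class `UnboundedLossScale` and its attribute names are hypothetical and this is not the implementation referred to above:

```python
import tensorflow as tf

# Hypothetical sketch: dynamic loss scaling whose scale may drop below 1.
class UnboundedLossScale:
    def __init__(self, initial_scale=2.0 ** 15, growth_interval=2000, factor=2.0):
        self.scale = tf.Variable(initial_scale, dtype=tf.float32)
        self.good_steps = tf.Variable(0, dtype=tf.int64)
        self.growth_interval = growth_interval
        self.factor = factor

    def scale_loss(self, loss):
        return loss * tf.cast(self.scale, loss.dtype)

    def unscale(self, grads):
        return [g / tf.cast(self.scale, g.dtype) for g in grads]

    def update(self, grads):
        # `grads` are the unscaled gradients; check them for Infs/NaNs.
        finite = all(bool(tf.reduce_all(tf.math.is_finite(g))) for g in grads)
        if finite:
            self.good_steps.assign_add(1)
            if int(self.good_steps) >= self.growth_interval:
                self.scale.assign(self.scale * self.factor)
                self.good_steps.assign(0)
        else:
            # Unlike the built-in behavior, do not clamp at 1: the scale can
            # keep halving when gradients exceed the float16 max of 65504.
            self.scale.assign(self.scale / self.factor)
            self.good_steps.assign(0)
        return finite  # Caller should skip the weight update when False.
```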

jvishnuvardhan commented on May 19, 2024

@danijar Can you please share a simple standalone code snippet to reproduce the issue? Thanks!


danijar commented on May 19, 2024

I don't have a simple reproducer, sorry. But it should be easy to see that 1 is an arbitrary lower bound that breaks down for models with large gradients (e.g. many skip connections and an image reconstruction loss that is a sum over thousands of pixels).

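To make the magnitude concrete, here is an illustrative back-of-the-envelope check (the per-pixel gradient of 10 is an assumption chosen only to show the effect):

```python
import numpy as np

# Illustration only: a reconstruction loss summed over a 64x64x3 image, where
# each pixel contributes a gradient of roughly 10 to a shared parameter,
# already exceeds the float16 maximum of 65504.
per_pixel_grad = 10.0
num_pixels = 64 * 64 * 3            # 12288
total_grad = per_pixel_grad * num_pixels
print(total_grad)                   # 122880.0
print(np.float16(total_grad))       # inf -- overflows even at a loss scale of 1
```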

mattdangerw commented on May 19, 2024

@reedwm could you take a look here? Thanks!


reedwm commented on May 19, 2024

This is similar to tensorflow/tensorflow#38357, except that issue involved an older version of the API. The same issue still remains though: the loss scale cannot go below 1.

As mentioned in the other issue, if gradients are still nonfinite when the loss scale reaches 1, every step will be skipped and training will not progress. This behavior is not great, but I cannot think of a good solution here. Ideally, we would raise an error, but that could cause a performance penalty by transferring the loss scale to the CPU, and there isn't a clear place in Keras where we would check whether the loss scale has reached 1.

We could also allow the loss scale to go below 1. However, that would mean LossScaleOptimizer no longer fulfills its sole purpose of preventing underflow, since a loss scale below 1 increases the chance of underflow. Allowing it would still be worth it if it helped mixed precision training, but I suspect it would not. For gradients to overflow with a loss scale of 1, a gradient value would have to exceed 65504, which is a very large value. I don't know of any models with gradients that large, so I suspect that when the loss scale reaches 1, the NaNs or Infs are being generated by something other than gradient overflow.

@danijar, there is a workaround in tensorflow/tensorflow#38357 (comment). However, I suspect that allowing the loss scale to go below 1 will not fix the model. You can alternatively try setting some layers to float32 by passing dtype="float32" and seeing if that helps: set all or most layers to float32, then start switching layers back to mixed precision while checking whether the loss scale reaches 1 (see the sketch below).

/CC @nluehr

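A minimal sketch of the debugging approach described in the comment above, assuming TF 2.4+ and placeholder layer sizes: keep the suspect layers in float32 via dtype="float32" while the rest of the model runs under the mixed_float16 policy, then switch layers back one at a time and watch whether the loss scale collapses toward 1.

```python
import tensorflow as tf
from tensorflow.keras import layers, mixed_precision

mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(256,)),
    layers.Dense(1024, activation="relu"),   # float16 compute, float32 variables
    layers.Dense(1024, activation="relu"),
    # Force the reconstruction head to float32 while debugging.
    layers.Dense(64 * 64 * 3, dtype="float32"),
])

optimizer = mixed_precision.LossScaleOptimizer(tf.keras.optimizers.Adam())
model.compile(optimizer=optimizer, loss="mse")

# The current dynamic loss scale can be monitored during training, e.g. from a
# callback, to see whether it is being driven down toward 1.
print(float(optimizer.loss_scale))
```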

thijs-vanweezel commented on May 19, 2024

@reedwm and others, it seems like this issue has not been resolved yet. Could we add a boolean parameter prevent_overflow to LossScaleOptimizer that disables the lower bound when set to True?

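Until such a parameter exists, one possible stopgap, shown here as a sketch assuming a custom training step (this is neither the workaround linked earlier nor the proposed API), is to pre-divide the loss by a fixed constant so that the effective total scale can fall below 1 even though LossScaleOptimizer itself never goes under 1:

```python
import tensorflow as tf
from tensorflow.keras import mixed_precision

mixed_precision.set_global_policy("mixed_float16")

PRE_SCALE = 1.0 / 128.0  # assumption: constant chosen to tame very large gradients

optimizer = mixed_precision.LossScaleOptimizer(tf.keras.optimizers.Adam())

@tf.function
def train_step(model, loss_fn, x, y):
    with tf.GradientTape() as tape:
        # Shrink the loss before the dynamic scale is applied, so the backward
        # pass stays below the float16 maximum of 65504.
        loss = loss_fn(y, model(x, training=True)) * PRE_SCALE
        scaled_loss = optimizer.get_scaled_loss(loss)
    scaled_grads = tape.gradient(scaled_loss, model.trainable_variables)
    grads = optimizer.get_unscaled_gradients(scaled_grads)
    # Undo the pre-scaling so the parameter update magnitude is unchanged.
    grads = [g / PRE_SCALE for g in grads]
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss / PRE_SCALE
```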
