
Comments (4)

Grvzard commented on July 17, 2024

Actually, the metric values shown in the progress bar are mean values, i.e. the mean AUC over 32 steps.


fchollet commented on July 17, 2024

That metric in the progress bar is a training metric, computed on the training data. There are any number of reasons a training metric might turn out different from the metric you see in a call to evaluate() (e.g. the state of the model changes at each step, and training behavior might differ from inference behavior).
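
As a concrete illustration of the training-vs-inference point, here is a hypothetical toy model (not from this thread) where Dropout makes the two modes produce different outputs from the same weights:

```python
import numpy as np
import keras
from keras import layers

# Hypothetical toy model; Dropout is active in training mode only.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(8, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])

x = np.random.rand(32, 4).astype("float32")

# Same weights, different behavior: training=True applies dropout
# (as during fit()), training=False does not (as during evaluate()),
# so metrics computed in the two modes can differ.
y_train_mode = model(x, training=True)
y_infer_mode = model(x, training=False)
```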

If you want your displayed metrics to be equivalent to a call to evaluate() done at the end of the epoch, just pass validation_data=... in fit(). This will call evaluate() at the end of the epoch on the data you passed, and it will display the result in the progress bar under the name "val_auc".
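
A minimal sketch of that setup, assuming a binary classifier and hypothetical arrays x_train, y_train, x_val, y_val:

```python
import keras

# Hypothetical data: x_train, y_train, x_val, y_val are NumPy arrays.
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=[keras.metrics.AUC(name="auc")],
)

# Passing validation_data runs evaluate() on (x_val, y_val) at the end
# of each epoch and reports the result in the progress bar as "val_auc".
model.fit(
    x_train, y_train,
    epochs=5,
    validation_data=(x_val, y_val),
)
```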


markveilletteLL commented on July 17, 2024

> Actually, the metric values shown in the progress bar are mean values, i.e. the mean AUC over 32 steps.

But the mean of the per-batch AUC values ≠ the final AUC.
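
To illustrate the gap, a minimal sketch using keras.metrics.AUC on synthetic data (hypothetical example, not from the original report): averaging fresh per-batch AUC values generally does not equal the AUC accumulated over all the data.

```python
import numpy as np
import keras

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=320).astype("float32")
y_pred = rng.random(320).astype("float32")

# (a) Mean of per-batch AUC values: a fresh metric per 32-sample batch.
batch_aucs = []
for i in range(0, 320, 32):
    m = keras.metrics.AUC()
    m.update_state(y_true[i:i + 32], y_pred[i:i + 32])
    batch_aucs.append(float(m.result()))

# (b) Final AUC: one stateful metric accumulated over all the data.
m = keras.metrics.AUC()
m.update_state(y_true, y_pred)

# The two values generally differ.
print("mean of per-batch AUCs:", np.mean(batch_aucs))
print("final AUC over all data:", float(m.result()))
```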


markveilletteLL commented on July 17, 2024

> That metric in the progress bar is a training metric, computed on the training data. There are any number of reasons a training metric might turn out different from the metric you see in a call to evaluate() (e.g. the state of the model changes at each step, and training behavior might differ from inference behavior).

AFAIK, .evaluate() just takes one dataset, so it doesn't distinguish between train/test/val. I understand that the progress bar I see during .fit() reflects the changing state of the model; however, the model weights should be frozen during evaluate(), and the model should be in inference mode. So based on what you said, I don't see why the metrics in .evaluate()'s progress bar shouldn't converge to the final metric.

> Actually, the metric values shown in the progress bar are mean values, i.e. the mean AUC over 32 steps.

If this is true and the progress bar shows a running average of the metric computed over batches, then it is understandable that the answers differ. However, I still think it's misleading; a better choice would be to show the result of the metric after incorporating each batch (i.e. the progress bar should show metric.update_state(...).result()).
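
A minimal sketch of the proposed display, assuming hypothetical `model` and `batches` objects:

```python
import keras

# Hypothetical stand-ins: `model` is any classifier, `batches` any
# iterable of (x, y) pairs.
auc = keras.metrics.AUC()
for x_batch, y_batch in batches:
    auc.update_state(y_batch, model(x_batch, training=False))
    # result() reflects every batch seen so far, so the displayed value
    # converges to the final AUC by the last batch.
    print(f"AUC after this batch: {float(auc.result()):.4f}")
```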

