
Comments (3)

ozansener commented on May 18, 2024

I think you have two questions:

  • How can \nabla_Z L^t be computed in a single backward pass? We assume an encoder-decoder structure, as explained in that section. Hence the path from Z to each L^t is disjoint across tasks. Since the paths are independent, a single backward pass through the decoders computes all of them.

  • Wouldn't the same apply to \nabla_{\theta^{sh}} L^t? No, because by the chain rule \nabla_{\theta^{sh}} L^t = \nabla_{\theta^{sh}} Z \cdot \nabla_Z L^t. You can compute all \nabla_Z L^t values in a single pass. You could also compute \nabla_{\theta^{sh}} Z explicitly in a single pass, but that computation would be much more costly than computing \nabla_{\theta^{sh}} L^t directly, since auto-differentiation works without ever forming \nabla_{\theta^{sh}} Z explicitly. This is hard to explain without getting into the details of AD, but you can work through the computation graph explicitly to see the difference. In other words, it is not a single-pass computation in the sense that a single pass of AD would not be able to compute it.
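The two-stage structure described above can be sketched in PyTorch. This is a minimal illustration, not the repository's actual code: the layer sizes, losses, and the uniform averaging (standing in for the MGDA min-norm weights) are all assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

encoder = nn.Linear(10, 8)                      # shared parameters theta^sh
decoders = {t: nn.Linear(8, 1) for t in "ab"}   # disjoint task-specific decoders

x = torch.randn(4, 10)
z = encoder(x)                                   # shared representation Z

# Cheap per-task backward: each call only traverses one small decoder,
# stopping at z, never touching the encoder.
grads_z = {}
for t, dec in decoders.items():
    loss_t = dec(z).pow(2).mean()                # illustrative task loss
    grads_z[t] = torch.autograd.grad(loss_t, z, retain_graph=True)[0]

# One full backward through the encoder with the combined gradient
# (here: a uniform average standing in for the min-norm MGDA solution).
combined = sum(grads_z.values()) / len(grads_z)
z.backward(combined)
```

After this, `encoder.weight.grad` holds the combined gradient on the shared parameters, while the expensive path through the encoder was traversed only once.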

from multiobjectiveoptimization.

ozansener commented on May 18, 2024

Yes, your understanding is correct. Also keep in mind that, under the encoder-decoder assumption, the parameters from the shared representation to the task losses are disjoint across tasks. Hence we compute the gradient over any parameter only once.


chanshing commented on May 18, 2024

Thank you for the clarification. After going through the code more carefully and reading your reply, I think I understand what you meant. The confusion was that by 'single pass' you meant a single pass over the shared parameters, which are typically orders of magnitude more numerous than the task-specific parameters. In your code you still call backward() for each task, but only backpropagate up to Z. So the saving comes from not having to backpropagate all the way to \theta^{sh} for each task, plus not having to forward through \theta^{sh} T times. Is my understanding correct?
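This equivalence can be checked numerically with a toy sketch. The layer sizes are illustrative assumptions, and a plain sum of the per-task gradients stands in for the min-norm weighted combination the method actually uses:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
enc = nn.Linear(6, 5)                          # shared theta^sh
dec1, dec2 = nn.Linear(5, 1), nn.Linear(5, 1)  # task-specific decoders
x = torch.randn(3, 6)

# Route A: backpropagate each task loss all the way through the shared encoder.
z = enc(x)
dec1(z).sum().backward(retain_graph=True)
dec2(z).sum().backward()
grad_full = enc.weight.grad.clone()
enc.zero_grad()

# Route B: backpropagate each task only up to z, then one pass through the encoder.
z = enc(x)
g1 = torch.autograd.grad(dec1(z).sum(), z, retain_graph=True)[0]
g2 = torch.autograd.grad(dec2(z).sum(), z, retain_graph=True)[0]
z.backward(g1 + g2)

# Both routes yield the same gradient on the shared parameters,
# but route B traverses the encoder's graph only once.
assert torch.allclose(grad_full, enc.weight.grad)
```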

from multiobjectiveoptimization.
