
Comments (5)

chengchingwen commented on June 11, 2024

No, it is not that the masks are broken; there is just no such thing as a "mask with all false". Attention with softmax uses a really small value to remove the masked entries, so in the case where everything is masked, every entry gets the same small score and everything is attended to uniformly. The two lines are the same because a randomly initialized Transformer usually attends uniformly anyway. If you really need that behavior, you can use the components in NeuralAttentionlib to create a new attention operator and call the Transformer constructor with that operator.
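A minimal sketch of that behavior in plain Julia (not the library's actual masking code; the -1e9 fill value is only an illustrative assumption): an "all false" mask drives every score to the same small value, so the softmax comes out uniform.

```julia
# Minimal sketch, not the library's implementation: masked-out entries are
# filled with a very negative value before softmax, so a mask that disallows
# everything makes all scores equal and the attention weights uniform.
softmax(x) = (e = exp.(x .- maximum(x)); e ./ sum(e))

scores = randn(4)                       # raw attention scores for one query
mask   = falses(4)                      # "mask with all false": nothing may be attended to
masked = ifelse.(mask, scores, -1e9)    # -1e9 is an assumed illustrative fill value

softmax(masked)                         # ≈ [0.25, 0.25, 0.25, 0.25], i.e. uniform attention
```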

anj1 commented on June 11, 2024

The mask with all zeros is just an illustrative example to show the issue with the mask application logic. You can try other kinds of masks and verify that the results are not as expected.

chengchingwen commented on June 11, 2024

There are tests that make sure the masks work as expected, and many models won't work if the CausalMask isn't applied. Another possible explanation for your observation is a uniform attention score caused by the random initialization; you can try changing the distribution of the weights and rerunning the model to see whether the values are still the same. If you are sure the mask is broken, please give an MWE with non-trivial masks (at least not a mask with all zeros, as explained above).

anj1 commented on June 11, 2024

There are no tests that CausalMask actually works correctly during attention, neither here nor in NeuralAttentionlib. NeuralAttentionlib does have tests that CausalMask returns the correct values, but that is completely separate from checking whether CausalMask works as intended during the transformer attention update.

Note that it is quite possible for the CausalMask to be failing to properly mask decoder tokens while the huggingface examples still work.
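As a hypothetical sketch of the kind of end-to-end check being discussed (plain Julia, not the Transformers.jl or NeuralAttentionlib API), one could compare the full attention output with and without a causal mask; if the mask is applied inside the attention update, the two outputs should differ:

```julia
# Hypothetical end-to-end check, not taken from the package's test suite:
# applying a causal mask inside the attention update should change the output.
softmax_cols(x) = (e = exp.(x .- maximum(x, dims=1)); e ./ sum(e, dims=1))

n, d = 4, 8
Q, K, V = randn(d, n), randn(d, n), randn(d, n)   # columns are token positions
scores  = K' * Q ./ sqrt(d)                       # scores[key, query]

causal  = [k <= q for k in 1:n, q in 1:n]         # query q may only see keys 1..q
plain   = V * softmax_cols(scores)                # unmasked attention output
masked  = V * softmax_cols(ifelse.(causal, scores, -1e9))

plain ≈ masked                                    # should be false if the mask matters
```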

chengchingwen commented on June 11, 2024

Note that it is quite possible for the CausalMask to be failing to properly mask decoder tokens while the huggingface examples still work.

Again, you'll need to provide an example to make this discussion concrete.

The supported models are tested against the huggingface transformers package with the validation code in the example folder, and we do observe that missing masks have a huge impact on the numerical outputs.
