
Comments (7)

WenbinLee commented on July 29, 2024

Thanks.
No, DN4 does not need a pre-training stage; in this code it is simply trained from scratch.
By the way, since pre-training is a general trick, you can also combine our DN4 with a pre-training stage yourself.
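A minimal sketch of such a pre-training stage, assuming a Conv-64F-style embedding; `Conv64F`, `base_loader`, and `num_base_classes` are placeholder names, not part of this repository:

```python
import torch
import torch.nn as nn

# Hypothetical sketch only: pre-train the embedding network with a plain
# classification head on the base classes, then reuse the weights to
# initialize DN4 before the usual episodic training.
# `Conv64F`, `base_loader`, and `num_base_classes` are placeholders.
backbone = Conv64F()                                   # DN4-style 4-layer conv embedding
classifier = nn.Linear(64, num_base_classes)           # assumes 64 output channels
params = list(backbone.parameters()) + list(classifier.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)
criterion = nn.CrossEntropyLoss()

for images, labels in base_loader:                     # standard supervised batches
    feats = backbone(images)                           # (B, 64, H, W) local feature maps
    logits = classifier(feats.mean(dim=[2, 3]))        # global average pooling -> (B, 64)
    loss = criterion(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

torch.save(backbone.state_dict(), "pretrained_backbone.pth")
# Load this state dict into DN4's embedding module before episodic training.
```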

Hope this can help you!


WenbinLee commented on July 29, 2024

You are welcome.

As mentioned in our paper, one key reason is that we employ the much richer non-summarized local representations to represent both the query image and support class. This can be seen as a natural data augmentation, which can especially benefit the few-shot setting. On the other hand, the image-to-class measure can make full use of these local representations owing to the exchangeability of visual patterns.
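A minimal sketch of such an image-to-class measure over local descriptors (cosine similarity plus k nearest neighbours); the tensor shapes and function name here are assumptions rather than this repository's exact code:

```python
import torch
import torch.nn.functional as F

def image_to_class_similarity(query_feat, support_feats, k=3):
    """Sketch of an image-to-class measure over local descriptors.

    query_feat:    (C, H, W) feature map of one query image (assumed shape)
    support_feats: (S, C, H, W) feature maps of the S shots of one support class
    """
    c = query_feat.shape[0]
    # Treat every spatial position as a local descriptor and L2-normalize it.
    q = F.normalize(query_feat.reshape(c, -1).t(), dim=1)                         # (H*W, C)
    s = F.normalize(support_feats.permute(1, 0, 2, 3).reshape(c, -1).t(), dim=1)  # (S*H*W, C)
    # Cosine similarity between every query descriptor and the class's descriptor pool.
    sim = q @ s.t()                                                               # (H*W, S*H*W)
    # For each query descriptor, keep its k nearest neighbours in the pool and sum them.
    topk, _ = sim.topk(k, dim=1)
    return topk.sum()
```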

You can just run our code or use our latest implementation in ADM from https://github.com/WenbinLee/ADM.git.


WenbinLee commented on July 29, 2024

Yes, this is a normal situation. Because DN4 uses a Sum operation to aggregate all the local similarities for a query image, the similarity list tends to be flat. Fortunately, the subsequent Softmax operation makes the similarity list somewhat sharper.

Also, if you want to explicitly sharpen the similarity list, you can use a temperature or a mean/weighted-average operation.
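As a rough illustration, using the similarity list quoted elsewhere in this thread, the softmax is already fairly peaked despite the flat-looking raw numbers, and a temperature makes the sharpness explicit:

```python
import torch
import torch.nn.functional as F

# Similarity list quoted elsewhere in this thread: one query vs. 4 support classes.
sims = torch.tensor([3846.2258, 3845.3762, 3850.9907, 3846.0186])

# The raw numbers look flat, but softmax works on their differences (a few units),
# so the winning class already gets most of the probability mass.
print(F.softmax(sims, dim=0))          # roughly [0.008, 0.004, 0.981, 0.007]

# A temperature makes the sharpness explicit: T < 1 sharpens, T > 1 softens.
temperature = 0.5
print(F.softmax(sims / temperature, dim=0))
```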

Hope this can help you.


YiX98 commented on July 29, 2024

Thank you for your reply! May I ask why DN4 can achieve such good results without pre-training? Are there features of DN4 that make it possible to learn quickly from a small number of samples without fine-tuning? Or did you find in your experiments that satisfactory classification performance can be obtained without pre-training?


YiX98 commented on July 29, 2024

Thanks a lot for your answers! I've tried DN4 on my dataset and it achieves very promising results. Although DN4 performs very well on my dataset, I am wondering how I can further improve the performance. So far I have tried adding a Transformer block to adjust the feature maps returned by the feature extractor, but this block does not help the overall accuracy. I think maybe this specific Transformer block is ineffective. Would you please give me some suggestions on using a Transformer to enhance DN4's performance? Or could you share some recommended Transformer literature with potential for improving DN4?
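For concreteness, a minimal sketch of the kind of Transformer block described here, treating each spatial position of the feature map as a token; the class name, channel width, and hyperparameters are placeholders:

```python
import torch
import torch.nn as nn

class LocalDescriptorTransformer(nn.Module):
    """Hypothetical sketch: refine the backbone's local descriptors with a
    small Transformer encoder before the image-to-class measure."""

    def __init__(self, dim=64, num_heads=4, num_layers=1):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads,
                                           dim_feedforward=4 * dim,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, feat):                           # feat: (B, C, H, W)
        b, c, h, w = feat.shape
        tokens = feat.flatten(2).transpose(1, 2)       # (B, H*W, C) tokens
        tokens = self.encoder(tokens)                  # self-attention across positions
        return tokens.transpose(1, 2).reshape(b, c, h, w)
```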


WenbinLee commented on July 29, 2024

It's my pleasure. I am glad that DN4 works on your dataset!
Yes, a Transformer is a good choice. Unfortunately, I don't have many good suggestions or much experience in this area. I guess the key reason may be that Transformers are somewhat difficult to train. Some training tricks can be found in "Training data-efficient image transformers & distillation through attention". Also, you may need to make some further special designs.

Hope this can help you.


YiX98 commented on July 29, 2024

Thank you so much for the previous support!
I recently found that the similarity list contains very similar values across classes. For example, in a single episode with a single query image and randomly selected 5-shot support images (I have 4 classes in total), the similarity list is reported as:
[3846.2258, 3845.3762, 3850.9907, 3846.0186],
which should be the similarity values between this query and all 4 support classes. These similar values do not seem to discriminate the categories significantly. May I ask whether this is a common situation? I was expecting obvious differences in similarity, even though the testing results on my dataset are good.

