Comments (7)
Thanks.
No, DN4 does not need a pre-training stage; in this code it is trained from scratch.
By the way, since pre-training is a general trick, you can also combine our DN4 with pre-training yourself.
Hope this can help you!
You are welcome.
As mentioned in our paper, one key reason is that we employ the much richer, non-summarized local representations to represent both the query image and each support class. This can be seen as a natural data augmentation, which especially benefits the few-shot setting. On the other hand, the image-to-class measure can make full use of these local representations owing to the exchangeability of visual patterns.
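For intuition, here is a minimal sketch of the image-to-class measure over local descriptors (cosine similarity plus a top-k neighbor search, as in the paper; the variable names and shapes are illustrative, not the exact repository code):

```python
import torch
import torch.nn.functional as F

def image_to_class_similarity(query_feat, support_feats, k=3):
    """Sketch of an image-to-class measure over local descriptors.

    query_feat:    (C, H, W) feature map of one query image.
    support_feats: (N, C, H, W) feature maps of the N support images
                   of ONE class (e.g. N=5 in a 5-shot task).
    Returns a scalar similarity between the query and this class.
    """
    C = query_feat.shape[0]
    # Flatten to local descriptors: each spatial position is one descriptor.
    q = query_feat.reshape(C, -1).t()                          # (H*W, C)
    s = support_feats.permute(1, 0, 2, 3).reshape(C, -1).t()   # (N*H*W, C)

    # Cosine similarity between every query descriptor and every
    # support descriptor of the class.
    q = F.normalize(q, dim=1)
    s = F.normalize(s, dim=1)
    sim = q @ s.t()                                            # (H*W, N*H*W)

    # For each query descriptor, keep its k nearest neighbors in the
    # class, then sum all kept similarities into one class score.
    topk, _ = sim.topk(k, dim=1)
    return topk.sum()
```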
You can just run our code or use our latest implementation in ADM from https://github.com/WenbinLee/ADM.git.
Yes, it's a normal situation. Because DN4 uses a Sum operation to aggregate all the local similarities for a query image, the similarity list will be flat. Fortunately, the following Softmax operation will make the similarity list somewhat sharper.
Also, if you want to explicitly make the similarity list sharper, you may use a temperature, or a mean/weighted-average aggregation instead of the Sum.
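For example, a quick sketch using a flat similarity list like the one you reported (the temperature value is only illustrative):

```python
import torch

# A flat-looking similarity list for one query over 4 support classes.
sims = torch.tensor([3846.2258, 3845.3762, 3850.9907, 3846.0186])

# Softmax depends only on the differences between entries, so even this
# flat list yields a peaked distribution (~0.98 on the third class).
probs = torch.softmax(sims, dim=0)

# A temperature T < 1 sharpens the distribution further.
T = 0.5  # illustrative value; tune on a validation split
probs_sharp = torch.softmax(sims / T, dim=0)
print(probs, probs_sharp)
```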
Hope this can help you.
Thank you for your reply! Can I ask why DN4 can achieve such a great result without pre-training? Are there features of DN4 that make it possible to learn quickly from a small number of samples without fine-tuning? Or did you find in your experiments that satisfactory classification performance can be obtained without pre-training?
Thanks a lot for your answers! I've tried DN4 on my dataset and it achieves very promising results. Although DN4 performs very well on my dataset, I am wondering how I can further improve the performance. So far I have tried adding a Transformer block to adjust the feature maps returned by the feature extractor, but this Transformer block doesn't help the overall accuracy. I think maybe this specific Transformer block is ineffective. Would you please give me some suggestions on using a Transformer to enhance the performance of DN4? Or can you share some recommended Transformer literature with potential for enhancing DN4?
It's my pleasure. I am glad that DN4 works on your dataset!
Yes, a Transformer is a good choice. Unfortunately, I don't have many good suggestions or much experience with this part. I guess the key reason may be that Transformers are somewhat difficult to train. Some training tricks can be found in "Training data-efficient image transformers & distillation through attention". Also, you may need to make some special designs.
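For reference, here is one hypothetical way to wire such a block: treat each spatial position of the CNN feature map as a token and refine the map with a single Transformer encoder layer before the image-to-class measure. This is only a sketch under assumed sizes (a 64-channel backbone), not a tested recipe:

```python
import torch
import torch.nn as nn

class FeatureMapTransformer(nn.Module):
    """Hypothetical refinement block for DN4-style local descriptors."""
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.encoder = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=dim * 2,
            batch_first=True)

    def forward(self, x):                         # x: (B, C, H, W)
        B, C, H, W = x.shape
        tokens = x.flatten(2).transpose(1, 2)     # (B, H*W, C): one token per position
        tokens = self.encoder(tokens)             # self-attention across positions
        return tokens.transpose(1, 2).reshape(B, C, H, W)
```

As a rough guideline, the DeiT-style training tricks mentioned above (strong augmentation, warmup, careful learning rates) may matter more than the block design itself.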
Hope this can help you.
Thank you so much for the previous support!
I recently found that the output similarity list shows very similar values across classes. For example, in a single episode, with a single query image and randomly selected 5-shot support images (I have 4 classes in total), the similarity list is reported as follows:
[3846.2258, 3845.3762, 3850.9907, 3846.0186],
which should be the similarity values between this query and all 4 support classes. These values don't seem to discriminate the categories significantly. May I ask whether this is a common situation? I was expecting obvious differences in similarity, although the testing result is good on my dataset.
Related Issues (20)
- a question about ablation study HOT 6
- The results are different from the paper on miniImagenet dataset HOT 28
- About BN parameters HOT 3
- why you used --ravi HOT 3
- Some questions about resnet HOT 6
- what about multiGPU training? HOT 5
- The result of benchmark models on CUB dataset HOT 3
- Testing in a single image
- About preprocessing on Stanford_Cars HOT 2
- Question on paper figure HOT 2
- BN parameters followup question HOT 1
- Inference on one image only HOT 1
- Code error HOT 1
- Accuracy on fine-grained datasets HOT 3
- CUB HOT 3
- Visualization
- Can batch_size only be set to 1? HOT 1
- about ablation study HOT 1
- Experiment results HOT 2