Comments (4)
Hi, please note that the Imagenette dataset changed its train-val split on Dec 6, 2019 (mentioned here), by which time we had already completed much of our experimentation with the older split. In the older split, the validation set was smaller, which gave better accuracy numbers. This is one possible reason for your lower accuracy, since you may have used the newer train-val split.
For our paper, we consistently used the older train-val split in all experiments, so our comparisons remain valid, though the numbers won't be comparable to results on the new split.
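For anyone reproducing this: a quick way to tell which Imagenette split you have is to count the images in each folder, since the pre-Dec-2019 archive has a much smaller validation set than the re-split release. Below is a minimal sketch (my own illustration, not code from the paper), assuming the standard archive layout with train/ and val/ subdirectories of .JPEG files:

```python
# Sanity check: count train and validation images to identify which
# Imagenette release you extracted. The path and file extension are
# placeholders based on the standard archive layout.
from pathlib import Path

root = Path("imagenette")  # placeholder: wherever the dataset was extracted
n_train = sum(1 for _ in (root / "train").rglob("*.JPEG"))
n_val = sum(1 for _ in (root / "val").rglob("*.JPEG"))
print(f"train images: {n_train}, val images: {n_val}")
```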
Thank you for your answer. I did use the old version of Imagenette, but the problem may come from my lack of experience: this is the first time I have worked with a relatively large dataset, so my hyperparameter settings for training were not good and the accuracy was extremely low. Would it be convenient for you to send me your method or the relevant code for training the teacher model on Imagenette? Thank you very much; I've been stuck at this point for a long time!
My email: [email protected]
We used this Jupyter notebook for training the teacher models. The linked one loads CIFAR10 at the start, but the rest of the training code was the same for all the datasets we used.
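For readers without access to the notebook, a teacher-training run in fastai looks roughly like the sketch below. This is my own minimal reconstruction, assuming the notebook follows common fastai conventions (written against the current fastai v2 API, whereas the original likely used fastai v1); the dataset path, architecture, epoch count, and learning rate are placeholders, not the paper's settings.

```python
# Minimal fastai teacher-training sketch (assumed conventions, not the
# authors' notebook). Works for any dataset laid out as train/ and val/
# folders of class subdirectories, such as Imagenette.
from fastai.vision.all import *

dls = ImageDataLoaders.from_folder(
    "imagenette",           # placeholder path to the extracted dataset
    train="train",
    valid="val",
    item_tfms=Resize(224),  # resize every image to 224x224
)

# Pretrained ResNet-34 backbone with a freshly initialized head.
learn = vision_learner(dls, resnet34, metrics=accuracy)

# One-cycle training; epoch count and learning rate are placeholders.
learn.fit_one_cycle(5, 3e-3)
learn.save("teacher")       # checkpoint the trained teacher weights
```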
Thank you very much. I'll give it a try.