
nsga-net's People

Contributors

anonymone, ianwhale, mikelzc1990

nsga-net's Issues

Crossover/Mutation

The paper indicates that the crossover and mutation probabilities are 0.9 and 0.02, respectively. Pymoo, instead of using these probabilities, performs crossover and mutation on all offspring. Moreover, the crossover and mutation performed in the code do not correspond to what is described in the paper.

Thank you,
best regards.
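For reference, pymoo does accept these probabilities as operator arguments; the sketch below shows one way to pass them to NSGA-II using a recent pymoo release (module paths differ in pymoo 0.3.x, which this repository targets), and the binary operators chosen here are illustrative rather than the ones used in the paper or the code.

# Illustrative only: passing crossover/mutation probabilities to pymoo's NSGA-II.
# Module paths are for recent pymoo releases; pymoo 0.3.x organizes operators
# differently, and the exact meaning of `prob` varies between versions.
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.operators.crossover.pntx import TwoPointCrossover
from pymoo.operators.mutation.bitflip import BitflipMutation
from pymoo.operators.sampling.rnd import BinaryRandomSampling

algorithm = NSGA2(
    pop_size=40,
    sampling=BinaryRandomSampling(),
    crossover=TwoPointCrossover(prob=0.9),  # crossover applied with probability 0.9
    mutation=BitflipMutation(prob=0.02),    # mutation applied with probability 0.02
    eliminate_duplicates=True,
)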

Experiment dir

Why are so many experiment directories created during a single search run?

ModuleNotFoundError: No module named 'misc'

Hello,

I'm sure this is just user error, but I can't seem to find how to install this misc package. I am using python 3.8.10, Torch 1.13.1+cu116, and pymoo 0.3.0.

PS C:\Users\tstevahn\To_Laptop\Server\Server_Code> & C:/Users/tstevahn/AppData/Local/Programs/Python/Python38/python.exe c:/Users/tstevahn/To_Laptop/Server/Server_Code/Test_NAS/validation/test.py --net_type micro --arch NSGANet --init_channels 26 --filter_increment 4 --SE --auxiliary --model_path weights.pt
Traceback (most recent call last):
File "c:/Users/tstevahn/To_Laptop/Server/Server_Code/Test_NAS/validation/test.py", line 20, in
from misc import utils
ModuleNotFoundError: No module named 'misc'
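The misc package is part of the repository itself rather than something on PyPI, so the usual fix is to make the repository root importable, for example by running the script from the repo root with PYTHONPATH set, or with a small path tweak like the sketch below (an assumption about the layout: the script sits one directory below the repo root that contains misc/).

# Hypothetical workaround for the import error above: put the repository root
# (the directory containing misc/) on sys.path before the failing import.
# Assumes this file lives one level below the repo root, e.g. validation/test.py.
import os
import sys

REPO_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.insert(0, REPO_ROOT)

from misc import utils  # the import from the traceback, now resolvable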

error: unrecognized arguments: --task cifar100

Hello, I got an error when trying to run CIFAR-100: unrecognized arguments: --task cifar100. I checked the source code and found that there is no --task parameter. Do I need to modify the source code to train on CIFAR-100? Thanks.
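If the scripts really expose no such flag, one way to add dataset selection yourself is a small argparse change like the sketch below; the --task name and its choices are assumptions for illustration, not options that exist in the repository.

# Illustrative sketch: adding a --task flag to an argparse parser so the dataset
# can be chosen on the command line. Flag name, choices, and defaults are assumed.
import argparse

parser = argparse.ArgumentParser("nsga-net training (sketch)")
parser.add_argument("--task", type=str, default="cifar10",
                    choices=["cifar10", "cifar100"],
                    help="dataset to train or search on")
args, _ = parser.parse_known_args()

num_classes = 100 if args.task == "cifar100" else 10  # pass this to the model builder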

search slow

The paper said that
NSGA-Net + macro search space takes 8 GPU-days
NSGA-Net (6 @ 560) + cutout takes 4 GPU-days

But when I run the search code on a Tesla P100-PCIE-16GB, the results are different.
Macro takes 4 GPU-days.
Micro takes 41 GPU-days (30 minutes per network, so 0.5 hr * 40 * 50 / 24 hr ≈ 41.6 days).

These results are far from the numbers in the paper. Can you explain?
Thanks!
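For reference, the 41-day figure follows directly from the quoted per-network time:

# Back-of-the-envelope check of the micro-search estimate quoted above.
minutes_per_network = 30
networks_per_generation = 40
generations = 50

total_hours = minutes_per_network / 60 * networks_per_generation * generations
print(total_hours / 24)  # ~41.7 GPU-days on a single Tesla P100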

Question about Bayesian optimization stage

Hello,
From what I understand, the Bayesian optimization "exploitation" stage that is detailed in your paper does not exist in this repository. Am I correct? And if not, can you point it out to me?
Thank you very much,
Elad

Changed code for 1D data, questions

Hello,
I changed the macro search of your project in order to support 1-D data. This change mainly consisted of changing the kernel sizes and padding sizes of all operators in the file 'macro_decoder.py'.

For example, in all places where 'kernel_size=3' was written, I changed to 'kernel_size=(3,1)'.
Thus I was able to run the method on data of size (1125, 1) instead of (32, 32) as in CIFAR-10.
After this change I got poor results (worse than my naive NAS baseline) when running the macro search with the default command-line arguments (as written in the README).

Does this approach make sense to you? Do you think any additional changes are needed in order to support 1D data?
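For concreteness, here is a minimal sketch of the kind of change described above, treating a 1-D signal as an N x C x L x 1 tensor and shrinking the kernel and padding along the singleton axis; it illustrates the reported edit and is not code taken from macro_decoder.py.

# Sketch of the adaptation described above: keep nn.Conv2d but give it a (3, 1)
# kernel and (1, 0) padding so (1125, 1) inputs behave like 1-D sequences.
import torch
import torch.nn as nn

conv_2d = nn.Conv2d(16, 16, kernel_size=3, padding=1)                 # original CIFAR-style op
conv_1d_like = nn.Conv2d(16, 16, kernel_size=(3, 1), padding=(1, 0))  # 1-D variant

x = torch.randn(8, 16, 1125, 1)  # batch of 1-D signals reshaped to N x C x L x 1
assert conv_1d_like(x).shape == x.shape  # spatial size is preserved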

About Search Time

In the paper

We set the number of phases n_p to three and the number of nodes in each phase n_o to six. During architecture search, we limit the number of filters (channels) in any node to 16 for each of the generated network architectures. We then train them on our training set using the standard stochastic gradient descent (SGD) back-propagation algorithm and a cosine annealing learning rate schedule. Our initial learning rate is 0.025 and we train for 25 epochs, which takes about 9 minutes on an NVIDIA 1080Ti GPU implementation in PyTorch.

But in the code, the default value of the parameter --n_nodes (number of nodes per phase) is four.
I set the channels to 16 and n_nodes to six, but the search process is slow, so I would like to know the concrete configuration behind the 9-minute figure.
Also, I find that the search code runs slower on multiple GPUs than on a single GPU; can you explain this?
Thank you very much!
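For context, the training recipe quoted from the paper (SGD, initial learning rate 0.025, cosine annealing, 25 epochs) maps onto standard PyTorch components; the sketch below uses a placeholder model, and the momentum value is an assumption since the quote does not specify it.

# Sketch of the quoted search-phase schedule: SGD at lr 0.025 with cosine annealing
# over 25 epochs. The model and momentum are placeholders/assumptions.
import torch
import torch.nn as nn

model = nn.Linear(10, 10)  # stand-in for a candidate architecture
optimizer = torch.optim.SGD(model.parameters(), lr=0.025, momentum=0.9)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=25)

for epoch in range(25):
    # ... one pass over the search training split would go here ...
    optimizer.step()   # placeholder update for the sketch
    scheduler.step()   # anneal the learning rate once per epoch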

Ask for 'check duplicate'

Hi guys, I can't find where duplicate architectures are checked in the source code. Could you show me where it is? Thank you very much.
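One common pattern for this, shown purely as an illustrative sketch and not as the repository's implementation, is to hash each genome encoding and keep a set of hashes already evaluated:

# Illustrative sketch (not the repository's code): skip re-evaluating architectures
# whose genome encoding has already been seen, by hashing the encoding.
import hashlib

def genome_key(genome):
    # Stable key for a genome given as a nested list/tuple of ints.
    return hashlib.sha1(repr(genome).encode("utf-8")).hexdigest()

seen = set()

def is_duplicate(genome):
    key = genome_key(genome)
    if key in seen:
        return True
    seen.add(key)
    return False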

[ERROR REPORT] No module named 'config'

Hi, @ianwhale
Thanks for your great work on NAS and for open-sourcing the code.
Here is the error info 🐛 when I run nsga2_main.py:
ImportError: No module named 'config'
I tried to install a config module with
pip install config
but another problem occurred:
AttributeError: module 'config' has no attribute 'parser'

Did I download the wrong version of config, or is config a missing Python file in this repository?

pratibha

1. RuntimeError: CUDA out of memory.
2. Is it possible to perform macro and micro search simultaneously?

search

File "search/evolution_search.py", line 15, in
from search import nsganet as engine
File "/home/ai/Downloads/pratibha/nsga-net-master/search/nsganet.py", line 9, in
from pymoo.algorithms.genetic_algorithm import GeneticAlgorithm
ModuleNotFoundError: No module named 'pymoo.algorithms.genetic_algorithm'
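This import path exists in pymoo 0.3.x but was moved in later releases, so the error usually means a newer pymoo is installed than the one the code targets. The simplest fix is to install the pymoo version the repository was written against; alternatively, a compatibility import like the sketch below can work, where the second path is an assumption about newer pymoo layouts.

# Compatibility sketch: try the pymoo 0.3.x path used by nsganet.py, then a newer
# layout. The second path is an assumption about recent pymoo releases; pinning
# pymoo to the version this repository expects avoids the problem entirely.
try:
    from pymoo.algorithms.genetic_algorithm import GeneticAlgorithm  # pymoo 0.3.x
except ModuleNotFoundError:
    from pymoo.algorithms.base.genetic import GeneticAlgorithm       # newer pymoo (assumed path)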
