
dance's People

Contributors

ksaito-ut


dance's Issues

Conda environment failure | Request for yaml file

Hey! I was trying to run this code on my machine, but I am not able to create the conda environment due to package errors.
Below is the error I get when I try to create the environment. Could you provide a yaml file for this environment, or give me some tips to resolve this error?

Thank you,

```

conda create --name dance --file requirements.txt
Collecting package metadata (current_repodata.json): done
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed

PackagesNotFoundError: The following packages are not available from current channels:

  • simplejson==3.17.2=pypi_0
  • tensorboard-plugin-wit==1.6.0.post3=pypi_0
  • smmap==3.0.4=pypi_0
  • bravado-core==5.17.0=pypi_0
  • websocket-client==0.57.0=pypi_0
  • defcon==0.7.2=pypi_0
  • dill==0.3.2=pypi_0
  • cxxfilt==0.2.1=py36h831f99a_1
  • torchvision==0.4.0=py36_cu92
  • jsonschema==3.2.0=pypi_0
  • nvidia-apex==0.1=py36h88da601_1
  • kiwisolver==1.1.0=py36hc9558a2_0
  • importlib-metadata==1.6.0=py36h9f0ad1d_0
  • tornado==6.0.3=py36h516909a_0
  • bleach==3.2.0=pypi_0
  • tensorflow==2.0.0=pypi_0
  • _libgcc_mutex==0.1=conda_forge
  • rsa==4.0=pypi_0
  • notebook==6.1.4=pypi_0
  • oauthlib==3.1.0=pypi_0
  • densratio==0.2.2=pypi_0
  • scikit-learn==0.20.4=py36_blas_openblashebff5e3_0
  • pretrainedmodels==0.7.4=pypi_0
  • typing-extensions==3.7.4.3=pypi_0
  • dominate==2.5.1=pypi_0
  • requests-oauthlib==1.3.0=pypi_0
  • tensorflow-estimator==2.2.0=pypi_0
  • webencodings==0.5.1=pypi_0
  • libgcc-ng==9.3.0=h24d8f2e_16
  • pytorch==1.2.0=py3.6_cuda9.2.148_cudnn7.6.2_0
  • icu==58.2=hf484d3e_1000
  • msgpack-python==0.5.6=pypi_0
  • tabulate==0.8.5=pypi_0
  • python-dateutil==2.8.0=py_0
  • terminado==0.8.3=pypi_0
  • astor==0.8.0=pypi_0
  • google-pasta==0.1.8=pypi_0
  • ipykernel==5.3.4=pypi_0
  • cachetools==3.1.1=pypi_0
  • nest-asyncio==1.4.0=pypi_0
  • argon2-cffi==20.1.0=pypi_0
  • msgpack-numpy==0.4.4.3=pypi_0
  • glyphslib==2.4.0=pypi_0
  • nbclient==0.5.0=pypi_0
  • scipy==1.4.1=pypi_0
  • hdbscan==0.8.22=py36hd352d35_1
  • path==15.0.0=pypi_0
  • openssl==1.1.1g=h516909a_1
  • kmeans-pytorch==0.3=pypi_0
  • backcall==0.2.0=pypi_0
  • astunparse==1.6.3=pypi_0
  • tensorboardx==1.8=pypi_0
  • bravado==10.6.2=pypi_0
  • gitdb==4.0.5=pypi_0
  • tk==8.6.9=hed695b0_1002
  • ptyprocess==0.6.0=pypi_0
  • gitpython==3.1.8=pypi_0
  • easydict==1.9=pypi_0
  • monotonic==1.5=pypi_0
  • pyrsistent==0.17.3=pypi_0
  • webcolors==1.11.1=pypi_0
  • libcblas==3.8.0=11_openblas
  • jinja2==2.11.2=pypi_0
  • mistune==0.8.4=pypi_0
  • efficientnet-pytorch==0.7.0=pypi_0
  • pywsl==0.1.4=pypi_0
  • tensorpack==0.9.8=pypi_0
  • traitlets==4.3.3=pypi_0
  • async-generator==1.10=pypi_0
  • jsonref==0.2=pypi_0
  • pyjwt==1.7.1=pypi_0
  • ipython-genutils==0.2.0=pypi_0
  • future==0.18.2=pypi_0
  • jupyter-client==6.1.7=pypi_0
  • nbconvert==6.0.3=pypi_0
  • fontconfig==2.13.1=he4413a7_1000
  • easydl==2.0.7=pypi_0
  • send2trash==1.5.0=pypi_0
  • pexpect==4.8.0=pypi_0
  • swagger-spec-validator==2.7.3=pypi_0
  • libuuid==2.32.1=h14c3975_1000
  • click==7.1.2=pypi_0
  • liblapack==3.8.0=11_openblas
  • protobuf==3.9.2=pypi_0
  • xmltodict==0.12.0=pypi_0
  • strict-rfc3339==0.7=pypi_0
  • markupsafe==1.1.1=pypi_0
  • nbformat==5.0.7=pypi_0
  • pykeops==1.3=pypi_0
  • urllib3==1.24.3=pypi_0
  • ipdb==0.13.3=pypi_0
  • sip==4.18=py36_1
  • tensorflow-gpu==2.2.0=pypi_0
  • jupyterlab-pygments==0.1.1=pypi_0
  • jupyter-core==4.6.3=pypi_0
  • mutatormath==3.0.1=pypi_0
  • google-auth==1.7.1=pypi_0
  • cycler==0.10.0=py_1
  • msgpack==0.6.2=pypi_0
  • pyqt==4.11.4=py36_3
  • certifi==2020.6.20=py36h9f0ad1d_0
  • entrypoints==0.3=pypi_0
  • pytest==5.4.1=py36h9f0ad1d_0
  • sklearn==0.0=pypi_0
  • wcwidth==0.1.9=pyh9f0ad1d_0
  • jsonpatch==1.26=pypi_0
  • grpcio==1.25.0=pypi_0
  • neptune-client==0.4.120=pypi_0
  • fontmath==0.6.0=pypi_0
  • torchfile==0.1.0=pypi_0
  • prompt-toolkit==3.0.5=pypi_0
  • python_abi==3.6=1_cp36m
  • mlconfig==0.1.0=pypi_0
  • jsonpointer==2.0=pypi_0
  • path-py==12.5.0=pypi_0
  • jedi==0.17.2=pypi_0
  • openblas==0.3.3=h9ac9557_1001
  • blas==1.1=openblas
  • defusedxml==0.6.0=pypi_0
  • google-auth-oauthlib==0.4.1=pypi_0
  • matplotlib==3.1.0=py36_0
  • gast==0.2.2=pypi_0
  • tensorboard==2.2.2=pypi_0
  • ipython==7.16.1=pypi_0
  • werkzeug==0.16.0=pypi_0
  • visdom==0.1.8.9=pypi_0
  • absl-py==0.8.1=pypi_0
  • prometheus-client==0.8.0=pypi_0
  • pathlib2==2.3.5=pypi_0
  • seaborn==0.11.0=pypi_0
  • opt-einsum==3.1.0=pypi_0
  • pyasn1-modules==0.2.7=pypi_0
  • libxml2==2.9.9=h13577e0_2
  • neptune-notebooks==0.0.16=pypi_0
  • pyasn1==0.4.8=pypi_0
  • libgomp==9.3.0=h24d8f2e_16
  • pickleshare==0.7.5=pypi_0
  • libblas==3.8.0=11_openblas
  • pyzmq==18.1.1=pypi_0
  • rfc3987==1.3.8=pypi_0
  • testpath==0.4.4=pypi_0
  • ca-certificates==2020.6.20=hecda079_0
  • parso==0.7.1=pypi_0
  • pygments==2.6.1=pypi_0
  • faiss==1.5.3=pypi_0
  • libiconv==1.15=h516909a_1005
  • psutil==5.6.5=pypi_0
  • py3nvml==0.2.6=pypi_0
  • gputil==1.4.0=pypi_0
  • termcolor==1.1.0=pypi_0
  • libfaiss==1.6.3=h49bdc20_1_cuda
  • matplotlib-base==3.1.0=py36h5f35d83_0
  • qt==4.8.7=2
  • fonttools==4.12.1=pypi_0
  • pandocfilters==1.4.2=pypi_0

Current channels:

To search for alternate channels that may provide the conda package you're
looking for, navigate to

https://anaconda.org

and use the search bar at the top of the page.

```
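For anyone hitting the same error: builds tagged `pypi_0` were originally installed through pip, so conda cannot resolve them from its channels. One possible workaround (a sketch, not the authors' official setup) is to split the spec file: keep the conda-resolvable packages, strip the `=pypi_0` suffix from the rest, and install those with pip. Illustrated here on a hypothetical two-line spec file:

```shell
# Hypothetical two-line spec file standing in for requirements.txt.
printf 'simplejson==3.17.2=pypi_0\ntk==8.6.9=hed695b0_1002\n' > requirements_example.txt

# Extract the pip-installed entries and drop the "=pypi_0" build tag,
# producing a file that "pip install -r" can consume.
grep '=pypi_0$' requirements_example.txt | sed 's/=pypi_0$//' > pip-requirements.txt
cat pip-requirements.txt   # -> simplejson==3.17.2
```

The remaining (non-`pypi_0`) lines can then be fed to `conda create --file` as before.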

Architecture for DANN

Dear authors,
Thanks for the code release. I was curious why the accuracy of DANN (closed set, Office-31) reported in the paper is much higher than the accuracies reported for it in other papers such as CDAN. Could you share the architecture used for the classifier and discriminator networks, along with the other hyperparameters of your DANN implementation?

Thanks

some simple questions about released code

Hi, thanks very much for releasing this nice work. I have some simple questions about the code in train_dance.py:

  1. In lines 188-189: ### We do not use memory features present in mini-batch
    feat_mat[:, index_t] = -1 / conf.model.temp
    I understand this computes the similarity between the mini-batch and itself using current features rather than memory features, but what is the meaning of the value -1 / conf.model.temp?

  2. In lines 195-196: loss_nc = conf.train.eta * entropy(torch.cat([out_t, feat_mat, feat_mat2], 1))
    I do not understand the effect of directly concatenating feat_mat and feat_mat2. Why not put feat_mat2 into its proper index positions in feat_mat? As we know, the indices of feat_t differ across iterations.

Thanks very much, and I hope to get your reply.
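On question 1, one possible reading (our interpretation, not the authors' answer): feat_mat stores cosine similarities already divided by the temperature, so overwriting an entry with -1 / temp is equivalent to forcing the raw cosine similarity to -1 (its minimum), which gives that entry negligible weight after the softmax inside the entropy. A minimal sketch with a made-up temperature:

```python
import torch
import torch.nn.functional as F

temp = 0.05  # hypothetical temperature, standing in for conf.model.temp

# Similarities are stored pre-divided by temp; the last entry is "masked"
# by writing -1 / temp, i.e. raw cosine similarity -1.
sims = torch.tensor([0.9, 0.3, -1.0]) / temp

probs = F.softmax(sims, dim=0)
print(probs[-1].item())  # vanishingly small: the masked entry is ignored
```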

Regarding Table 4

Thank you for sharing your codes. I have a question.

Regarding Table 4, Office(10/0/11) seems to have a typo.
Which is correct, Office(10/10/11) or Office(10/0/21)?

Domain-specific Batch-Normalization

On page 3, in Section 3, the authors state that "In addition, we utilize domain-specific batch normalization to ...". However, I could not find domain-specific batch normalization implemented in the code, which makes me a little curious. Can you explain why?
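For context, domain-specific batch normalization (as described in the literature) keeps one BatchNorm layer per domain while sharing the rest of the network. A minimal illustrative sketch of the concept, not the DANCE code:

```python
import torch
import torch.nn as nn

class DomainSpecificBN1d(nn.Module):
    """One BatchNorm1d per domain; all other parameters are shared."""

    def __init__(self, num_features, num_domains=2):
        super().__init__()
        self.bns = nn.ModuleList(
            nn.BatchNorm1d(num_features) for _ in range(num_domains)
        )

    def forward(self, x, domain):
        # Route each batch through the BatchNorm of its own domain,
        # so source and target keep separate normalization statistics.
        return self.bns[domain](x)

bn = DomainSpecificBN1d(8, num_domains=2)
src = bn(torch.randn(4, 8), domain=0)  # source batch -> source statistics
tgt = bn(torch.randn(4, 8), domain=1)  # target batch -> target statistics
```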

About the evaluation metric

Hello, thanks for your interesting and inspiring work!

Recently, [1] and [2] proposed a new metric, the harmonic mean of the known-class accuracy and the unknown-class accuracy, to better evaluate the ability to identify known versus unknown samples.

Would you consider reporting results with this metric? Or could you release the models trained with the proposed method (DANCE)?

[1] Learning to Detect Open Classes for Universal Domain Adaptation
[2] On the Effectiveness of Image Rotation for Open Set Domain Adaptation
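The metric referenced in [1] is the harmonic mean ("H-score") of the two accuracies, which is low whenever either accuracy is low. A sketch of the formula:

```python
def h_score(acc_known: float, acc_unknown: float) -> float:
    """Harmonic mean of known-class and unknown-class accuracy."""
    if acc_known + acc_unknown == 0:
        return 0.0
    return 2 * acc_known * acc_unknown / (acc_known + acc_unknown)

# With hypothetical accuracies 0.8 (known) and 0.6 (unknown):
print(h_score(0.8, 0.6))  # -> 0.6857142857142857
```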

About the `test` function in eval.py

Hi, thanks very much for your wonderful work on UniDA.

I have a question about the `test` function in eval.py:

  • In lines 32-33, you set the predictions of some samples to n_share. But I think this may not be correct in some settings. For example, in the OPDA setting of Office-31, the n_share-th (10th) class is included in the source label set (0-19), which may make the computation of the unknown-class accuracy wrong (line 38).

I don't know whether I have misunderstood the code. Thanks very much, and I hope to get your reply.
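To make the concern above concrete (our illustration with hypothetical numbers, not the actual eval.py code): in the OPDA setting of Office-31 there are 10 shared classes (indices 0-9) and 10 source-private classes (indices 10-19), so relabeling rejected samples as n_share (= 10) would collide with source-private class 10:

```python
n_share = 10                       # number of shared classes
source_classes = list(range(20))   # 10 shared + 10 source-private classes

# Label assigned to samples rejected as "unknown".
unknown_label = n_share

# If True, the unknown label overlaps a real source-private class index.
print(unknown_label in source_classes)  # -> True
```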

`feat_mat2` in `loss_nc`

After carefully checking the training code, I have a small question about feat_mat2 in loss_nc.

How does feat_mat2 help the performance of neighborhood clustering?

I see that when computing loss_nc, the entropy is fed with the concatenation of three tensors: out_t, feat_mat and feat_mat2. It appears that out_t and feat_mat correspond to the $W$ and $V$ described in Sec. 3.2, so I guess feat_mat2 plays the same role as feat_mat, except that it only operates on the mini-batch data. Is that right? Is there a reasonable explanation for it? Thanks in advance.
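Our reading of that line, sketched below with made-up shapes (this is an illustration of the structure, not the repository's exact entropy implementation): the softmax inside the entropy runs over the concatenation, so feat_mat2 simply enlarges the candidate-neighbor set with the current mini-batch features alongside the memory bank and the classifier prototypes.

```python
import torch
import torch.nn.functional as F

def entropy(logits):
    # Entropy of the softmax over the concatenated similarity vector.
    p = F.softmax(logits, dim=1)
    return -(p * torch.log(p + 1e-8)).sum(dim=1).mean()

torch.manual_seed(0)
out_t = torch.randn(4, 12)      # hypothetical classifier logits (prototypes)
feat_mat = torch.randn(4, 36)   # similarities to the memory bank
feat_mat2 = torch.randn(4, 4)   # similarities within the mini-batch

loss_nc = entropy(torch.cat([out_t, feat_mat, feat_mat2], dim=1))
print(loss_nc.item())  # a single positive scalar
```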

Confused about the txt file

In the txt files, I find that some image paths start with /research/masaito/. What is the difference between the images under /research/masaito/ and those under data/?
