
Variational Deep Semantic Hashing (SIGIR'2017)

The implementation of the models and experiments of Variational Deep Semantic Hashing (SIGIR 2017).

Author: Suthee Chaidaroon

Platform

  • This project uses Python 2.7 and TensorFlow 1.3

Prepare dataset

The model expects each input document to be in bag-of-words format. Sample datasets are provided under the dataset directory. To use a new text collection, format it as a matrix in which each row represents one document and each column represents one unique word in the corpus.
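As an illustration (not part of this repository), a count matrix of that shape can be built with plain Python and NumPy; the toy corpus and vocabulary below are made up:

```python
import numpy as np

# Toy corpus; each document becomes one row of the matrix.
docs = [
    "deep semantic hashing",
    "variational deep models",
]

# Build the vocabulary: one column per unique word in the corpus.
vocab = sorted({w for d in docs for w in d.split()})
word_to_col = {w: j for j, w in enumerate(vocab)}

# Fill the document-term count matrix (rows = documents, columns = words).
bow = np.zeros((len(docs), len(vocab)), dtype=np.float32)
for i, d in enumerate(docs):
    for w in d.split():
        bow[i, word_to_col[w]] += 1.0

print(bow.shape)  # (2, 5)
```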

To get the best performance

According to our empirical results, TF-IDF turns out to be the best input representation for our models.
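As a sketch, one common smoothed TF-IDF variant (the exact weighting used here is an assumption, not taken from this code base) can be applied on top of a count matrix like so:

```python
import numpy as np

# Toy document-term count matrix (rows = documents, columns = words).
counts = np.array([
    [2.0, 1.0, 0.0],
    [0.0, 1.0, 3.0],
], dtype=np.float32)

n_docs = counts.shape[0]
# Document frequency: number of documents containing each word.
df = (counts > 0).sum(axis=0)
# Smoothed inverse document frequency.
idf = np.log((1.0 + n_docs) / (1.0 + df)) + 1.0
# Term frequency times inverse document frequency.
tfidf = counts * idf

print(tfidf.shape)  # (2, 3)
```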

Training the model

Component collapsing is a common issue in the variational autoencoder framework: the KL regularizer shuts off some latent dimensions (by driving their weights to zero). We use a weight annealing technique [1] to mitigate this issue during training.
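A minimal sketch of such an annealing schedule (the exact schedule used in this code base may differ): the KL term's weight is ramped linearly from 0 to 1 over a warm-up period, so the regularizer cannot collapse latent dimensions early in training.

```python
def kl_weight(step, warmup_steps=10000):
    """Linearly anneal the KL weight from 0 to 1 over `warmup_steps`."""
    return min(1.0, step / float(warmup_steps))

# Early in training the KL term is nearly off; after warm-up it is fully on.
print(kl_weight(0))      # 0.0
print(kl_weight(5000))   # 0.5
print(kl_weight(20000))  # 1.0
```

The per-step weight would multiply the KL term of the loss, e.g. `loss = reconstr_error + kl_weight(step) * kl_div`.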

References

[1] https://arxiv.org/abs/1602.02282

Bibtex

@inproceedings{Chaidaroon:2017:VDS:3077136.3080816,
 author = {Chaidaroon, Suthee and Fang, Yi},
 title = {Variational Deep Semantic Hashing for Text Documents},
 booktitle = {Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval},
 series = {SIGIR '17},
 year = {2017},
 isbn = {978-1-4503-5022-8},
 location = {Shinjuku, Tokyo, Japan},
 pages = {75--84},
 numpages = {10},
 url = {http://doi.acm.org/10.1145/3077136.3080816},
 doi = {10.1145/3077136.3080816},
 acmid = {3080816},
 publisher = {ACM},
 address = {New York, NY, USA},
 keywords = {deep learning, semantic hashing, variational autoencoder},
}


Issues

loss implementation differs from the paper

Dear authors,

When I check def calc_reconstr_error(self) in all three of the proposed methods, I see:

return -tf.reduce_sum(tf.log(tf.maximum(p_x_i_scores0 * weight_scores0, 1e-10)))

This multiplies weight_scores0 inside the logarithm, which I think differs from the log-likelihood defined in the paper, e.g. Eq. (2). In particular, if I ignore the max(·, 1e-10), which appears to be there for numerical stability, the logarithm above decomposes into
sum(log(p_x_i_scores0)) + sum(log(weight_scores0)).
Since the second term does not depend on the model parameters, it is a constant with respect to learning.
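The decomposition described above can be checked numerically (the values below are purely illustrative):

```python
import numpy as np

p = np.array([0.2, 0.5, 0.3])  # toy reconstruction probabilities
w = np.array([1.0, 2.0, 1.0])  # toy per-word weights (parameter-free)

lhs = np.sum(np.log(p * w))
rhs = np.sum(np.log(p)) + np.sum(np.log(w))

# The weight term separates out as an additive constant in the loss.
print(np.isclose(lhs, rhs))  # True
```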

I am not very familiar with TensorFlow, so maybe I am missing something. Could you please explain what exactly this does, even if it differs from the paper?

Thanks,
Alexander Shekhovtsov
Czech Technical University

Unable to load saved model

I saved the model after training using the following commands:

saver = tf.train.Saver()
saver.save(sess, '../data/output/'+folder_name+'/model/var_autoencoder_model')

Then, in another test script, I restored the model using the following commands:

saver = tf.train.import_meta_graph('../data/output/'+folder_name+'/model/var_autoencoder_model.meta')
saver.restore(sess, '../data/output/'+folder_name+'/model/var_autoencoder_model')

But restoring fails with the following error:

Caused by op 'Variable_9/read', defined at:
  File "test.py", line 581, in <module>
    train_data = initialize_autoencoder_data(folder_name, search)
  File "test.py", line 413, in initialize_autoencoder_data
    model = VDSH(sess, latent_dim, x_train.shape[1])
  File "/home/admin1/3/SourceCodeAnalytics-master/src/VDSH.py", line 27, in __init__
    self.build()
  File "/home/admin1/3/SourceCodeAnalytics-master/src/VDSH.py", line 94, in build
    self.z_enc_1 = Dense(self.hidden_dim, activation='relu')(self.input_bow)
  File "/home/admin1/3/SourceCodeAnalytics-master/src/utils.py", line 73, in __call__
    self.build(shape)
  File "/home/admin1/3/SourceCodeAnalytics-master/src/utils.py", line 61, in build
    self.W = tf.Variable(xavier_init(input_dim, self.output_dim))
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/variables.py", line 183, in __call__
    return cls._variable_v1_call(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/variables.py", line 146, in _variable_v1_call
    aggregation=aggregation)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/variables.py", line 125, in <lambda>
    previous_getter = lambda **kwargs: default_variable_creator(None, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/variable_scope.py", line 2444, in default_variable_creator
    expected_shape=expected_shape, import_scope=import_scope)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/variables.py", line 187, in __call__
    return super(VariableMetaclass, cls).__call__(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/variables.py", line 1329, in __init__
    constraint=constraint)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/variables.py", line 1491, in _init_from_args
    self._snapshot = array_ops.identity(self._variable, name="read")
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/array_ops.py", line 81, in identity
    return gen_array_ops.identity(input, name=name)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/gen_array_ops.py", line 3454, in identity
    "Identity", input=input, name=name)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/op_def_library.py", line 787, in _apply_op_helper
    op_def=op_def)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/util/deprecation.py", line 488, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py", line 3274, in create_op
    op_def=op_def)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py", line 1770, in __init__
    self._traceback = tf_stack.extract_stack()

FailedPreconditionError (see above for traceback): Attempting to use uninitialized value Variable_9
	 [[node Variable_9/read (defined at /home/admin1/3/SourceCodeAnalytics-master/src/vdsh_utils.py:61)  = Identity[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"](Variable_9)]]

Loss is very high

The loss is very high when I run VDSH on my dataset (~900). Is there any way to reduce it?
