nusnlp / nea
Neural Essay Assessor: An Automated Essay Scoring System Based on Deep Neural Networks
License: GNU General Public License v3.0
When I run "python train_nea.py -tr data/fold_0/train.tsv -tu data/fold_0/dev.tsv -ts data/fold_0/test.tsv -p 1 -o output",
I get the following error:
/Users/yanshengjia/Desktop/aes/nea/nea/models.py:63: UserWarning: Update your `LSTM` call to the Keras 2 API: `LSTM(300, return_sequences=True, dropout=0.5, recurrent_dropout=0.1)`
  model.add(RNN(args.rnn_dim, return_sequences=True, dropout_W=dropout_W, dropout_U=dropout_U))
/Users/yanshengjia/anaconda/lib/python2.7/site-packages/keras/engine/topology.py:621: UserWarning: Class `nea.my_layers.MeanOverTime` defines `get_output_shape_for` but does not override `compute_output_shape`. If this is a Keras 1 layer, please implement `compute_output_shape` to support Keras 2.
  output_shape = self.compute_output_shape(input_shape)
Traceback (most recent call last):
  File "train_nea.py", line 160, in <module>
    model = create_model(args, train_y.mean(axis=0), overal_maxlen, vocab)
  File "/Users/yanshengjia/Desktop/aes/nea/nea/models.py", line 73, in create_model
    model.layers[-1].b.set_value(bias_value)
AttributeError: 'Dense' object has no attribute 'b'
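In Keras 2, the Dense layer no longer exposes a Theano-style `.b` attribute (the bias is a backend tensor held in `layer.bias`), so `model.layers[-1].b.set_value(bias_value)` fails. A portable way to set only the bias is the `get_weights()`/`set_weights()` pair. Here is a minimal sketch of that pattern using a stand-in layer class (so it runs without Keras installed); in real code `layer` would be `model.layers[-1]`:

```python
import numpy as np

# Stand-in for a Keras Dense layer: only the get_weights()/set_weights()
# contract matters for this pattern.
class StubDense(object):
    def __init__(self, in_dim, units):
        self._weights = [np.zeros((in_dim, units)), np.zeros(units)]
    def get_weights(self):
        return [w.copy() for w in self._weights]
    def set_weights(self, weights):
        self._weights = [np.asarray(w) for w in weights]

layer = StubDense(300, 1)
bias_value = np.array([0.42])

# Keras-2-safe replacement for layer.b.set_value(bias_value):
kernel, _ = layer.get_weights()
layer.set_weights([kernel, bias_value])

print(layer.get_weights()[1])  # [0.42]
```

The same two lines (`get_weights`, then `set_weights` with the old kernel and the new bias) work on a real Keras 2 Dense layer.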
Hi,
When I try to run the code using the default parameters:
python train_nea.py -tr data/fold_0/train.tsv -tu data/fold_0/dev.tsv -ts data/fold_0/test.tsv -p 1 -o output_dir
I get the following error:
  File "nea/nea/my_layers.py", line 59, in call
    return K.cast(x.sum(axis=1) / mask.sum(axis=1, keepdims=True), K.floatx())
AttributeError: 'Tensor' object has no attribute 'sum'
Any thoughts?
Thanks!
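The `x.sum(...)` call in my_layers.py is a Theano tensor method; TensorFlow tensors have no `.sum`, so the layer breaks on the TensorFlow backend. A backend-agnostic form would go through `keras.backend`, e.g. `K.sum(x, axis=1) / K.sum(mask, axis=1, keepdims=True)`. As a sanity check, here is a NumPy sketch of what the masked mean-over-time computes (the explicit mask multiply is my addition for the sketch):

```python
import numpy as np

def mean_over_time(x, mask):
    """Masked mean over the time axis.

    x:    (batch, time, dim) activations
    mask: (batch, time), 1.0 for real tokens, 0.0 for padding
    """
    masked = x * mask[:, :, None]            # zero out padded steps
    return masked.sum(axis=1) / mask.sum(axis=1, keepdims=True)

x = np.array([[[1.0], [3.0], [99.0]]])       # last step is padding
mask = np.array([[1.0, 1.0, 0.0]])
print(mean_over_time(x, mask))               # [[2.]]
```

The padded step (99.0) is excluded, and the divisor is the number of real tokens, not the padded length.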
  File "/home/deepamparmar/Music/nea-master/nea/models.py", line 132, in create_model
    emb_reader = EmbReader(args.emb_path, emb_dim=args.emb_dim)
  File "/home/deepamparmar/Music/nea-master/nea/w2vEmbReader.py", line 46, in __init__
    assert self.emb_dim == emb_dim, 'The embeddings dimension does not match with the requested dimension'
AssertionError: The embeddings dimension does not match with the requested dimension
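This assertion means the embedding dimension passed on the command line does not match the vector size inside the embeddings file. Assuming the file follows the word2vec text format, whose first line is `<vocab_size> <dim>`, a quick way to read off the right value (the sample content below is hypothetical; a real run would `open()` the actual file):

```python
import io

# Toy stand-in for an embeddings file in word2vec text format.
sample = "2 3\nthe 0.1 0.2 0.3\ncat 0.4 0.5 0.6\n"

with io.StringIO(sample) as f:
    vocab_size, emb_dim = (int(t) for t in f.readline().split())

# Pass emb_dim as the embedding-dimension argument to train_nea.py.
print(vocab_size, emb_dim)  # 2 3
```

If the file has no header line, counting the whitespace-separated floats after the first word on any line gives the same number.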
What versions of Python, Theano, Keras, and TensorFlow did you use? I am facing issues with TensorFlow.
Originally posted by @nahos in #13 (comment)
When I tried to run the codebase, it returned the same error.
I guess it's probably because of the updated Keras 2 API, but could you try building it again?
Thanks a lot ;)
Hello, I ran into the following problem:
[INFO] (nea.models) Building a REGRESSION model with POOLING
Traceback (most recent call last):
  File "train_nea.py", line 160, in <module>
    model = create_model(args, train_y.mean(axis=0), overal_maxlen, vocab)
  File "/data1/wbxu/AES/nea-master/nea/models.py", line 63, in create_model
    model.add(RNN(args.rnn_dim, return_sequences=True, dropout_W=dropout_W, dropout_U=dropout_U))
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/keras/models.py", line 308, in add
    output_tensor = layer(self.outputs[0])
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/keras/engine/topology.py", line 487, in __call__
    self.build(input_shapes[0])
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/keras/layers/recurrent.py", line 710, in build
    self.W = K.concatenate([self.W_i, self.W_f, self.W_c, self.W_o])
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 716, in concatenate
    return tf.concat(axis, [to_dense(x) for x in tensors])
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/tensorflow/python/ops/array_ops.py", line 1043, in concat
    dtype=dtypes.int32).get_shape(
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 676, in convert_to_tensor
    as_ref=False)
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 741, in internal_convert_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/tensorflow/python/framework/constant_op.py", line 113, in _constant_tensor_conversion_function
    return constant(v, dtype=dtype, name=name)
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/tensorflow/python/framework/constant_op.py", line 102, in constant
    tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape, verify_shape=verify_shape))
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/tensorflow/python/framework/tensor_util.py", line 374, in make_tensor_proto
    _AssertCompatible(values, dtype)
  File "/home/wbxu/anaconda3/envs/nea-master/lib/python2.7/site-packages/tensorflow/python/framework/tensor_util.py", line 302, in _AssertCompatible
    (dtype.name, repr(mismatch), type(mismatch).__name__))
TypeError: Expected int32, got <tf.Variable 'lstm_1_W_i:0' shape=(50, 300) dtype=float32_ref> of type 'Variable' instead.
My environment is:
keras 1.1.0
tensorflow 1.6.0
theano 0.8.2
nltk 3.0.0
What versions of Keras and Theano did you use?
I often find the most difficult part of getting code like this to run is pinning down the exact versions of everything. I wonder if we could use some sort of container in the future (like Docker)...
Really looking forward to trying it all out.
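The `tf.concat` TypeError above is what Keras 1.x produces on TensorFlow >= 1.0, where the argument order of `tf.concat` changed, so a pinned environment with the Theano backend sidesteps that whole class of errors. Along the lines of the container idea, here is a Dockerfile sketch; the exact version numbers are my assumptions pieced together from the error messages in this thread, not the authors' confirmed setup:

```dockerfile
FROM python:2.7-slim

# Assumed versions -- adjust to whatever the authors confirm.
RUN pip install numpy scipy nltk h5py theano==0.8.2 keras==1.1.1

# Force the Theano backend so Keras 1.x never touches tf.concat.
ENV KERAS_BACKEND=theano

WORKDIR /nea
COPY . /nea
CMD ["python", "train_nea.py", "-tr", "data/fold_0/train.tsv", \
     "-tu", "data/fold_0/dev.tsv", "-ts", "data/fold_0/test.tsv", \
     "-p", "1", "-o", "output"]
```

Setting `KERAS_BACKEND=theano` (or editing `~/.keras/keras.json`) is the key step: with the Theano backend, the TensorFlow version becomes irrelevant.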
Loading a trained model requires the custom objects defined in my_layers.py. Any tips for loading them with the model architectures?
Example code:
from keras.models import model_from_json
with open('model_arch.json', 'r') as f:
    model = model_from_json(f.read(), custom_objects={'MeanOverTime': MeanOverTime})
Some documentation on the issue from others:
loading custom objects in keras
[INFO] (nea.w2vEmbReader) Loading embeddings from: data/embeddings.w2v.txt
Traceback (most recent call last):
  File "train_nea.py", line 160, in <module>
    model = create_model(args, train_y.mean(axis=0), overal_maxlen, vocab)
  File "/home/zetao/nea-master/nea/models.py", line 133, in create_model
    emb_reader = EmbReader(args.emb_path, emb_dim=args.emb_dim)
  File "/home/zetao/nea-master/nea/w2vEmbReader.py", line 47, in __init__
    assert self.emb_dim == emb_dim, 'The embeddings dimension does not match with the requested dimension'
AssertionError: The embeddings dimension does not match with the requested dimension
I have tried to reproduce the results, but got a QWK much lower than the one reported in the paper. Here is my log for prompt 1, fold_0:
Did I do something wrong?
When running train_nea.py -tr /data/fold_1/train.tsv -tu /data/fold_1/dev.tsv -ts /data/fold_1/test.tsv -p 0 -o /output_dir,
I get an error:
Traceback (most recent call last):
  File "/train_nea.py", line 63, in <module>
    raise NotImplementedError
NotImplementedError
When I run it for an individual prompt, it works fine. Any ideas how to solve this?
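Since the `raise NotImplementedError` fires only for the all-prompts setting (`-p 0`) while individual prompts work, one workaround is to train eight separate models, one per ASAP prompt. A sketch (the `echo` prefix only prints the commands; remove it to actually execute them, and adjust the data paths to your layout):

```shell
# Train one NEA model per ASAP prompt instead of -p 0.
for p in 1 2 3 4 5 6 7 8; do
  echo python train_nea.py -tr data/fold_1/train.tsv -tu data/fold_1/dev.tsv \
    -ts data/fold_1/test.tsv -p "$p" -o "output_p$p"
done
```

The per-prompt QWK scores can then be averaged afterwards, which matches how results are usually reported on ASAP.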
When I run
python preprocess_asap.py -i training_set_rel3.tsv
I get the following error:
Traceback (most recent call last):
  File "preprocess_asap.py", line 44, in <module>
    dataset = collect_dataset(args.input_file)
  File "preprocess_asap.py", line 34, in collect_dataset
    for line in f:
  File "/home/wyn/anaconda3/envs/py3NLP/lib/python3.5/codecs.py", line 321, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x92 in position 1539: invalid start byte
I don't know why I get this. Any help would be appreciated!
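Byte 0x92 is not a valid UTF-8 start byte, but in Windows-1252 it is the right single quotation mark, which strongly suggests the ASAP file is cp1252-encoded rather than UTF-8. One fix, assuming that encoding, is to transcode the file once before running preprocess_asap.py; the byte string below stands in for a line of the TSV:

```python
# Stand-in for a cp1252-encoded line from training_set_rel3.tsv.
raw = b"can\x92t"

text = raw.decode("cp1252")   # 0x92 -> U+2019 RIGHT SINGLE QUOTATION MARK
utf8 = text.encode("utf-8")   # safe to feed to a UTF-8 reader afterwards

print(text)  # can’t
```

Equivalently, `iconv -f CP1252 -t UTF-8 training_set_rel3.tsv > training_set_rel3.utf8.tsv` converts the whole file from the command line, or the script's `open()` call can be given `encoding='cp1252'` directly.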