
relationprediction's People

Contributors

michschli


relationprediction's Issues

How could I get a better result using my own data?

Sorry to bother you. I really want to get a relatively high accuracy using my own data for link prediction, but the accuracy I get is almost zero. This is very confusing to me; could you give me some suggestions?

ImportError: No module named 'tensorflow_backend'

I'm trying to run train.py and I get this error:

Traceback (most recent call last):
  File "C:\Users\Axel\git\RelationPrediction\code\train.py", line 5, in <module>
    from optimization.optimize import build_tensorflow
  File "C:\Users\Axel\git\RelationPrediction\code\optimization\optimize.py", line 1, in <module>
    import tensorflow_backend.algorithms as tensorflow_algorithms
ImportError: No module named 'tensorflow_backend'

I don't know which package I should install to resolve this module import error. Below is the list of packages in my python environment. I have tensorflow and keras installed.
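For what it's worth, 'tensorflow_backend' does not look like an installable package; judging by the traceback it is a local package inside the repo's code/ directory, so the import fails when train.py is launched from elsewhere. A minimal sketch of a workaround, assuming the checkout location (the "RelationPrediction" path below is an assumption, not from the repo):

```python
import sys
from pathlib import Path

# Hypothetical fix: 'tensorflow_backend' lives under the repo's code/
# directory, so either run train.py from inside code/ or put that directory
# on sys.path before the imports run. Adjust repo_code_dir to your checkout.
repo_code_dir = Path("RelationPrediction") / "code"
sys.path.insert(0, str(repo_code_dir))
print(sys.path[0])
```

Equivalently, `cd` into the code/ directory before invoking train.py.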

_ipyw_jlab_nb_ext_conf 0.1.0 py35hfaa8434_0
absl-py 0.3.0
alabaster 0.7.11 py35_0
anaconda-client 1.6.14 py35_0
anaconda-navigator 1.8.7 py35_0
anaconda-project 0.8.2 py35h06aeb26_0
appdirs 1.4.3 py35_0
asn1crypto 0.24.0 py35_0
astor 0.7.1
astroid 1.6.5 py35_0
astropy 3.0.3 py35hfa6e2cd_2
atomicwrites 1.1.5 py35_0
attrs 18.1.0 py35_0
automat 0.7.0 py35_0
babel 2.6.0 py35_0
backcall 0.1.0 py35_0
backports 1.0 py35he88aa47_1
backports.shutil_get_terminal_size 1.0.0 py35h9d89c8b_2
backports.weakref 1.0rc1
beautifulsoup4 4.6.0 py35_1
bitarray 0.8.3 py35hfa6e2cd_0
bkcharts 0.2 py35h4704c85_0
blas 1.0 mkl
blaze 0.11.3 py35hae12140_0
bleach 2.1.3 py35_0
bleach 1.5.0
blosc 1.14.3 he51fdeb_0
bokeh 0.13.0 py35_0
boto 2.49.0 py35_0
bottleneck 1.2.1 py35h8a3671c_0
bzip2 1.0.6 hfa6e2cd_5
ca-certificates 2018.03.07 0
certifi 2018.4.16 py35_0
cffi 1.11.5 py35h945400d_0
chardet 3.0.4 py35_1
click 6.7 py35h10df73f_0
cloudpickle 0.5.3 py35_0
clyent 1.2.2 py35h3cd9751_1
colorama 0.3.9 py35h32a752f_0
comtypes 1.1.6 py35_0
conda 4.5.8 py35_0
conda-build 3.12.0 py35_1
conda-env 2.6.0 h36134e3_1
conda-verify 3.1.0 py35_0
console_shortcut 0.1.1 h6bb2dd7_3
constantly 15.1.0 py35_0
contextlib2 0.5.5 py35h0a97e54_0
cryptography 2.2.2 py35hfa6e2cd_0
curl 7.60.0 h7602738_0
cycler 0.10.0 py35hcc71164_0
cython 0.28.4 py35h6538335_0
cytoolz 0.9.0.1 py35hfa6e2cd_0
dask 0.18.2 py35_0
dask-core 0.18.2 py35_0
datashape 0.5.4 py35ha38994c_0
decorator 4.3.0 py35_0
distributed 1.22.0 py35_0
docutils 0.14 py35h8ccb97f_0
entrypoints 0.2.3 py35hb91ced9_2
et_xmlfile 1.0.1 py35h2c13def_0
fastcache 1.0.2 py35hfa6e2cd_2
filelock 3.0.4 py35_0
flask 1.0.2 py35_1
flask-cors 3.0.6 py35_0
freetype 2.8 h51f8f2c_1
future 0.16.0 py35_1
get_terminal_size 1.0.0 h38e98db_0
gevent 1.3.5 py35hfa6e2cd_0
glob2 0.6 py35h9eb15d0_0
greenlet 0.4.14 py35hfa6e2cd_0
grpcio 1.14.0rc1
h5py 2.8.0 py35h3bdd7fb_0
hdf5 1.10.2 hac2f561_1
heapdict 1.0.0 py35_2
html5lib 1.0.1 py35h047fa9f_0
html5lib 0.9999999
hyperlink 18.0.0 py35_0
icc_rt 2017.0.4 h97af966_0
icu 58.2 ha66f8fd_1
idna 2.7 py35_0
imageio 2.3.0 py35_0
imagesize 1.0.0 py35_0
incremental 17.5.0 py35h0b96c2c_0
intel-openmp 2018.0.0 8
ipykernel 4.8.2 py35_0
ipython 6.4.0 py35_0
ipython_genutils 0.2.0 py35ha709e79_0
ipywidgets 7.3.0 py35_0
isodate 0.6.0
isort 4.3.4 py35_0
itsdangerous 0.24 py35h99d45d4_1
jdcal 1.4 py35_0
jedi 0.12.1 py35_0
jinja2 2.10 py35_0
jpeg 9b hb83a4c4_2
jsonschema 2.6.0 py35h27d56d3_0
jupyter 1.0.0 py35_4
jupyter_client 5.2.3 py35_0
jupyter_console 5.2.0 py35hf76c22e_1
jupyter_core 4.4.0 py35h629ba7f_0
jupyterlab 0.32.1 py35_0
jupyterlab_launcher 0.10.5 py35_0
Keras 1.2.1
Keras-Applications 1.0.2
Keras-Preprocessing 1.0.1
keyring 13.2.1 py35_0
kiwisolver 1.0.1 py35hc605aed_0
lazy-object-proxy 1.3.1 py35he996729_0
libcurl 7.60.0 hc4dcbb0_0
libiconv 1.15 h1df5818_7
libpng 1.6.34 h79bbb47_0
libsodium 1.0.16 h9d3ae62_0
libssh2 1.8.0 hd619d38_4
libtiff 4.0.9 hb8ad9f9_1
libxml2 2.9.8 hadb2253_1
libxslt 1.1.32 hf6f1972_0
llvmlite 0.23.2 py35he51fdeb_0
locket 0.2.0 py35h0dfcdd0_1
lxml 4.2.3 py35hef2cd61_0
lzo 2.10 h6df0209_2
m2w64-gcc-libgfortran 5.3.0 6
m2w64-gcc-libs 5.3.0 7
m2w64-gcc-libs-core 5.3.0 7
m2w64-gmp 6.1.0 2
m2w64-libwinpthread-git 5.0.0.4634.697f757 2
Markdown 2.6.11
markupsafe 1.0 py35hc253e08_1
matplotlib 2.2.2 py35h153e9ff_1
mccabe 0.6.1 py35hcf31250_1
menuinst 1.4.14 py35hfa6e2cd_0
mistune 0.8.3 py35hfa6e2cd_1
mkl 2018.0.2 1
mkl-service 1.1.2 py35h051acba_4
mkl_fft 1.0.1 py35h452e1ab_0
mkl_random 1.0.1 py35h9258bd6_0
more-itertools 4.2.0 py35_0
mpmath 1.0.0 py35h253b693_2
msgpack-python 0.5.6 py35he980bc4_0
msys2-conda-epoch 20160418 1
multipledispatch 0.5.0 py35_0
navigator-updater 0.2.1 py35_0
nbconvert 5.3.1 py35h98d6c46_0
nbformat 4.4.0 py35h908c9d9_0
networkx 2.1 py35_0
nltk 3.3.0 py35_0
nose 1.3.7 py35h0e9586c_2
notebook 5.6.0 py35_0
numba 0.38.1 py35h830ac7b_0
numexpr 2.6.5 py35hcd2f87e_0
numpy 1.14.3 py35h9fa60d3_2
numpy-base 1.14.3 py35h5c71026_2
numpydoc 0.8.0 py35_0
odo 0.5.1 py35hc850252_0
olefile 0.45.1 py35_0
openpyxl 2.5.4 py35_0
openssl 1.0.2o h8ea7d77_0
packaging 17.1 py35_0
pandas 0.23.3 py35h830ac7b_0
pandoc 1.19.2.1 hb2460c7_1
pandocfilters 1.4.2 py35h978f723_1
parso 0.3.1 py35_0
partd 0.3.8 py35h894d1e4_0
path.py 11.0.1 py35_0
pathlib2 2.3.2 py35_0
patsy 0.5.0 py35_0
pep8 1.7.1 py35_0
pickleshare 0.7.4 py35h2f9f535_0
pillow 5.1.0 py35h0738816_0
pip 10.0.1 py35_0
pkginfo 1.4.2 py35_1
pluggy 0.6.0 py35h717ee57_0
ply 3.11 py35_0
prometheus_client 0.3.0 py35_0
prompt_toolkit 1.0.15 py35h89c7cb4_0
protobuf 3.6.0
psutil 5.4.6 py35hfa6e2cd_0
py 1.5.4 py35_0
pyasn1 0.4.3 py35_0
pyasn1-modules 0.2.2 py35_0
pycodestyle 2.4.0 py35_0
pycosat 0.6.3 py35h456c199_0
pycparser 2.18 py35h15a15da_1
pycrypto 2.6.1 py35hfa6e2cd_9
pycurl 7.43.0.2 py35h74b6da3_0
pyflakes 2.0.0 py35_0
pygments 2.2.0 py35h24c0941_0
pylint 1.9.2 py35_0
pyodbc 4.0.23 py35h6538335_0
pyopenssl 18.0.0 py35_0
pyparsing 2.2.0 py35hcabcaab_1
pyqt 5.9.2 py35h1aa27d4_0
pysocks 1.6.8 py35_0
pytables 3.4.4 py35he6f6034_0
pytest 3.6.3 py35_0
pytest-arraydiff 0.2 py35_0
pytest-astropy 0.4.0 py35_0
pytest-doctestplus 0.1.3 py35_0
pytest-openfiles 0.3.0 py35_0
pytest-remotedata 0.3.0 py35_0
python 3.5.5 h0c2934d_2
python-dateutil 2.7.3 py35_0
pytz 2018.5 py35_0
pywavelets 0.5.2 py35h7c47ace_0
pywin32 223 py35hfa6e2cd_1
pywinpty 0.5.4 py35_0
pyyaml 3.13 py35hfa6e2cd_0
pyzmq 17.0.0 py35hfa6e2cd_1
qt 5.9.5 vc14he4a7d60_0 [vc14]
qtawesome 0.4.4 py35h639d0ff_0
qtconsole 4.3.1 py35hc47b0dd_0
qtpy 1.4.2 py35_0
rdflib 4.2.2
requests 2.19.1 py35_0
rope 0.10.7 py35h5756fe0_0
ruamel_yaml 0.15.42 py35hfa6e2cd_0
scikit-image 0.14.0 py35h6538335_1
scikit-learn 0.19.1 py35h2037775_0
scipy 1.1.0 py35h672f292_0
seaborn 0.9.0 py35_0
send2trash 1.5.0 py35_0
service_identity 17.0.0 py35_0
setuptools 39.2.0 py35_0
simplegeneric 0.8.1 py35_2
singledispatch 3.4.0.3 py35h33f66b4_0
sip 4.19.8 py35h6538335_0
six 1.11.0 py35_1
snappy 1.1.7 h777316e_3
snowballstemmer 1.2.1 py35h4c55bfa_0
sortedcollections 1.0.1 py35_0
sortedcontainers 2.0.4 py35_0
sphinx 1.7.6 py35_0
sphinxcontrib 1.0 py35h45f5ca3_1
sphinxcontrib-websupport 1.1.0 py35_1
spyder 3.3.0 py35_0
spyder-kernels 0.2.4 py35_0
sqlalchemy 1.2.10 py35hfa6e2cd_0
sqlite 3.23.1 h35aae40_0
statsmodels 0.9.0 py35h452e1ab_0
sympy 1.2 py35_0
tblib 1.3.2 py35hd2cf7e1_0
tensorboard 1.8.0
tensorflow 1.2.1
tensorflow 1.10.0rc0
terminado 0.8.1 py35_1
testpath 0.3.1 py35h06cf69e_0
Theano 0.9.0
Theano 1.0.2
tk 8.6.7 hcb92d03_3
toolz 0.9.0 py35_0
tornado 5.0.2 py35_0
traitlets 4.3.2 py35h09b975b_0
twisted 17.5.0 py35_0
typing 3.6.4 py35_0
unicodecsv 0.14.1 py35h0d88516_0
urllib3 1.23 py35_0
vc 14.1 h0510ff6_3
vs2015_runtime 15.5.2 3
wcwidth 0.1.7 py35h6e80d8a_0
webencodings 0.5.1 py35h5d527fb_1
werkzeug 0.14.1 py35_0
wheel 0.31.1 py35_0
widgetsnbextension 3.3.0 py35_0
win_inet_pton 1.0.1 py35hbef1270_1
win_unicode_console 0.5 py35h56988b5_0
wincertstore 0.2 py35hfebbdb8_0
winpty 0.4.3 4
wrapt 1.10.11 py35h54666f7_0
xlrd 1.1.0 py35h22b952b_1
xlsxwriter 1.0.5 py35_0
xlwings 0.11.8 py35_0
xlwt 1.3.0 py35hd04410a_0
yaml 0.1.7 hc54c509_2
zeromq 4.2.5 hc6251cf_0
zict 0.1.3 py35hf5542e0_0
zlib 1.2.11 h8395fce_2
zope 1.0 py35_0
zope.interface 4.5.0 py35hfa6e2cd_0

OOM

Hello,

I was wondering how much GPU memory is needed to replicate the results in the paper? I tried all three datasets, but I run into OOM issues on all of them.

missing valid_accuracy.txt

Hello,

If I add the following to gcn_basis.exp:

[Evaluation]
	Metric=MRR
	Metric=Accuracy

I get the following error:

FileNotFoundError: [Errno 2] No such file or directory: FB-Toutanova/valid_accuracy.txt

How can I generate that missing file?
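A hedged workaround sketch, not a confirmed fix: the Accuracy metric appears to append to a per-dataset log file that is never created, so creating an empty valid_accuracy.txt up front may let evaluation proceed. The path below is taken from the error message; whether this is the intended mechanism is for the maintainer to confirm.

```python
from pathlib import Path

# Create the missing log file the Accuracy metric seems to expect.
# Path taken verbatim from the FileNotFoundError above.
log_file = Path("FB-Toutanova") / "valid_accuracy.txt"
log_file.parent.mkdir(parents=True, exist_ok=True)
log_file.touch()
print(log_file.exists())
```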

Thanks

Configuration settings for toy dataset

It would be nice to know which settings to choose for the Toy dataset:

  • internal encoder dimension?
  • regularization using basis or block-diagonal-decomposition?
  • If using basis decomposition, how many basis functions?

Ideally, it would be nice to know the specific .exp settings file to use. Thanks
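To make the questions concrete, a hypothetical gcn_basis-style .exp fragment covering the three bullets above — every key name here is an assumption modeled on the snippets quoted elsewhere in this thread, not the repo's verified schema:

```
[Encoder]
	Name=gcn_basis
	InternalEncoderDimension=500
	NumberOfBasisFunctions=5

[Evaluation]
	Metric=MRR
```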

How to run your code?

I'm sorry to bother you. Can you give an example of the "configuration"? I don't know how to set it.
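For reference, invocations of training scripts like this one typically take the following shape — the flag names and paths below are assumptions and should be checked against the repo's README:

```shell
# Assumed invocation (verify flags against the README):
python code/train.py --settings settings/gcn_basis.exp --dataset data/Toy
```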

Best approach to read new graph after training?

The input graph in subject,predicate,object form is transformed into matrices for training. Is the output graph after training also available in subject,predicate,object form? If not, what is the easiest way to inspect the output graph and the newly added links? I noticed the creation of .index and .meta files in the models folder. Do these files describe the output graph? If so, how can I read them?
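As I understand it, the .index/.meta files are TensorFlow checkpoint snapshots of the learned parameters, not a triple list; predicted links are read off by scoring candidate triples with the learned embeddings. A minimal sketch of that scoring step, assuming a DistMult-style decoder with illustrative shapes (not the repo's exact API):

```python
import numpy as np

# Toy embeddings standing in for what would be restored from the checkpoint.
rng = np.random.default_rng(0)
num_entities, num_relations, dim = 5, 2, 4
entity_emb = rng.normal(size=(num_entities, dim))
relation_emb = rng.normal(size=(num_relations, dim))

def score_objects(subject, relation):
    """DistMult score of (subject, relation, o) for every candidate object o."""
    return (entity_emb[subject] * relation_emb[relation]) @ entity_emb.T

scores = score_objects(subject=0, relation=1)
top_object = int(np.argmax(scores))  # highest-scoring candidate link
print(top_object, scores.shape)
```

Ranking all candidate objects this way, then keeping the top-scoring unseen triples, is the usual way "new links" are extracted from a trained link-prediction model.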

0% GPU Utility most of the time

When I trained the R-GCN model on the 'FB-Toutanova' dataset using the 'gcn_basis.exp' settings, I used nvtop to monitor GPU utilization and found that it was mostly 0% during training and never exceeded 50%. Memory usage, on the other hand, is staggering, with frequent OOMs on large datasets. Training is very slow, taking close to a day for 10,000 iterations. Perhaps there is room for performance improvements.

[nvtop screenshot]

How are adjacency matrices used in the code?

  1. How does basis decomposition reduce computational complexity? Are the basis functions the same throughout the code?
  2. How are adjacency matrices used in the code? How is the problem-specific normalization constant C calculated?

Thanks for your reply!
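On question 1, the parameter saving from basis decomposition can be sketched directly. In the R-GCN paper each relation's weight matrix is a learned combination of B shared basis matrices, W_r = Σ_b a[r,b]·V[b], so parameters drop from R·d·d to B·d·d + R·B; the normalization constant c_{i,r} is typically the inverse neighbor count 1/|N_i^r|. The numbers below are illustrative, not the repo's settings:

```python
import numpy as np

R, B, d = 100, 10, 16            # relations, basis matrices, hidden dim
V = np.random.randn(B, d, d)     # shared basis matrices (same for all relations)
a = np.random.randn(R, B)        # per-relation mixing coefficients

# W_r = sum_b a[r, b] * V[b] for every relation r at once.
W = np.einsum("rb,bij->rij", a, V)

full_params = R * d * d          # unfactorized: one d x d matrix per relation
basis_params = B * d * d + R * B # factorized: shared bases + coefficients

# Normalization constant sketch: c_i = 1 / |N_i| per neighborhood size.
degrees = np.array([3, 1, 2])
c = 1.0 / degrees

print(W.shape, full_params, basis_params)
```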

Using training triplet index as edge ID?

In the train.py script, copied below, the first element of each tuple appended to adj_list is the index of the triple in train_triplets:

adj_list = [[] for _ in entities]
for i,triplet in enumerate(train_triplets):
    adj_list[triplet[0]].append([i, triplet[2]])
    adj_list[triplet[2]].append([i, triplet[0]])

Later on, in the sample_edge_neighborhood method, you have the following code to sample an edge:

chosen_vertex = np.random.choice(np.arange(degrees.shape[0]), p=probabilities)
chosen_adj_list = adj_list[chosen_vertex]
seen[chosen_vertex] = True

...

chosen_edge = np.random.choice(np.arange(chosen_adj_list.shape[0]))
chosen_edge = chosen_adj_list[chosen_edge]
edge_number = chosen_edge[0]

The chosen_adj_list is an array of shape num_neighbors x 2, the second dimension being the tuple appended in the previous code block. But here, chosen_edge[0] gives the index of the training triplet, which is not in any way related to the edge type of the triplet, right?
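To make the question concrete, here is a minimal self-contained reproduction of the construction above (toy numbers, not the repo's data). It shows that chosen_edge[0] is indeed a triple index, and that the relation is only recoverable by looking that index back up in train_triplets:

```python
import numpy as np

# Two toy (subject, relation, object) triples.
train_triplets = np.array([[0, 5, 1], [1, 7, 2]])
entities = range(3)

# Same construction as in train.py: each entry is [triple_index, neighbor].
adj_list = [[] for _ in entities]
for i, triplet in enumerate(train_triplets):
    adj_list[triplet[0]].append([i, triplet[2]])
    adj_list[triplet[2]].append([i, triplet[0]])

# chosen_edge[0] is an index into train_triplets, not a relation type;
# the relation comes from indexing the triple list with it.
edge_number = adj_list[1][0][0]
relation = train_triplets[edge_number][1]
print(edge_number, relation)
```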

Embedding error

When using a dataset with more relations than constants (e.g. Nations), the following error is raised:

InvalidArgumentError (see above for traceback): indices[3] = 16 is not in [0, 16) [[Node: embedding_lookup_1 = Gather[Tindices=DT_INT32, Tparams=DT_FLOAT, _class=["loc:@Variable"], validate_indices=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](Variable/read, strided_slice_1)]]

It's a slightly extended Toy dataset (with 9 new relations added). The error is most likely caused by restricting the embedding's lookup values to the number of entities (which is, in this case, lower than the number of relations). Is this fix appropriate?

file: common/model_builder.py
line: 28 & 29
input_shape = [max(int(encoder_settings['EntityCount']),int(encoder_settings['RelationCount'])), int(encoder_settings['CodeDimension'])]

work on entity alignment

does r-gcn works on entity alignment (existing R-GCN seems only focus on classification and link prediction)?

If yes, May I know do you have any working code on this?

or does this link prediction model works directly on that? Since “Links between two nodes exist” is very similar to “two nodes referring to same real-world object”

recommended hardware

I ran the relation prediction training on my PC without a GPU using the Toy dataset, and it is still running after 12+ hours. Is there a specific GPU hardware configuration you can recommend?
