Comments (4)
I believe you are confused about what a graph embedding is. The objective isn't to assign one vector to the whole graph, but rather to learn a vector for each vertex. PBG does not provide any way of aggregating the embeddings of different nodes to obtain a single embedding for the whole graph. This is not a goal.
You may have been led into confusion by graph neural networks, which are quite a different thing. To highlight the difference:
- PBG (and models like it) can be seen as an algorithm that takes as input a graph structure (just the topology, vertices and edges, with no features or labels attached to them) and trains, in an unsupervised way, a very large model in which the embeddings of the vertices are the parameters and the output is a score for each pair of vertices.
- Graph neural networks (GNNs) are algorithms that take as input a set of data points (each with some features and labels) which happen to be connected by a graph structure, and they train, in a supervised way, a rather small model (usually convolutional) whose output is a prediction of the labels of the vertices. The embedding of a node, in this case, is usually the set of activation values of an intermediate layer of the network when run on that node. (Of course I had to simplify quite a bit here.)
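To make the first bullet concrete, here is a toy sketch (not PBG's actual code) of the "embeddings are the parameters" idea: a lookup table of vertex vectors is the entire model, the score for a pair of vertices is a dot product, and training pushes scores of observed edges up and scores of random negative pairs down. All sizes and the loss are illustrative choices.

```python
import numpy as np

# Toy shallow graph embedding: the per-vertex vectors ARE the model's
# only parameters; the model's "output" is a score between vertex pairs.
rng = np.random.default_rng(0)

num_nodes, dim = 6, 8
emb = rng.normal(scale=0.1, size=(num_nodes, dim))  # the parameters

# Input is topology only: vertices and edges, no features or labels.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]

def score(u, v):
    return emb[u] @ emb[v]

lr = 0.1
for _ in range(200):
    for u, v in edges:
        neg = int(rng.integers(num_nodes))  # random negative sample
        # Logistic loss: push score(u, v) up, score(u, neg) down.
        for tgt, label in ((v, 1.0), (neg, 0.0)):
            p = 1.0 / (1.0 + np.exp(-score(u, tgt)))
            gu = (p - label) * emb[tgt]
            gt = (p - label) * emb[u]
            emb[u] -= lr * gu
            emb[tgt] -= lr * gt

# After training, linked vertices tend to score higher than random pairs.
pos = float(np.mean([score(u, v) for u, v in edges]))
```

Note there is no label to predict: the only training signal is which edges exist, which is what makes this unsupervised.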
If you need more information you should read our documentation (this page and this page) and perhaps take a look at @adamlerer's talk at SysML, which you can find here.
from pytorch-biggraph.
@lerks is correct: PBG takes a graph as input and outputs embeddings of its nodes; it does not construct an embedding of the graph. I will point out, though, that the terminology is a little overloaded, because there are other methods that do construct embeddings of whole graphs (often for things like embeddings of protein molecules). They often work by doing our kind of graph embedding and then aggregating the node embeddings in some way, but there are other approaches.
Anyway, I agree that PBG is not the best approach for this problem. AFAIK the state of the art for sentence embeddings would be BERT (one example I found by Google search is https://github.com/lonePatient/bert-sentence-similarity-pytorch ).
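For context, BERT-based sentence similarity pipelines typically pool the encoder's per-token vectors into one sentence vector and compare sentences by cosine similarity. The sketch below shows only that pooling-and-comparison step; the random arrays stand in for real BERT token outputs, which would require downloading a model.

```python
import numpy as np

# Stand-in for BERT token outputs: (num_tokens, hidden_dim) arrays.
rng = np.random.default_rng(0)

def sentence_embedding(token_vecs, mask):
    # Mean-pool token vectors, ignoring padding positions (mask == 0),
    # a common way to turn token outputs into one sentence vector.
    mask = np.asarray(mask, dtype=float)[:, None]
    return (token_vecs * mask).sum(axis=0) / mask.sum()

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

tokens_a = rng.normal(size=(5, 16))  # 5 tokens, toy 16-dim hidden size
tokens_b = rng.normal(size=(7, 16))
emb_a = sentence_embedding(tokens_a, [1, 1, 1, 1, 0])  # last token padded
emb_b = sentence_embedding(tokens_b, [1] * 7)
sim = cosine(emb_a, emb_b)  # similarity score in [-1, 1]
```

With a real model, `tokens_a` and `tokens_b` would come from the encoder's last hidden state for each tokenized sentence.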
Hi @Arjunsankarlal, I think what you are saying can make sense if you are using a shallow encoding approach, such as using SkipGram to encode the structure of a graph (DeepWalk).
With this approach, nodes can be seen as words and sub-graphs (sets of nodes) as sentences (sets of words).
You can even train a doc2vec model to encode sub-graphs as sentence embeddings.
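The nodes-as-words analogy can be sketched as follows: DeepWalk-style random walks over a graph produce "sentences" of node IDs, which can then be fed to a word2vec/doc2vec-style trainer. The graph and walk parameters below are illustrative.

```python
import random

# DeepWalk-style corpus generation: each random walk becomes a "sentence"
# whose "words" are node IDs.
random.seed(0)

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3]}

def random_walk(start, length):
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(adj[walk[-1]]))
    return walk

# Two walks per starting node, each of length 6.
corpus = [[str(n) for n in random_walk(s, 6)] for s in adj for _ in range(2)]
```

A SkipGram model trained on `corpus` would yield node embeddings; tagging each walk with its sub-graph ID and training doc2vec instead would yield sub-graph embeddings, as suggested above.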
I might add to what @lerks mentioned: GNNs can be unsupervised too, and the features could be just the one-hot encoding of the nodes (GraphSAGE, for example).
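To illustrate that last point, here is a minimal GraphSAGE-style mean-aggregator layer using one-hot node features, so no node labels or hand-crafted features are needed. The weights are random and untrained; this only shows the aggregation structure, not GraphSAGE's training procedure.

```python
import numpy as np

# One GraphSAGE-style layer with a mean aggregator: each node combines
# its own feature vector with the mean of its neighbours' vectors.
rng = np.random.default_rng(0)

adj = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
n, out_dim = len(adj), 4
feats = np.eye(n)  # one-hot features: structure is the only signal

W_self = rng.normal(scale=0.1, size=(n, out_dim))
W_neigh = rng.normal(scale=0.1, size=(n, out_dim))

def sage_layer(h):
    out = np.zeros((n, out_dim))
    for v, nbrs in adj.items():
        neigh_mean = h[nbrs].mean(axis=0)
        out[v] = np.maximum(0.0, h[v] @ W_self + neigh_mean @ W_neigh)  # ReLU
    return out

h1 = sage_layer(feats)  # per-node embeddings derived from structure alone
```

Training such a layer unsupervised (e.g. with a random-walk or link-prediction loss) is what lets a GNN produce embeddings without any labels.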
I'm closing this as the OP has acknowledged the recent posts but there has been no further activity for almost two weeks. Feel free to reopen.