Comments (6)
Hello, I carefully read the Point Transformer paper, and the vector attention is conducted on a local patch of the point cloud (as pointed out in the paper). Applying vector attention within a local patch reduces the memory cost, since the number of points in a local patch is much smaller than in the whole point cloud.
If the number of down-sampled points is N, the dimension of the learned feature is D, and B is the batch size, then the feature shape is BND, and the shape of the grouped feature is BNKD, where K is the number of points in each local patch (the K of KNN).
Thus the attention may be conducted between BND and BNKD: the shapes of Query, Key, and Value are BND, BNKD, and BNKD, respectively. The attention weight after subtraction and mapping is BNKD (BND - BNKD, like the relative coordinates used for point-cloud grouping in PointNet++). Then the Hadamard product is taken between the attention weight (BNKD) and the Value (BNKD), followed by a reduce_sum over the K dimension (axis=2). Therefore, the shape of the output feature is BND (after squeezing dim 2).
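The shape bookkeeping above can be sketched with NumPy broadcasting. This is only my understanding, not the authors' code; all sizes are hypothetical, and a simple exponential stands in for the relation MLP of the paper:

```python
import numpy as np

B, N, K, D = 2, 128, 16, 64  # hypothetical batch, points, neighbors, channels

q = np.random.randn(B, N, D)      # query: one feature vector per point
k = np.random.randn(B, N, K, D)   # keys of the K nearest neighbors
v = np.random.randn(B, N, K, D)   # values of the K nearest neighbors

# subtraction relation via broadcasting: (B, N, 1, D) - (B, N, K, D) -> (B, N, K, D)
rel = q[:, :, None, :] - k

# per-channel weights (a stand-in for the mapping MLP in the paper),
# normalized over the K neighbors (axis=2) so each channel's weights sum to 1
w = np.exp(rel) / np.exp(rel).sum(axis=2, keepdims=True)

# Hadamard product with the values, then reduce-sum over the K dimension
out = (w * v).sum(axis=2)

print(out.shape)  # (2, 128, 64), i.e. BND as described above
```

The key point is that the weight tensor keeps the channel dimension D, so every channel of the Value gets its own scalar weight, unlike scalar attention where one weight is shared across all channels.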
In this way, the vector weight can refine every channel of the Value. But it works like cross-attention rather than self-attention. All these steps are based on my understanding of the paper.
The real workflow of the Point Transformer may differ from my understanding, and the truth could be uncovered once the authors open-source their code.
By the way, some interesting things may be found in the original vector attention paper (whose code is open-sourced), which is written by the same authors as Point Transformer.
Have a good day!!
from point-transformer-pytorch.
@Liu-Feng Hi Feng! Admittedly I wrote this repository and forgot to follow up and double-check whether it is correct.
Do you mean it should be like https://github.com/lucidrains/point-transformer-pytorch/pull/3/files ?
@lucidrains Thanks!! If the last dimension of attn_mlp is dim rather than 1, I think it could be used to modulate individual feature channels. But I cannot confirm which axis the softmax should be conducted on. Thanks for your reply!
@Liu-Feng I'm pretty sure the softmax will be across the similarities of queries against all the keys (dimension j).
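A minimal NumPy sketch of that normalization, assuming the per-channel weights produced by attn_mlp are softmaxed over the neighbor/key index (axis=2 in the BNKD layout from the discussion above; all sizes are made up):

```python
import numpy as np

B, N, K, D = 2, 8, 4, 16  # hypothetical batch, points, neighbors, channels

# per-channel attention logits from the relation MLP: (B, N, K, D)
logits = np.random.randn(B, N, K, D)

# softmax over the neighbor/key index j (axis=2), independently per channel:
# each (query, channel) pair gets K weights that sum to 1
logits = logits - logits.max(axis=2, keepdims=True)  # for numerical stability
attn = np.exp(logits)
attn = attn / attn.sum(axis=2, keepdims=True)

assert np.allclose(attn.sum(axis=2), 1.0)
```

Softmaxing over the channel axis instead (axis=3) would normalize each neighbor's feature vector rather than weighting the neighbors against each other, which would no longer be attention over the keys.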
I've merged the PR, so do let me know whether or not it works in your training.
Thank you!
Thanks for your reply!!! I have not trained the model yet. I am trying to construct the Point Transformer in TensorFlow.
@Liu-Feng hello! How's your TensorFlow port going? Are you more certain now that your hunch here was correct?
Related Issues (15)
- Would you provide full model, training pipeline, and pre-trained weights?
- Can you provide the full training code and pretrained models?
- Issues with my wrapper code HOT 1
- Can you provide a multi-head version of point transformer layer? HOT 3
- point clouds with different number of points
- Scalar attention or vector attention in the multi-head variant HOT 2
- Did You Falsify Your Experimental Results??? HOT 1
- part segmentation
- Transition Up Module HOT 3
- Cost too much memory HOT 9
- How should I understand the paper's claim that self-attention is invariant to cardinality?
- Is the complete code available? HOT 2
- Invariant to cardinality?
- The layer structure and mask HOT 1