degardinbruno / kinetic-gan
Code for the paper "Generative Adversarial Graph Convolutional Networks for Human Action Synthesis", WACV 2022
How did you preprocess the Human3.6M dataset? I would like to replicate the .npy and .pkl files that you provide. Do you have code for this? Thanks in advance!
Hey @ojoaobrito,
Thanks for your impressive work, it's really interesting. About the training of the proposed model, I have two questions:
Best and many thanks,
Hi @DegardinBruno,
I am not able to download the datasets. Are the servers down?
Thank you for your amazing work, I've managed to run some of the code using your documentation and examples too. I was hoping to extract the skeleton from Human3.6M myself, and maybe even from another popular dataset like Charades. I was wondering if you could suggest how we can obtain the 2 files (the .npy file and the accompanying label.pkl file). What will the input files required to generate these 2 files be? I'm looking to create these 2 files on both the original Human3.6M videos and the Charades dataset, to test with your model!
Thank you :)
Hi, thanks for your great work. But I found that I cannot download the data from your sharing link. Could you please help to fix it? Thanks!
Hi, thanks again for your work~
I now have some questions about the operations in the generator, where you concatenate the input noise z and the embedded action label to form the actual input x (see generator.py, lines 80-81). The discriminator tries to distinguish fake skeletons from real ones, so even if we did not concatenate the embedded action label, the generator would still try to produce actions that look real so the discriminator cannot tell them apart. The action label is a signal that lets us humans know which action we want, but the network could learn it implicitly. I am really not sure about this, so I am wondering what the purpose of concatenating the input noise z and the embedded action label is.
Looking forward to your reply! Many Thanks.
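For reference, the general idea of conditional input construction can be sketched as below. This is an illustrative snippet with hypothetical names (`label_emb`, dimensions), not the repository's actual generator.py code; the point is that conditioning on the label is what lets you *choose* the action class at inference time, whereas an unconditioned generator would produce realistic but uncontrollable actions.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions, for illustration only.
latent_dim, n_classes = 512, 60
label_emb = nn.Embedding(n_classes, latent_dim)  # learned class embedding

z = torch.randn(8, latent_dim)                   # noise for a batch of 8
labels = torch.randint(0, n_classes, (8,))       # desired action classes

# Concatenate the embedded label with the noise to form the
# conditional generator input: the label selects WHICH action,
# the noise provides the variation WITHIN that action.
x = torch.cat((label_emb(labels), z), dim=-1)
print(x.shape)  # torch.Size([8, 1024])
```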
Hello,
I am an undergraduate studying synthetic action pose generation. While reading your paper I came across some terminology I couldn't understand. In Section 3.1 (Graph Convolutional Network Preliminaries) you use symbols such as:
From what I have understood, "L" denotes how many nodes and edges are in the graph, but I am not sure whether that is correct.
As for "P", I can't understand it at all. Could you please give me some explanation?
The download links for the datasets are no longer accessible.
Good job sir.
I have a question about generating sequences of poses for action recognition.
Have you tested augmenting the real data with generated data for action recognition? If so, does it help?
My second question is about stochastic variation. Does stochastic variation mean that if you feed noise to the network it will generate different actions?
And how can we generate stochastic actions using Kinetic-GAN? I could not find a way to do it.
Thank you
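One common way GANs trade diversity against fidelity is the truncation trick (the `trunc0.95` suffix on the generated files elsewhere in these issues suggests something similar is used here, though this snippet is only a generic sketch, not the repository's implementation): resample latent entries that fall outside a threshold, so smaller thresholds give safer, more average samples and larger ones give more stochastic variation.

```python
import torch

def truncated_noise(batch, dim, trunc=0.95):
    """Generic sketch of the truncation trick (not the repo's code).

    Draw standard-normal noise and re-draw any entry whose magnitude
    exceeds `trunc`; lower `trunc` -> less stochastic variation.
    """
    z = torch.randn(batch, dim)
    mask = z.abs() > trunc
    while mask.any():                       # resample out-of-range entries
        z[mask] = torch.randn(int(mask.sum()))
        mask = z.abs() > trunc
    return z

z = truncated_noise(4, 512, trunc=0.95)
print(z.shape)  # torch.Size([4, 512]); all entries within [-0.95, 0.95]
```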
Hi, first thank you for this great work!
I'm trying to train Kinetic-GAN on the NTU dataset with all configurations set to their defaults, and I encountered negative losses during training. Is it reasonable to get negative losses? Looking forward to your reply :)
P.S. I'm using batch_size = 128 to accelerate training.
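For context, Wasserstein-style GAN losses are differences of unbounded critic scores rather than log-probabilities, so negative values are normal. A minimal illustration with made-up scores (generic WGAN objectives, not this repository's exact loss code; sign conventions also vary between codebases):

```python
import torch

# Critic outputs are unbounded real-valued scores, not probabilities.
real_scores = torch.tensor([1.5, 2.0, 0.8])
fake_scores = torch.tensor([2.5, 3.0, 2.2])

# Standard WGAN objectives: the critic wants reals to score higher
# than fakes; the generator wants fakes to score as high as possible.
d_loss = -(real_scores.mean() - fake_scores.mean())
g_loss = -fake_scores.mean()

print(d_loss.item(), g_loss.item())  # g_loss here is negative, which is fine
```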
Hello,
I tried to run generate.py today and I got the following error message:
Traceback (most recent call last):
File "generate.py", line 24, in
out = general.check_runs('kinetic-gan', id=-1)
File "/root/Kinetic-GAN/utils/general.py", line 17, in check_runs
out = os.path.join(exps_path,'exp'+str(len(exps)+1)) if not id else os.path.join(exps_path, exps[id])
IndexError: list index out of range
But in my directory there is no folder named 'exp0' or anything else inside the runs/kinetic-gan folder.
Could you tell me how to solve this error?
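For what it's worth, `check_runs('kinetic-gan', id=-1)` indexes the *last* existing `exp*` folder, so if `runs/kinetic-gan` is empty, `exps[-1]` raises the IndexError above. A guarded variant could look like the sketch below (illustrative only, not the repository's code; `check_runs_safe` and the `root` parameter are hypothetical names):

```python
import os

def check_runs_safe(method, id=None, root='runs'):
    """Hypothetical guarded version of check_runs.

    id=None -> create and return a fresh expN folder.
    id=-1   -> return the most recent exp folder, or fail with a clear
               message instead of an IndexError when none exist yet.
    """
    exps_path = os.path.join(root, method)
    os.makedirs(exps_path, exist_ok=True)
    exps = sorted(e for e in os.listdir(exps_path) if e.startswith('exp'))
    if id is None:
        out = os.path.join(exps_path, 'exp' + str(len(exps) + 1))
        os.makedirs(out)
        return out
    if not exps:
        raise FileNotFoundError(
            f'No experiment folders in {exps_path}; train a model first.')
    return os.path.join(exps_path, exps[id])
```

In other words, the error most likely just means no experiment has been run yet, so there is no folder for `id=-1` to point at.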
Thanks for the great work @DegardinBruno.
Is there a way to generate 2D pose sequence actions? Does it work as well as the 3D ones?
P.S.: I mean training on 2D features released by OpenPose or other pose estimation networks.
I have been coding a WGAN for 2D pose using an FCN. After seeing your work I was amazed.
Thank you!
Hope to get an answer.
The links for downloading the dataset and weights are no longer accessible.
I cannot download the dataset from the download link.
As you mention, MMD is not a stable metric. How about padding the third dimension with zeros so that FID can be computed? Have you tried that, or do you have another metric in mind for this 2-coordinate dataset? Thanks.
Stev
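The zero-padding idea can be sketched with NumPy as below. The tensor layout here (samples, coordinates, frames, joints) is an assumption for illustration; real skeleton arrays follow the dataset's own convention.

```python
import numpy as np

# Illustrative 2D skeleton batch: (samples, coords=2, frames, joints).
poses_2d = np.random.randn(4, 2, 64, 25)

# Append an all-zero z-coordinate so 3D-based metrics or models
# accept the data: shape becomes (samples, 3, frames, joints).
zeros = np.zeros((4, 1, 64, 25))
poses_3d = np.concatenate([poses_2d, zeros], axis=1)

print(poses_3d.shape)  # (4, 3, 64, 25)
```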
Hi @DegardinBruno,
How can I change the action I want in the Blender visualization?
Hi Bruno,
I tried to install Blender this way, but when I run blender.py, this comes out:
And when I checked the objects, there were only three:
Do you know how to fix it?
Thanks a lot.
Hi Bruno,
Do you know how to continue training after it stops in the middle (e.g., at 50/200 epochs)?
I tried to resume training myself, but I only found the weight-loading code in generate.py.
Hey @DegardinBruno
Great work. Thanks for sharing your code!
While training on NTU-120 the generator loss is exploding. We only changed the batch size from 32 to 380:
[Epoch 297/1200] [Batch 162/165] [D loss: 0.198222] [G loss: 4918.882324]
[Epoch 297/1200] [Batch 163/165] [D loss: 0.205873] [G loss: 4918.882324]
[Epoch 297/1200] [Batch 164/165] [D loss: 0.223392] [G loss: 4918.882324]
Do you know why this could be happening?
Hi everyone,
The given links to the compiled PyTorch models cannot be downloaded; the server seems to respond with a timeout. Could someone attach a new link for at least one or two models?
Thanks,
Shadeesha
Hi, @DegardinBruno
Thank you for the repo, it is of great help, but one thing I cannot understand is how to run your visualization (visualization/action_ntu.py). I mean, I run it and it seems OK, but it only prints something like the image below (up to "63 frame 3d skeleton...") and no visualization appears.
UPDATE: I'm getting empty frames like:
I'm running the script like:
python visualization/action_ntu.py --path runs/kinetic-gan/exp2/actions/51_400_trunc0.95_gen_data.npy --indexes 51 111 171 --labels runs/kinetic-gan/exp2/actions/51_400_trunc0.95_gen_label.pkl
I'm using Ubuntu with GPU.
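If the Ubuntu machine is headless (e.g., a remote GPU server with no display), interactive matplotlib windows can render nothing at all. A generic workaround, assuming the script plots with matplotlib (an assumption, not a confirmed fix for action_ntu.py), is to force a non-interactive backend and save frames to disk instead:

```python
import matplotlib
matplotlib.use('Agg')            # headless backend: render without a display
import matplotlib.pyplot as plt
import numpy as np

# Illustrative frame: scatter a random 2D projection of 25 joints
# and write it to a file, then inspect the PNGs instead of a window.
joints = np.random.randn(25, 2)
fig = plt.figure()
plt.scatter(joints[:, 0], joints[:, 1])
fig.savefig('frame_000.png')
plt.close(fig)
```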
The download links for datasets are inaccessible
First of all, thanks for the incredible work and the repo you provided. I have a question and am confused about the exp ID path.
I found weird behavior in plot_loss.py and generate.py. When I run
python visualization/plot_loss.py --batches 32 --runs kinetic-gan --exp 10
it generates loss.pdf inside the folder exp11 (which is not correct; it should be exp10). I then tried again, using the exp11 model to generate samples:
python generate.py --model runs/kinetic-gan/exp11/models/generator_20000.pth --n_classes 120 --dataset ntu --gen_qtd 1000
and it also generates the action samples inside folder exp12.
I have checked plot_loss.py, line 18:
out = general.check_runs(opt.runs, id=opt.exp)  # there is no opt.exp + 1
and generate.py, lines 24-26:
out = general.check_runs('kinetic-gan', id=-1)
actions_out = os.path.join(out, 'actions')
if not os.path.exists(actions_out): os.makedirs(actions_out)
and I am not sure about the function check_runs, lines 12-19 in general.py:
def check_runs(method, id=None):
    exps_path = os.path.join('runs', method)
    if not os.path.exists(exps_path): os.makedirs(exps_path)
    exps = [exp for exp in humanSort(os.listdir(exps_path)) if 'exp' in exp]
    out = os.path.join(exps_path,'exp'+str(len(exps)+1)) if not id else os.path.join(exps_path, exps[id])
    if not id: os.makedirs(os.path.join(exps_path,'exp'+str(len(exps)+1)))
    return out
Any clue why this behavior occurs?
Thanks in advance
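The off-by-one looks consistent with Python's zero-based list indexing: the folder names are 1-based (exp1, exp2, ...) while `exps[id]` is 0-based, so `--exp 10` resolves to the *eleventh* folder. A minimal demonstration (illustrative only; whether indexing with `id - 1` is the intended fix is an assumption):

```python
# Folder names are 1-based, list indices are 0-based, so exps[id]
# with id=10 returns the eleventh folder, 'exp11'.
exps = ['exp' + str(i) for i in range(1, 13)]   # exp1 .. exp12

print(exps[10])       # exp11 <- what --exp 10 currently resolves to
print(exps[10 - 1])   # exp10 <- presumably the intended folder
```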