musdl's People

Contributors

andytang15, nzl-thu, yuxumin

musdl's Issues

The normalization for AQA-7 dataset

Hi! I noticed that the score ranges differ across actions in the AQA-7 dataset.
[image]
So how do you convert the hard target (Regression) into the soft target (USDL)? Is output_dim the same (101) for all kinds of actions?
I couldn't find details about this in the paper or the code. Could you give me some suggestions? Thanks a lot!
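For what it's worth, the usual recipe for turning a hard score into a USDL-style soft target is to discretize the score range into output_dim bins and place a normalized Gaussian over them; actions with different score ranges can first be rescaled to a common range. A minimal sketch — the soft_label helper, bin layout, and rescaling convention are my assumptions, not the authors' code:

```python
import math

def soft_label(score, output_dim=101, max_score=100.0, sigma=1.0):
    """Spread a hard score over output_dim bins with a Gaussian centered
    on the score. Bin layout and normalization are assumptions made for
    illustration, not the repository's exact implementation."""
    bins = [i * max_score / (output_dim - 1) for i in range(output_dim)]
    probs = [math.exp(-((b - score) ** 2) / (2 * sigma ** 2)) for b in bins]
    total = sum(probs)
    return [p / total for p in probs]

dist = soft_label(75.0)
# the distribution sums to 1 and peaks at the bin closest to the score
```

With this layout, per-action range differences reduce to choosing max_score (or rescaling scores before binning), while output_dim can stay fixed.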

Unable to get the proper regression results

Hello, thank you for your wonderful work!
I can get good results under the USDL and MUSDL settings. However, the results are terrible when I switch the labels back to the original hard targets (Our-Regression in Table 3): all outputs are around 70. I tried MSE loss and SmoothL1Loss, and neither worked.

My modifications to the code are as follows:
1. I changed the output_dim for "USDL" in opts.py from 101 to 1:
output_dim = {'USDL': 1, 'MUSDL': 21}
2. I removed the softmax on layer3 in evaluator.py:
output = self.layer3(x)
3. I treated the evaluator output as the final predicted score and computed the loss on it directly:

preds = evaluator(clip_feats.mean(1)).float()  # preds = compute_score(args.type, probs, data)
pred_scores.extend([i.item() for i in preds])
target = torch.tensor(data['final_score']).float().cuda()
if split == 'train':
    loss = criterion(preds, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

If possible, could you release your code for "Our-Regression" in the paper?

Problem with JIGSAW

When trying to run with the JIGSAW data, the code raises a "No CUDA devices available" error on this line in JIGSAW's main.py:

i3ds = nn.ModuleList([InceptionI3d() for _ in range(4)]).cuda()

CUDA is enabled, however, and the runtime is using an Nvidia Tesla GPU on Colab. The same environment was able to run the other datasets.

about the pick of standard deviation in Gaussian distribution

Hi Zanlin Ni,
Thanks for your wonderful work!
I was wondering whether the choice of standard deviation affects the loss (here σ is a hyper-parameter which serves as the level of uncertainty for assessing an action). If I pick a larger σ, the training loss is close to zero, while with σ = 1 the loss is about 0.01. Will the magnitude of the loss value have an impact on training?
Looking forward to your reply!
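To see why the loss magnitude shrinks as σ grows, one can compare a KL-style loss between a fixed prediction and soft targets built with different σ values. A minimal pure-Python sketch — the discretized Gaussians, the chosen centers, and the KL form are illustrative assumptions, not the repository's loss code:

```python
import math

def gaussian_dist(center, sigma, n=101):
    """Discretized Gaussian over n score bins, normalized to sum to 1.
    Illustrative stand-in for a USDL-style soft target."""
    p = [math.exp(-((i - center) ** 2) / (2 * sigma ** 2)) for i in range(n)]
    s = sum(p)
    return [x / s for x in p]

def kl(p, q, eps=1e-12):
    """KL divergence KL(p || q), with a small epsilon for numerical stability."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# hypothetical prediction, slightly off the true score of 70
pred = gaussian_dist(68, sigma=3.0)
loss_sharp = kl(gaussian_dist(70, sigma=1.0), pred)  # sharp target (small sigma)
loss_soft = kl(gaussian_dist(70, sigma=5.0), pred)   # soft target (large sigma)
# the sharper target yields a larger loss for the same prediction
```

So a smaller loss under a larger σ does not by itself mean better training; it mainly reflects that the target distribution is flatter and easier to match.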

Spearman Correlation is too low

Hi, I ran the MTL-AQA training and got a Spearman correlation of only 0.68665. I kept all the config settings as-is, except num_workers = 1, training batch size 1, and test batch size 2.
I am using a 2048 Nvidia GPU with 10 GB memory
CUDA 11.4
PyTorch 1.10.2
Could you tell me why the Spearman correlation decreases so drastically? Can you please share your pretrained weights?
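As a sanity check on a number like this, the test-set Spearman correlation can be recomputed directly from the saved predictions and ground-truth scores. A minimal sketch of the metric itself (tie handling omitted; a library implementation such as scipy.stats.spearmanr also corrects for ties):

```python
def spearman(x, y):
    """Spearman rank correlation via the classic formula
    1 - 6 * sum(d^2) / (n * (n^2 - 1)). Assumes no tied values."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# perfectly ordered predictions give 1.0; a fully reversed order gives -1.0
```

Since the metric depends only on rank order, a model whose outputs collapse toward a few values (or very small batch sizes changing batch statistics) can drag it down sharply.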

Unable to get stable results for multiple iterations (JIGSAW)

Hi there,

I am using your repository to reproduce the results on the JIGSAW dataset, with the default parameter values defined in config.py except for the batch size: I used a batch size of 1 on Colab. However, I am unable to get stable results across multiple runs.

For example, I ran USDL on Suturing and got a test correlation of 0.3; when I ran it again I got a NaN test correlation, and on the third run I got 0.19. This behavior occurs for all 3 surgical videos.

What could be the reason for this instability? What hyperparameter values did you use?
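One common source of this kind of run-to-run variance is unseeded randomness (data shuffling, weight initialization, dropout), which is especially visible on small datasets. A minimal seeding sketch, using only the standard library for illustration — the torch calls mentioned in the comment are the usual PyTorch additions, not something taken from this repository:

```python
import random

def seed_everything(seed=0):
    """Fix Python's RNG seed. In a PyTorch run you would also call
    torch.manual_seed(seed) and torch.cuda.manual_seed_all(seed), and
    optionally set torch.backends.cudnn.deterministic = True; this is a
    generic reproducibility checklist, not this repo's code."""
    random.seed(seed)

seed_everything(0)
a = [random.random() for _ in range(3)]
seed_everything(0)
b = [random.random() for _ in range(3)]
# the two sequences are identical once the seed is fixed
```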
