Comments (13)

mckib2 commented on June 12, 2024

Hi @zaccharieramzi, please do send a reproducible example and I can try to troubleshoot it.

Also, if you haven't already tried it, try mdgrappa to see if you get the same results. cgrappa can be flaky sometimes.

zaccharieramzi commented on June 12, 2024

Hi @mckib2,

I just sent you a Google Colaboratory notebook with a reproducible example (the data I used comes from fastMRI).
I did indeed try mdgrappa instead of cgrappa, and I don't get the missing-line issue with it.

However, I find myself with a problem I also noticed when implementing my own GRAPPA: the lines filled in by GRAPPA appear to lack energy. I didn't put it in the colab (I can if you want), but this basically results in folding artifacts.

I don't think this lack of energy is linked to the regularisation parameter, since I did a grid search over it to find the best value (in terms of SSIM/PSNR for the reconstructed image).

If you want me to open another issue for this I can, since it's a different problem.

mckib2 commented on June 12, 2024

Hi @zaccharieramzi, thanks for sending that. It's on my radar, but the soonest I'll probably be able to look at this is this weekend.

Do you have any suggestions/references on how to improve SNR?

zaccharieramzi commented on June 12, 2024

Hi @mckib2,

Unfortunately not really. I have been trying to investigate the problem with my own implementation (customized for the problem I am working with) and it's really difficult to tell where this "lack of energy" comes from.

mckib2 commented on June 12, 2024

@zaccharieramzi I was able to get your example up and running and reproduce the attenuated interpolated voxels.

I believe there is a bug in the GRAPPA implementation: the interpolated voxels should be scaled by the effective undersampling rate. I'm working on a fix for this now. This should make the results of mdgrappa better, but cgrappa still has some issues and I would avoid using it for now.
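A minimal sketch of the kind of scaling being described, with illustrative names only (kspace_recon is the GRAPPA-filled k-space, mask is a boolean array of the same shape that is True where a voxel was actually measured); whether this is really the right correction is exactly what gets worked out later in the thread:

```python
import numpy as np

def scale_interpolated(kspace_recon, mask):
    """Scale GRAPPA-interpolated voxels by the effective undersampling rate.

    kspace_recon : complex k-space after GRAPPA filling
    mask         : boolean array (same shape), True where a voxel was measured
    """
    R_eff = mask.size / np.count_nonzero(mask)  # effective undersampling rate
    out = kspace_recon.copy()
    out[~mask] *= R_eff                         # measured voxels are left untouched
    return out
```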

zaccharieramzi commented on June 12, 2024

@mckib2 Thank you for taking the time to reproduce the example.

I am wondering where you read about the scaling. Do you have a pointer to that?
For example, in this excellent tutorial I have been following, they don't talk about scaling by the undersampling rate.

zaccharieramzi commented on June 12, 2024

So I experimented a bit today with my implementation, which shares this attenuation problem (some exploration showed the same thing with your implementation).

The first thing is that the scaling you were talking about didn't work that well.

The second thing I noticed is that it's probably due to over-regularisation of the LS solution for the kernel. This over-regularisation is the result of a grid search I did over the regularisation parameter, looking for the best PSNR/SSIM (added in the colab for pygrappa).
When I use lamda=0, the attenuation disappears, but the end result (with RSS) is very noisy.
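For context, a minimal sketch of the regularised least-squares fit being discussed, with illustrative names (S holds the calibration source points, T the target points; this is not pygrappa's exact code):

```python
import numpy as np

def fit_grappa_kernel(S, T, lamda=0.01):
    """Tikhonov-regularised LS fit of GRAPPA weights: W = (S^H S + reg*I)^-1 S^H T.

    S     : (n_samples, n_sources) complex source matrix built from the ACS region
    T     : (n_samples, n_targets) complex target matrix
    lamda : regularisation strength; lamda=0 gives the plain least-squares fit
    """
    ShS = S.conj().T @ S
    # Scale the regulariser so lamda is relative to the magnitude of S^H S
    reg = lamda * np.linalg.norm(ShS) / ShS.shape[0]
    return np.linalg.solve(ShS + reg * np.eye(ShS.shape[0]), S.conj().T @ T)
```

The larger lamda is, the more the weights are shrunk towards zero, which is one way the interpolated lines can end up attenuated; lamda=0 removes that bias but lets noise through, matching the behaviour described above.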

mckib2 commented on June 12, 2024

@zaccharieramzi I think you're right: scaling based on the undersampling factor is not the correct thing to do. It just so happened that it fixed that particular example with an extremely high regularization parameter value, but it started tripping all my other unit tests. It's good to know that over-regularization can be a problem.

> I am wondering where you read about the scaling. Do you have a pointer to that?

I found an additional scaling factor that gadgetron uses here, but upon further inspection this scale factor appears to always be set to one. When I saw it, I checked the average ratio of interpolated voxels to measured voxels on the example you provided and found it to be about the undersampling factor, so I figured I had just dropped a factor somewhere, but that does not appear to be the case, as mentioned above. Upon further review, the current mdgrappa implementation seems to be correct.
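That ratio check could be computed with something like the following rough diagnostic (hypothetical helper; kspace_recon is the GRAPPA-filled k-space and mask is True where a voxel was measured):

```python
import numpy as np

def measured_to_interpolated_ratio(kspace_recon, mask):
    """Rough diagnostic: mean magnitude of measured vs. GRAPPA-filled voxels.

    A value near 1 means the filled-in voxels carry roughly the right energy;
    a value near the undersampling factor suggests they are attenuated.
    """
    measured = np.abs(kspace_recon[mask]).mean()
    interpolated = np.abs(kspace_recon[~mask]).mean()
    return measured / interpolated
```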

> For example, in this excellent tutorial I have been following, they don't talk about scaling by the undersampling rate.

I used the FMRIB GRAPPA demo/guide to validate my Python implementation, so hopefully it matches fairly well.

> When I am using a lamda=0, then the attenuation disappears, but the end result (with RSS) is very noisy.

I am still looking into why this may be; I am seeing the same behavior with reasonable regularization values. When you run your data with the FMRIB MATLAB scripts, does it give the same noisy result?

zaccharieramzi commented on June 12, 2024

> I am seeing the same behavior with reasonable regularization values

Do you mean the noisy results or the attenuation?

> When you run your data with the FMRIB MATLAB scripts does it give the same noisy result?

I haven't checked; for some reason I am always reluctant to launch MATLAB, but I might have to. Do you know whether it can run in Octave?
However, I am going to look today at the Gadgetron results for a prospectively undersampled scan to see whether this attenuation effect is present there.

zaccharieramzi commented on June 12, 2024

So what I ended up doing to check the implementations was to run a scan on a phantom and get the GRAPPA results from the Siemens scanner.
My implementation and yours reconstruct the raw data in the same way as the scanner, so the problem is not with the implementations but with GRAPPA itself not being able to handle this particular coil combination/count at this acceleration factor.

I am going to close this since all my questions have been answered. Thanks for your help.

mckib2 commented on June 12, 2024

I'm glad to hear the implementations were not at fault! I was beginning to suspect it was a g-factor issue and it sounds like that's what it was.
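For reference, a tiny sketch of the standard parallel-imaging relation behind that suspicion: on top of the unavoidable sqrt(R) penalty from acquiring fewer samples, the reconstructed SNR is further divided by the coil-geometry-dependent g-factor, which can blow up for unfavourable coil/acceleration combinations.

```python
import numpy as np

def accelerated_snr(snr_full, g, R):
    """Parallel-imaging SNR relation: SNR_accel = SNR_full / (g * sqrt(R))."""
    return snr_full / (g * np.sqrt(R))
```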

If you've made any improvements over pygrappa in your implementation, I'd be interested in knowing what they are and getting them integrated.

zaccharieramzi commented on June 12, 2024

I could add you as a collaborator so you can take a look. I didn't make any major improvements, but I went a different way.

For example, I wanted to be able to separate the kernel estimation from the kernel application, in order to visualize/inspect the kernels. I also made sure to have a clearly defined extraction function, so that this part can be reused for other non-linear approaches (RAKI, for example).

However, my approach is tailored to the problem I was dealing with (a certain kind of undersampling scheme), and in its current state it is not applicable to all situations. Moreover, it only allows for 2 neighbouring sampled lines in k-space.
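A minimal sketch of what such a separate extraction step could look like for the 2-neighbouring-lines case (hypothetical names, regular R = 2 undersampling assumed; this is only an illustration, not the implementation being discussed):

```python
import numpy as np

def extract_sources_targets(calib, kx=5, coil_axis=-1):
    """Build (sources, targets) training pairs from a fully sampled calibration region.

    For each target point, the sources are the two neighbouring lines (one above,
    one below, as in a regular R=2 scheme), over a kx-wide window and all coils.
    """
    calib = np.moveaxis(calib, coil_axis, -1)
    ny, nx, nc = calib.shape
    pad = kx // 2
    sources, targets = [], []
    for y in range(1, ny - 1):
        for x in range(pad, nx - pad):
            src = calib[[y - 1, y + 1], x - pad:x + pad + 1, :]  # the 2 sampled lines
            sources.append(src.ravel())
            targets.append(calib[y, x, :])
    return np.array(sources), np.array(targets)
```

The resulting matrices can then be fed to a least-squares kernel fit like the one sketched earlier in the thread, which is what makes it easy to swap the linear fit for a non-linear approach such as RAKI.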

mckib2 commented on June 12, 2024

By extraction function do you mean extracting sources and targets for the training? The pygrappa implementation vectorizes this as much as possible for efficiency, so I don't know if it makes sense to provide a separate function for individual source/target extraction in pygrappa.

If you're looking to get at the kernels, you can also retrieve them from mdgrappa:

recon, weights = mdgrappa(..., ret_weights=True)

weights is a dict that maps unique sampling patterns to kernels. You can also pass these back into the function to reuse or avoid recomputing them.
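A hedged usage sketch (kspace, kspace2 and calib are placeholder arrays, and the keyword for passing the weights back in is assumed; check the mdgrappa docstring for the exact name):

```python
from pygrappa import mdgrappa

# First reconstruction: fit the kernels and keep them around
recon, weights = mdgrappa(kspace, calib, ret_weights=True)

# weights maps each unique sampling pattern to its fitted kernel,
# so the kernels can be inspected here before being reused.

# Later reconstruction with the same sampling pattern: reuse the fitted kernels
# (keyword name assumed)
recon2 = mdgrappa(kspace2, calib, weights=weights)
```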
