davlars / ad-skull-reconstruction
Repository for reconstruction of simulated skull CT data for AD project
Currently running plot_data on a 2d data set gives an error. Perhaps we should also support 2d?
save_data does not seem to exist in adutils. Instead I find save_image, which seems to do the job. Need to update the documentation accordingly (see below).
Save data
To save data in a format that the clinicians can review (typically NIfTI), use the adutils.save_data utility, with x being your reconstruction:
fileName = '/my/path/myFile'
adutils.save_data(x, fileName, as_nii=True, as_npy=True)
Some examples currently have very large windows; these should be changed to something relevant, like [0.018, 0.022].
Currently, all of the "parameter search" functions need to calculate the FoM themselves. We should have a standard library of FoM available so that people can re-use them.
see title
I think that this commit in odl causes a crash in the pickled geometries corresponding to this package. The pickled geometries do not have a self.__stride = None in the init, so for example power_method_opnorm does not work. (I am not sure what stride does, but the crash occurred in power_method_opnorm for me.)
Basically this makes the load_data_from_nas script fail, at least on Windows. This is because Reference and Users are on different drives.
I get the following error:
File "<ipython-input-1-460109d7ac57>", line 1, in <module>
runfile('E:/Github/ad-skull-reconstruction/FBP_reco_skullCT_2D.py', wdir='E:/Github/ad-skull-reconstruction')
File "C:\Anaconda\envs\tensorflow\lib\site-packages\spyder\utils\site\sitecustomize.py", line 866, in runfile
execfile(filename, namespace)
File "C:\Anaconda\envs\tensorflow\lib\site-packages\spyder\utils\site\sitecustomize.py", line 102, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)
File "E:/Github/ad-skull-reconstruction/FBP_reco_skullCT_2D.py", line 27, in <module>
phantom = reco_space.element(adutils.get_phantom(use_2D=True))
File "E:\Github\ad-skull-reconstruction\adutils.py", line 195, in get_phantom
nii = nib.load(path+phantomName)
File "C:\Anaconda\envs\tensorflow\lib\site-packages\nibabel\loadsave.py", line 40, in load
raise FileNotFoundError("No such file: '%s'" % filename)
FileNotFoundError: No such file: '/lcrnas/Reference/CT/GPUMCI simulations/code/AD_GPUMCI/phantoms/70100644Phantom_labelled_no_bed.nii'
This is because we have a hardcoded path to the NAS in get_phantom.
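A minimal sketch of one possible fix: let an environment variable override the hardcoded NAS path. The variable name AD_SKULL_DATA_PATH and the helper function are assumptions for illustration, not part of the current adutils API:

```python
import os

# Default to the NAS path currently hardcoded in adutils; the
# environment variable AD_SKULL_DATA_PATH (hypothetical name) lets a
# user point to a local copy of the data instead.
DEFAULT_NAS_PATH = '/lcrnas/Reference/CT/GPUMCI simulations/code/AD_GPUMCI/phantoms/'

def get_data_path():
    """Return the data directory, preferring a user-provided override."""
    return os.environ.get('AD_SKULL_DATA_PATH', DEFAULT_NAS_PATH)
```

get_phantom and the other loaders could then build their paths from get_data_path() instead of a literal string.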
There seems to be a bug in how the Tam-Danielsson window is used in adutils. Currently the code in adutils reads (see lines 115-119):
if use_window:
window = odl.tomo.tam_danielson_window(ray_trafo,
smoothing_width=0.05,
n_half_rot=3)
ray_trafo = window * ray_trafo
which will cause ray_trafo to no longer be a RayTransform. Thus the standard FBP in odl does not work. Minimal working example to reproduce the crash:
import numpy as np
import adutils
# Discretization
reco_space = adutils.get_discretization()
# Forward operator (in the form of a broadcast operator)
A = adutils.get_ray_trafo(reco_space, use_window=True)
# Define fbp
fbp = adutils.get_fbp(A)
This can be compared to how the Tam-Danielsson window is applied in the example filtered_backprojection_helical_3d in odl.
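The underlying type problem can be illustrated without odl. The classes below are toy stand-ins (not odl's actual classes): composing an operator with a pointwise weighting produces a new wrapper object, so code that inspects the original type, as an FBP helper does for a ray transform, no longer matches.

```python
class RayTransform:
    """Toy stand-in for odl's RayTransform, acting on plain numbers."""
    def __call__(self, x):
        return 2.0 * x

class ComposedOperator:
    """Toy stand-in for the product window * ray_trafo: a new wrapper type."""
    def __init__(self, weight, op):
        self.weight = weight
        self.op = op
    def __call__(self, x):
        return self.weight * self.op(x)

ray_trafo = RayTransform()
windowed = ComposedOperator(0.5, ray_trafo)

# The composed operator still evaluates fine...
assert windowed(3.0) == 3.0
# ...but it is no longer a RayTransform, which is why helpers that
# inspect the operator (e.g. for its geometry) fail on it.
assert not isinstance(windowed, RayTransform)
```

Applying the window to the data instead of composing it into the operator would sidestep this, as the odl example does.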
edit: @adler-j improved typesetting #
The current initial guess has Gibbs artifacts; we should likely find one that is smoother.
We should add a wavelet regularized example, similar to this one.
The pickled geometries need to be updated to reflect this change.
The line label = np.flipud(label) is redundant and causes the phantom to have the wrong orientation.
As mentioned in #19, we should only copy if the data has been modified. See e.g.
http://stackoverflow.com/questions/2266234/can-we-do-a-smart-copy-in-python
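A minimal copy-on-write sketch of what "only copy if modified" could look like; the wrapper class and its method names are hypothetical, for illustration only:

```python
import copy

class LazyCopy:
    """Share the wrapped object until a mutation is actually requested."""
    def __init__(self, data):
        self._data = data
        self._owned = False  # True once we hold a private copy

    def read(self):
        return self._data

    def write(self, mutate):
        # Copy only on the first modification; reads stay copy-free.
        if not self._owned:
            self._data = copy.deepcopy(self._data)
            self._owned = True
        mutate(self._data)

original = [1, 2, 3]
view = LazyCopy(original)
assert view.read() is original      # no copy on read
view.write(lambda d: d.append(4))
assert original == [1, 2, 3]        # original untouched after write
assert view.read() == [1, 2, 3, 4]
```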
The current example in the readme only works properly on Windows; a short example of Linux usage would probably be popular.
Currently the scripts save to disk with a hard-coded path. We should remove this so that we don't get colliding writes to the NAS.
I have a problem installing the requirements. What is going wrong? Does anyone else have the same issue?
I run the following command:
:~/ad-skull-reconstruction$ pip install -r requirements.txt
I get the error message:
Collecting odl>=0.6.1 (from -r requirements.txt (line 1))
Could not find a version that satisfies the requirement odl>=0.6.1 (from -r requirements.txt (line 1)) (from versions: 0.2.2, 0.2.3, 0.3.0, 0.3.1, 0.4.0, 0.5.0, 0.5.1, 0.5.2, 0.5.3, 0.5.3.post0, 0.5.3.post1, 0.5.3.post2, 0.6.0)
No matching distribution found for odl>=0.6.1 (from -r requirements.txt (line 1))
When I connect to a remote computer via SSH, start an IPython session, and type import adutils (which is properly installed), I get the error message
qt.qpa.screen: QXcbConnection: Could not connect to display
Could not connect to any X display.
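A common workaround, assuming the error comes from matplotlib trying to open a Qt window on import, is to force a non-interactive backend before anything imports matplotlib, e.g. via the MPLBACKEND environment variable:

```python
import os

# Select the non-interactive Agg backend before matplotlib is imported;
# this avoids any attempt to connect to an X display over SSH.
os.environ.setdefault('MPLBACKEND', 'Agg')

# import adutils  # safe now, assuming matplotlib caused the Qt error
```

The same effect can be had by exporting MPLBACKEND=Agg in the shell before starting IPython.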
The following code
import odl
import adutils
# Discretization
reco_space = adutils.get_discretization(use_2D=True)
# Forward operator (in the form of a broadcast operator)
A = adutils.get_ray_trafo(reco_space, use_2D=True)
gives this error
Loading geometry
Traceback (most recent call last):
File "<ipython-input-1-e4354a8993cf>", line 8, in <module>
A = adutils.get_ray_trafo(reco_space, use_2D=True)
File "/home/aringh/git/ad-skull-reconstruction/adutils.py", line 86, in get_ray_trafo
geom = pickle.load(f, encoding='latin1')
ImportError: No module named 'odl.tomo.geometry.fanbeam'
I guess something has changed in odl, but I do not know exactly what and when.
This is related to the discussion in #35, and the workaround there works for this as well.
You get the following error when you try to load the geometries
ValueError: unsupported pickle protocol: 3
To solve this, we need to regenerate these with protocol version 2. I.e. someone with python3 needs to do:
pickle.dump(pickle.load(file), file, 2)
on all the geometry files.
The __pycache__ folder has been added to .gitignore, but when I did this, I forgot to remove the folder itself. We should remove it from the repo.
In several files we have code using saveCont that looks like
niter = [int(i) for i in np.arange(5, 101, 5)]
for iterations in niter:
odl.solvers.conjugate_gradient_normal(A, x, rhs, niter=iterations,
                                      callback=callbackPrintIter)
# Store to disk
I assume this is intended to save iterates no. 5, 10, 15, 20, etc., but since we restart the algorithm every 5th iterate, the so-called "10th" iteration is not the 10th iteration we would get by letting conjugate_gradient_normal run its course. This is likely even more evident for the more advanced solvers that "build momentum" towards the optimum.
My suggestion is either to port CallbackSaveToDisk, or to wait until it is merged into master and use it then. This would also make the code much easier to read (simply append a CallbackSaveToDisk to the callback if you want to save results).
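Until such a callback is available, a minimal sketch of the idea, a hypothetical class modeled on odl's callback convention of being called once per iterate, could look like:

```python
class SaveEveryNth:
    """Hypothetical callback: hand every n-th iterate to a save function."""
    def __init__(self, n, save):
        self.n = n
        self.save = save    # e.g. a function writing the iterate to disk
        self.iter = 0

    def __call__(self, x):
        self.iter += 1
        if self.iter % self.n == 0:
            self.save(self.iter, x)

# Usage sketch: the solver runs once, uninterrupted, so iterate
# numbering stays consistent with an unrestarted run.
saved = []
cb = SaveEveryNth(5, lambda i, x: saved.append(i))
for k in range(20):      # stands in for the solver's iteration loop
    cb(k)
assert saved == [5, 10, 15, 20]
```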
The package is becoming crowded, so we should create a folder structure. Perhaps start by adding two folders, "reconstruction" and "reconstruction2d"? Naming suggestions are welcome, of course.
Add a saved FBP-reconstruction such that it can be loaded directly (without having to re-run FBP all the time)