Code to construct/analyze a texture statistics encoding model for fMRI data.
Related to our paper:
Henderson, M.M., Tarr, M.J., & Wehbe, L. (2023). A texture statistics encoding model reveals hierarchical feature selectivity across human visual cortex. Journal of Neuroscience. https://doi.org/10.1523/JNEUROSCI.1822-22.2023
- Clone the repository:
  - `git clone [email protected]:mmhenderson/texturemodel`
- Edit "root" in path_defs.py to reflect the path of the folder into which you cloned this repository.
- If using our pre-computed model fits, download the data from OSF:
  - https://osf.io/8fsnx/
  - After unzipping, you should have a folder "model_fits", which can be placed inside "texturemodel".
  - You should also have a folder "rois", which should be placed at: /root/nsd/rois
- If fitting from scratch, access the fMRI dataset (NSD) and its images here:
  - http://naturalscenesdataset.org/
  - Update path_defs.py to reflect the path where NSD is downloaded.
  - Use "/code/run/prep_images.sh" to prepare the NSD images for the feature extraction pipeline.
- Use the Jupyter notebooks inside "notebooks" to plot the results of model fitting and reproduce all of our figures.
  - Some of the plots require Pycortex, which you can install from here:
    - https://pypi.org/project/pycortex/
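As a rough illustration of the path setup step, the edited entries in path_defs.py might look something like the sketch below. Only "root" is named in the instructions above; every other variable name here is a hypothetical placeholder, so check the actual file for the names it defines.

```python
# Hypothetical sketch of path_defs.py after editing; only "root" is named
# in the README, the other variable names are illustrative assumptions.
import os

# Folder into which the repository was cloned (set to your own path).
root = "/home/username/texturemodel"

# If fitting from scratch: location of the downloaded NSD dataset
# (illustrative name; check path_defs.py for the variable it actually uses).
nsd_path = "/home/username/nsd"

# Derived locations matching the folder layout described above.
model_fits_path = os.path.join(root, "model_fits")
rois_path = os.path.join(nsd_path, "rois")
```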
The first step of the fitting procedure is to extract texture statistics features using a steerable pyramid representation. Our code is adapted from the Matlab code available at: https://github.com/freeman-lab/metamers.
Running the feature extraction code requires PyTorch as well as PyrTools (https://pyrtools.readthedocs.io/en/latest/). Using a GPU is recommended for speed.
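To give a sense of what "texture statistics" means here, the toy sketch below computes the general kinds of features involved: per-subband marginal statistics plus cross-correlations between subband magnitudes. This is not the repository's pipeline; for brevity it uses `np.gradient` as a stand-in for real steerable-pyramid subbands, purely for illustration.

```python
# Toy sketch (NOT the paper's actual pipeline) of texture-like statistics
# computed from a multi-scale, multi-orientation decomposition.
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((64, 64))

# Stand-in "subbands": horizontal/vertical derivatives at two scales.
# A real steerable pyramid would provide oriented bandpass subbands.
subbands = []
for scale in range(2):
    im = image[::2**scale, ::2**scale]
    dy, dx = np.gradient(im)
    subbands.extend([dx, dy])

def marginals(b):
    """Mean, variance, skew, and kurtosis of one subband."""
    b = b.ravel()
    m, v = b.mean(), b.var()
    z = (b - m) / np.sqrt(v)
    return [m, v, (z**3).mean(), (z**4).mean()]

marginal_feats = np.array([marginals(b) for b in subbands]).ravel()

def mag_corr(a, b):
    """Correlation between the magnitudes of two subbands."""
    a, b = np.abs(a).ravel(), np.abs(b).ravel()
    return np.corrcoef(a, b)[0, 1]

# Cross-statistics between orientations at the same scale.
cross_feats = np.array([mag_corr(subbands[0], subbands[1]),
                        mag_corr(subbands[2], subbands[3])])

# One feature vector per image: 4 subbands x 4 marginals + 2 cross terms.
features = np.concatenate([marginal_feats, cross_feats])
```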
See "code/run/extract_texture_feats.sh" for an example of how to run the feature extraction code (adjust the paths in this script for your local filesystem).
See the scripts in "code/run/fit..." for examples of how to run fitting code (adjust the paths in these scripts for your local filesystem).
To fit the models from scratch, you'll first need to download the NSD dataset and run the feature extraction code.
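For orientation, the fitting stage maps extracted features to voxel responses. The sketch below is an illustrative ridge-regression encoding model on synthetic data, not the repository's fitting code (which additionally handles cross-validation and per-voxel regularization selection).

```python
# Minimal illustrative sketch of a ridge-regression encoding model on
# synthetic data; the real fit scripts in code/run/fit... do much more.
import numpy as np

rng = np.random.default_rng(1)
n_images, n_features, n_voxels = 200, 50, 10

X = rng.standard_normal((n_images, n_features))        # texture features
true_w = rng.standard_normal((n_features, n_voxels))   # ground-truth weights
Y = X @ true_w + 0.1 * rng.standard_normal((n_images, n_voxels))  # "fMRI" data

# Closed-form ridge solution: w = (X'X + lambda*I)^{-1} X'Y
lam = 1.0
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

# Per-voxel prediction accuracy (on training data here; real use would
# evaluate on held-out validation images).
pred = X @ w_hat
r = np.array([np.corrcoef(pred[:, v], Y[:, v])[0, 1] for v in range(n_voxels)])
```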
Any questions/concerns can be directed to [email protected]