Paper: Large-scale pretraining on pathological images for fine-tuning of small pathological benchmarks
We engineered three patch-based datasets: one large and two small. They are designed for large-scale training and downstream benchmarking, respectively. All three share the same resolution of 0.39 microns per pixel (MPP). MD5 checksums are available at the download links. Please refer to the original paper for details.
Download Link: https://drive.google.com/drive/folders/18CmL-WLyppK1Rk29CgV7ib5MACFzg5ei?usp=drive_link
License: NIH Genomic Data Sharing (GDS) Policy
Use the snippet below to reassemble the original archive from the split part files.
$ cat PTCGA200_p_* > PTCGA200.tar.gz
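As a self-contained sanity check of the reassembly pattern, the sketch below builds a small dummy archive, splits it into parts the way the release is distributed, and concatenates them back. The file names and part size here are stand-ins, not the real release artifacts.

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Build a tiny archive, then split it into fixed-size numbered parts.
mkdir data && echo "patch" > data/patch.txt
tar -czf PTCGA200.tar.gz data
split -b 64 -d PTCGA200.tar.gz PTCGA200_p_
rm PTCGA200.tar.gz

# Reassemble: lexicographic glob order restores the byte order.
cat PTCGA200_p_* > PTCGA200.tar.gz

# Verify the archive lists cleanly (tar -t fails on a corrupt archive).
tar -tzf PTCGA200.tar.gz
echo "reassembled OK"
```

The same `cat` line works for the real parts because `split -d` produces numeric suffixes that sort in the original order.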
To reproduce the training, validation, and test splits used in the original paper, download the 3fold_dict_idx_filenames.pickle
file and load it with dataset_utils.py.
Download Link: https://drive.google.com/drive/folders/1Oh7onawKsDW5ScamVO5ByXFgqdYJ39sK?usp=drive_link
Download Link: https://drive.google.com/drive/folders/1zg_C37B_1HR6miRFuTwPKmueaJzvO-GD?usp=drive_link
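Loading the split file can be sketched as below. The dictionary layout shown is an assumption for illustration; dataset_utils.py in the release is the authoritative loader, and the demo writes its own dummy pickle rather than the real file.

```python
import pickle

def load_fold_split(path, fold):
    """Return (train, val, test) filename lists for one cross-validation fold.

    Assumed layout: {fold_index: {"train": [...], "val": [...], "test": [...]}};
    the actual structure is defined by dataset_utils.py in the release.
    """
    with open(path, "rb") as f:
        folds = pickle.load(f)
    split = folds[fold]
    return split["train"], split["val"], split["test"]

# Self-contained demo: write a dummy pickle that mimics the assumed layout.
dummy = {k: {"train": [f"img_{k}_tr.png"],
             "val": [f"img_{k}_va.png"],
             "test": [f"img_{k}_te.png"]} for k in range(3)}
with open("3fold_demo.pickle", "wb") as f:
    pickle.dump(dummy, f)

train, val, test = load_fold_split("3fold_demo.pickle", fold=0)
print(train, val, test)  # -> ['img_0_tr.png'] ['img_0_va.png'] ['img_0_te.png']
```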
- pytorch >=1.8.1
- torchvision
- kornia
- Pillow 8.2.0
- numpy
- tqdm 3.60.0
- h5py
Download the datasets and modify the config.py file. Then create a runs
folder and run the following command. The training script is designed for distributed training. To train on multiple nodes, provide the host name list and the master node address in config.py and run the script on each node.
python train.py
To train BYOL, set 'self_superversed' : 'byol' in the config.
Self-supervised learning, except for BYOL, was performed using the repository below.
The repository link is under a permission process.
Provisional citation:
@CoRR{PatchTCGA,
title={Large-scale pretraining on pathological images for fine-tuning of small pathological benchmarks},
author={Masataka Kawai and Noriaki Ota and Shinsuke Yamaoka},
booktitle={},
year={2023}
}
We thank the authors of the original datasets for their efforts.
We also thank the authors of the following repositories for their contributions and references.
This work is based on results obtained from a project, JPNP20006, commissioned by the New Energy and Industrial Technology Development Organization (NEDO).