ISINet

This is the PyTorch implementation of ISINet: An Instance-Based Approach for Surgical Instrument Segmentation, published at MICCAI 2020.

Installation

Requirements:

  • Python >= 3.6
  • PyTorch == 1.4
  • numpy
  • scikit-image
  • tqdm
  • scipy == 1.1
  • flownet2
  • Detectron v.1 (for using our pre-trained weights)
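The pinned requirements above can be assembled into an environment, for example with conda and pip. This is only a sketch: the environment name is arbitrary, and flownet2 and Detectron v1 are not pip-installable under these names and must be built from their upstream repositories.

```shell
# Hypothetical environment setup based on the requirement list above.
conda create -n isinet python=3.6 -y
conda activate isinet
pip install torch==1.4.0 numpy scikit-image tqdm scipy==1.1.0
# flownet2 and Detectron v1 must be installed from their own repos;
# both require building CUDA extensions -- see their READMEs.
```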

Pre-trained Weights

Pre-trained weights are publicly available on the project page.

Additional Annotations EndoVis 2018 Dataset

Additional annotations for the EndoVis 2018 Dataset are publicly available on the project page.

Data Preparation

Check the instructions detailed in data/README.md.

Perform Inference

python -W ignore main.py --inference --model FlowNet2 \
  --batch_size batch_size --number_workers num_workers \
  --inference_dataset RobotsegTrackerDataset \
  --inference_dataset_img_dir /path/to/images \
  --inference_batch_size batch_size \
  --inference_dataset_coco_ann_path /path/to/coco/annotations/file.json \
  --inference_dataset_segm_path /path/to/mask-rcnn/inference/segm.json \
  --inference_dataset_ann_dir /path/to/annotations \
  --inference_dataset_cand_dir /path/to/save/candidates \
  --inference_dataset_nms 'True' \
  --save /path/to/save/predictions \
  --inference_dataset_dataset '2017' \
  --inference_dataset_maskrcnn_inference 'False' \
  --assignment_strategy 'weighted_mode' \
  --inference_dataset_prev_frames 7 \
  --threshold 0.0 \
  --resume /path/to/flownet/checkpoint --num-classes number_of_classes

Set --inference_dataset_dataset to '2017' or '2018' depending on the dataset, and --threshold to 0.0 for the 2017 dataset or 0.5 for the 2018 dataset.

Reference

If you find our work useful in your research, please cite it using the following BibTeX entry:

@article{ISINet2020,
  title={ISINet: An Instance-Based Approach for Surgical Instrument Segmentation},
  author={Cristina Gonz{\'a}lez and Laura Bravo-S{\'a}nchez and Pablo Arbelaez},
  journal={arXiv preprint arXiv:2007.05533},
  year={2020}
}

Acknowledgements

Our code is built upon FlowNet2; we thank the authors for their contributions to the community.

isinet's People

Contributors: cigonzalez
isinet's Issues

How to run the inference with temporal consistency module

Dear Authors,
Thank you for sharing your work. While trying to run inference.sh, which argument enables the temporal consistency module, so that I can reproduce the results reported for the (T ✓) variant? I could run the code on the 2017 dataset with the following .sh instructions, but the results are similar to those of ISINet without the temporal consistency module and without data augmentation. Could you please help me understand how to run it with the temporal consistency module?
Also, the file inference.sh does not contain the following code but a different version.

python -W ignore main.py --inference --model FlowNet2 \
  --batch_size batch_size --number_workers num_workers \
  --inference_dataset RobotsegTrackerDataset \
  --inference_dataset_img_dir /path/to/images \
  --inference_batch_size batch_size \
  --inference_dataset_coco_ann_path /path/to/coco/annotations/file.json \
  --inference_dataset_segm_path /path/to/mask-rcnn/inference/segm.json \
  --inference_dataset_ann_dir /path/to/annotations \
  --inference_dataset_cand_dir /path/to/save/candidates \
  --inference_dataset_nms 'True' \
  --save /path/to/save/predictions \
  --inference_dataset_dataset '2017' or '2018' \
  --inference_dataset_maskrcnn_inference 'True' \
  --assignment_strategy 'weighted_mode' \
  --inference_dataset_prev_frames 7 \
  --threshold 0.0 for 2017 and 0.5 for 2018 \
  --resume /path/to/flownet/checkpoint --num-classes number_of_classes

Confusions about the Preparation for 2018 dataset

Hi, thanks for your great work! I am preprocessing the 2018 dataset according to https://github.com/BCV-Uniandes/ISINet/blob/main/data/README.md
However, some details are not very clear, and I ran into some issues when I tried to get the COCO-format annotations for the 2018 dataset.

I have already downloaded the original 2018 dataset (part-wise annotations) and your dataset (instrument-wise annotations), and I put both datasets in the same folder. Although I noticed you mentioned "use the 2018 labels compatible with the 2017 dataset (final_labels_2017.json)", I don't know how to prepare it. When I run organize2018.py, it requires an 'instrument_labels_2017' folder.
May I know how to solve this?
Could you kindly show the dataset folder structure on GitHub? It would be very helpful!

https://github.com/BCV-Uniandes/ISINet/blob/main/data/organize2018.py line 164
instr_label_folder = osp.join(args.data_dir, data_f, in_f, 'instrument_labels_2017')

Thanks for your help!
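Based only on the organize2018.py line quoted in this issue, the script seems to expect an instrument_labels_2017 folder inside each sequence directory under the data root. The following layout is an unverified guess: the data_2018, train, and seq_1 names are hypothetical stand-ins for args.data_dir, data_f, and in_f, and only the innermost folder name comes from the code.

```shell
# Hypothetical layout inferred from organize2018.py line 164:
#   osp.join(args.data_dir, data_f, in_f, 'instrument_labels_2017')
# 'data_2018', 'train', and 'seq_1' are assumed placeholder names.
mkdir -p data_2018/train/seq_1/instrument_labels_2017
mkdir -p data_2018/train/seq_1/left_frames
ls -R data_2018
```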

subprocess.CalledProcessError: Command '['git', 'rev-parse', 'HEAD']' returned non-zero exit status 128.

Hi,
thanks for sharing your excellent work. There is a problem when I run main.py:

File "main.py", line 107, in <module>
args.current_hash = subprocess.check_output(["git", "rev-parse", "HEAD"]).rstrip()
subprocess.CalledProcessError: Command '['git', 'rev-parse', 'HEAD']' returned non-zero exit status 128.
/root/ISINet-main/temp_consistency_module/run.sh: line 2: --inference: command not found
/root/ISINet-main/temp_consistency_module/run.sh: line 3: --model: command not found
/root/ISINet-main/temp_consistency_module/run.sh: line 4: --batch_size: command not found
/root/ISINet-main/temp_consistency_module/run.sh: line 5: --number_workers: command not found
/root/ISINet-main/temp_consistency_module/run.sh: line 21: --num-classes: command not found

How can I solve it?
Thank you!
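Exit status 128 from `git rev-parse HEAD` typically means the working directory is not a git checkout (e.g. the code was downloaded as a ZIP rather than cloned). One workaround, a sketch rather than a patch from the authors, is to guard the check_output call in main.py so a missing repository falls back to a placeholder hash; the function name get_git_hash is my own:

```python
import subprocess

def get_git_hash(cwd=None):
    """Return the current commit hash, or 'unknown' outside a git repo."""
    try:
        return subprocess.check_output(
            ["git", "rev-parse", "HEAD"], cwd=cwd,
            stderr=subprocess.DEVNULL,
        ).decode().strip()
    except (subprocess.CalledProcessError, OSError):
        # CalledProcessError: not a git checkout (exit status 128)
        # OSError: git itself is not installed
        return "unknown"
```

In main.py, `args.current_hash = get_git_hash()` would then replace the bare check_output call, so training still runs from an unpacked archive.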

Command 'git rev-parse HEAD' returned non-zero exit status 128.

While committing some Python files with git, I am getting this error.

File "C:\Repo\cpv\extern\VerFw\Src\ComponentLineImporter\__init__.py", line 28, in <module>
from .Version import __version__
File "C:\Repo\cpv\extern\VerFw\Src\ComponentLineImporter\Version.py", line 22, in <module>
__version__ = subprocess.check_output('git rev-parse HEAD', cwd=os.path.dirname(os.path.abspath(__file__))).decode().strip()
File "c:\program files\python36\lib\subprocess.py", line 336, in check_output
**kwargs).stdout
File "c:\program files\python36\lib\subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command 'git rev-parse HEAD' returned non-zero exit status 128.

  1. I checked the path of the ComponentLineImporter module, and it seems correct.
    ComponentLineImporter.__file__
    'C:\Repo\cpv\extern\VerFw\Src\ComponentLineImporter\__init__.py'
  2. I checked the git command "git rev-parse HEAD" in the folder C:\Repo\cpv\extern\VerFw\Src\ComponentLineImporter and got the correct result.
    $ git rev-parse HEAD
    49e494f29290c39ca8af8b4c9357183c59027815

I don't know what's wrong. FYI, I am using just one repo, no duplicate repos.
