Evaluating surgical skills from kinematic data using convolutional neural networks

Home Page: https://germain-forestier.info/src/miccai2018/


This is the companion repository for our paper "Evaluating surgical skills from kinematic data using convolutional neural networks", also available on arXiv. The paper has been accepted at the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) 2018.

Architecture

[Figure: the proposed fully convolutional network (FCN) architecture]

The source code

The software is developed using Python 3.6. It takes the kinematic data as input and predicts the subject's skill level. We trained the model on an NVIDIA GTX 1080 GPU (a GPU is only necessary to speed up the computations). You will need the JIGSAWS dataset to re-run the experiments of the paper.

The source code consists of a single Python file containing everything needed to re-run the experiments, including the hyper-parameters published in the paper. The empty folders in the repository are necessary for the code to run without a "folder not found" error; the content of JIGSAWS.zip (once downloaded) should be placed in the empty folder "JIGSAWS".

If you are looking for more information on how to classify time series with deep learning, also have a look at our paper "Deep learning for time series classification: a review". The code of the models compared in that review is available here.
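To give a concrete picture of this kind of model, below is a minimal, hedged sketch of an FCN for kinematic time series in Keras. The filter counts, kernel sizes, and optimizer are illustrative placeholders, not the published hyper-parameters (those are in the repository's source code); the 76 input channels and three skill levels (Novice, Intermediate, Expert) follow the JIGSAWS conventions.

```python
# A minimal, illustrative FCN sketch in Keras -- NOT the published
# architecture; the hyper-parameters used in the paper live in the
# repository's source code.
from tensorflow import keras

def build_fcn(n_channels=76, n_classes=3):
    # Variable-length input: one trial of kinematic data, one row per frame.
    inp = keras.layers.Input(shape=(None, n_channels))
    x = keras.layers.Conv1D(8, kernel_size=3, padding='same', activation='relu')(inp)
    x = keras.layers.Conv1D(16, kernel_size=3, padding='same', activation='relu')(x)
    # Global Average Pooling removes the dependence on the trial length and
    # is what makes the Class Activation Map visualization possible.
    gap = keras.layers.GlobalAveragePooling1D()(x)
    out = keras.layers.Dense(n_classes, activation='softmax')(gap)
    model = keras.models.Model(inputs=inp, outputs=out)
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model

# 76 kinematic variables per frame (JIGSAWS) and three skill levels
# (Novice, Intermediate, Expert).
model = build_fcn()
```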

Prerequisites

We used specific versions of the required packages, but you should be able to run the code with more recent ones. Older versions of the packages can be found in the archives of their corresponding websites.

Visualizing the movements' contribution to the classification

The proposed method uses the Class Activation Map (CAM) to localize which regions of the surgical task contribute, and by how much, to a given classification.
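Since the CAM is central to the visualization, here is a hedged sketch of how it can be computed for a 1D FCN like the one sketched above: the contribution of each time step to class c is the dot product between the last convolutional feature maps at that time step and the dense-layer weights of c. The layer indexing and normalization below are illustrative assumptions, not the repository's exact code.

```python
# Hedged 1D Class Activation Map sketch, assuming the FCN sketched above;
# layer indexing and normalization are illustrative, not the repo's code.
import numpy as np
from tensorflow import keras

def compute_cam(model, trial, class_idx):
    """trial: array of shape (n_frames, n_channels); returns a (n_frames,) CAM."""
    last_conv = model.layers[-3]  # the conv layer feeding the GAP
    extractor = keras.models.Model(inputs=model.inputs, outputs=last_conv.output)
    feats = extractor.predict(trial[np.newaxis, ...])[0]   # (T, K) feature maps
    w = model.layers[-1].get_weights()[0][:, class_idx]    # (K,) class weights
    cam = feats @ w                                        # (T,) per-frame contribution
    # Rescale to [0, 1] so the values can be used directly as colors.
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-12)
```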

A video illustrating the method can be seen here.

The functions that are used to visualize the trajectories are present in the source code.

Figure 2 in the paper illustrates the trajectory of the movements of subject H (a novice) for the left master manipulator.

The colors are obtained from the Class Activation Map values of a given class and projected onto the (x,y,z) coordinates of the master left hand.
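As an illustration of that projection, the hedged matplotlib sketch below colors an (x,y,z) trajectory by CAM values; `xyz` and `cam` are assumed to be aligned arrays (one row/value per frame) obtained elsewhere, e.g. from the JIGSAWS files and `compute_cam` above.

```python
# Hedged sketch: color the master left hand's 3D trajectory by CAM values.
# `xyz` (shape (T, 3)) and `cam` (shape (T,)) are assumed, aligned inputs.
import matplotlib.pyplot as plt

def plot_colored_trajectory(xyz, cam):
    fig = plt.figure()
    ax = fig.add_subplot(projection='3d')
    # Color each point of the trajectory by its contribution to the class.
    sc = ax.scatter(xyz[:, 0], xyz[:, 1], xyz[:, 2], c=cam, cmap='jet', s=2)
    fig.colorbar(sc, label='Class Activation Map')
    ax.set_xlabel('x'); ax.set_ylabel('y'); ax.set_zlabel('z')
    plt.show()
```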

The visualizations for all the subjects and all of their trials for the Suturing task, for each of the five cross-validation folds, can be found here.

[Figure: a video frame ("Frame") next to its CAM-colored trajectory ("Trajectory explained")]

Reference

If you re-use this work, please cite:

@InProceedings{IsmailFawaz2018evaluating,
  author    = {Ismail Fawaz, Hassan and Forestier, Germain and Weber, Jonathan and Idoumghar, Lhassane and Muller, Pierre-Alain},
  title     = {Evaluating surgical skills from kinematic data using convolutional neural networks},
  booktitle = {International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI)},
  year      = {2018},
  pages     = {214--221}
}

Ipynb version of "Evaluating surgical skills from kinematic data using convolutional neural networks"

This section was added by @Hosseinhashemiir.

This is a modified version of https://github.com/hfawaz/miccai18. Since I do not have suitable hardware (a GPU) to run the code directly, I used Google Colaboratory.

No setup is required: open this repository in Colab at http://colab.research.google.com and find miccai18.ipynb.

Note that processing takes around 2 hours.


License: GNU General Public License v3.0

