GITSHOHOKU's repositories
3D-Machine-Learning
A resource repository for 3D machine learning
awesome-semantic-segmentation
:metal: awesome-semantic-segmentation
blood_vessel_feature_extraction
Python script to skeletonize retinal vasculature and extract key features
boundary-loss
Official code for "Boundary loss for highly unbalanced segmentation", runner-up for best paper award at MIDL 2019. Extended version in MedIA, volume 67, January 2021.
gmmreg
Implementations of the robust point set registration algorithm described in "Robust Point Set Registration Using Gaussian Mixture Models", Bing Jian and Baba C. Vemuri, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2011, 33(8), pp. 1633-1645. For a Python implementation, please refer to http://github.com/bing-jian/gmmreg-python.
hed
PyTorch code for Holistically-Nested Edge Detection
Medical-Transformer
PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
nerfplusplus
Improves over NeRF in 360° capture of unbounded scenes
pacnet
Pixel-Adaptive Convolutional Neural Networks (CVPR '19)
pydensecrf
Python wrapper for Philipp Krähenbühl's dense (fully connected) CRFs with Gaussian edge potentials.
pytorch-hed
A reimplementation of Holistically-Nested Edge Detection in PyTorch
Self-supervised-Fewshot-Medical-Image-Segmentation
[ECCV'20] Self-supervision with Superpixels: Training Few-shot Medical Image Segmentation without Annotation (code&data-processing pipeline)
SparsePlanes
Planar Surface Reconstruction from Sparse Views
Stereo-Visual-SLAM-Odometry
Real-time Stereo Visual SLAM Pipeline with Bundle Adjustment
Swin-Transformer
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
trimesh
Python library for loading and using triangular meshes.
tsdf-fusion-python
Python code to fuse multiple RGB-D images into a TSDF voxel volume.
visual_mapping
Semantic mapping based on visual perception
voxblox-plusplus
A volumetric object-level semantic mapping framework.