There are 17 repositories under the depth-images topic.
A procedural Blender pipeline for photorealistic training image generation
Autonomous UAV navigation without collision using visual information in AirSim
Synthetic Blender Dataset Production
3D Object Detection (iHack IIT Bombay) - A deep learning-based real-time solution using YOLO and Fast R-CNN
DeepDoors2 is a dataset for 2D/3D door classification, 2D door detection and 2D door segmentation.
Inference demo and evaluation scripts for the MICCAI-2019 paper "Human Pose Estimation on Privacy-Preserving Low-Resolution Depth Images"
A remote color + depth camera for iOS without any third-party dependencies.
Repository for the implementation of "FastV2C-HandNet: Fast Voxel to Coordinate Hand Pose Estimation with 3D Convolutional Neural Networks"
A spatial- and frequency-based method for micro facial expression recognition using color and depth images
Analysis of a robust edge detection system in different color spaces using color and depth images
Conversion of depth data to derived images that allow classical feature detection. My master's thesis.
A dataset for upper-body orientation estimation of humans, with continuous ground-truth labels for the angle perpendicular to the ground
Major project on hand-sign classification using an RGB-D camera (PrimeSense). Classification is performed separately on depth images, RGB images, and depth information.
[TCYB2018] Context-Aware Deep Spatio-Temporal Network for Hand Pose Estimation from Depth Images
This library provides an easy, accessible way to read and write .pfm files in C++, relying only on the standard library.
Dataset for patch-based person classification (person vs. non-person objects) and posture classification (standing vs. sitting vs. squatting). The data was recorded with a Kinect v2 sensor and consists of labeled depth-image patches of 27 persons in various postures and of various non-person objects. In total, the dataset comprises more than 235,000 samples divided into non-overlapping training, validation, and test subsets.