There are 28 repositories under the kinect-sensor topic.
SimpleOpenNI library for Processing 3.5.2, 3.4, 3.3.7, and 3.3.6 on macOS, for Kinect V1 and V2
Markerless volumetric alignment for depth sensors. Contains the code for the paper "Deep Soft Procrustes for Markerless Volumetric Sensor Alignment" (IEEE VR 2020).
An example of integrating the Xbox One Kinect sensor ("Kinect v2") with OpenCV for Unity.
A simple framework for gesture recognition in Java
A Python wrapper for the Kinect, built on pylibfreenect2.
Helps deaf and mute people communicate with others through hand-gesture-to-speech conversion. The code uses depth maps from the Kinect camera and techniques such as convex hull and contour mapping to recognise 5 hand signs.
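For context, a minimal Python/OpenCV sketch of the convex-hull-and-defects idea such a project relies on; the function name, depth window, and defect threshold are illustrative assumptions, not this repository's actual code:

    import cv2
    import numpy as np

    def count_fingers(depth_map, near_mm=500, far_mm=800):
        # depth_map: single-channel uint16 Kinect depth frame in millimetres
        # (assumed). Segment the hand as the blob inside an assumed depth window.
        mask = cv2.inRange(depth_map, near_mm, far_mm)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return 0
        hand = max(contours, key=cv2.contourArea)        # largest blob = hand
        hull = cv2.convexHull(hand, returnPoints=False)
        defects = cv2.convexityDefects(hand, hull)
        if defects is None:
            return 0
        # Deep convexity defects correspond to the valleys between fingers;
        # the 10000 cutoff (fixed-point, ~39 px) is a tuning assumption.
        valleys = sum(1 for i in range(defects.shape[0])
                      if defects[i, 0, 3] > 10000)
        return valleys + 1 if valleys else 0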
Autonomous navigation using SLAM on a TurtleBot 2 for the EECE-5698 Mobile Robotics class.
3D Scene Reconstruction in MATLAB with the Microsoft Kinect depth sensor.
Hello World for the Kinect SDK - see the Instructables tutorial http://www.instructables.com/id/Kinect-SDK-Hello-World/
Retroreflective marker tracking using Azure Kinect cameras
Sphere recognition using color and depth data on a Raspberry Pi with a Kinect sensor
A Gesture Recognition App using Microsoft Kinect V2
6DOF Robotic ARM - Nvidia Jetson Nano - Kinect
ROS node to merge LaserScan data from lidar and Kinect sensors
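Once both scans share a frame and angular grid, the merge itself reduces to keeping the nearest valid return per beam; a hedged Python sketch (fields follow sensor_msgs/LaserScan, while the tf alignment and resampling a real node needs are assumed already done):

    import copy

    def merge_scans(scan_a, scan_b):
        # scan_a, scan_b: sensor_msgs/LaserScan messages assumed to have
        # identical angle_min/angle_max/angle_increment in a common frame.
        merged = copy.deepcopy(scan_a)
        merged.ranges = [
            min((r for r in pair if r >= scan_a.range_min),
                default=float('inf'))                 # no valid return: inf
            for pair in zip(scan_a.ranges, scan_b.ranges)
        ]
        return merged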
A Microsoft Kinect-based adaptation of the VR game Beat Saber.
Allows the uArm Metal to mimic the movement of your arm using the Kinect.
Unity3D - Xbox Kinect-based game for physiotherapy exercises.
A .NET library that captures RgbColor and DepthColor images from the Azure Kinect sensor and returns them as Bitmaps.
A Leader-Follower Mobile Robot Scheme using an RGB-D Camera and MobileNets
Provides access to Microsoft Kinect sensor data (skeleton, color camera, and depth camera) over HTTP using WebSockets.
Gesture Recognition is a project to control Microsoft PowerPoint while presenting on a large screen, using the Microsoft Kinect sensor, C#, and the Kinect SDK. The presenter swipes their right or left hand to move to the next or previous slide, makes a fist to zoom in or out, and shows all fingers or none to play or pause the presentation.
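A rough Python sketch of the dispatch step, mapping recognised gesture names to presentation shortcuts; the gesture labels, shortcut choices, and use of pyautogui are assumptions for illustration (the project itself is C# with the Kinect SDK):

    import pyautogui  # sends synthetic key presses to the focused window

    # Hypothetical gesture-name-to-action table.
    GESTURE_ACTIONS = {
        "swipe_left":  lambda: pyautogui.press("right"),  # next slide
        "swipe_right": lambda: pyautogui.press("left"),   # previous slide
        "fist":        lambda: pyautogui.press("+"),      # zoom in (assumed key)
        "open_palm":   lambda: pyautogui.press("s"),      # pause/resume (assumed key)
    }

    def on_gesture(name):
        # `name` would come from a Kinect skeleton/hand tracker (not shown).
        action = GESTURE_ACTIONS.get(name)
        if action:
            action()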
Images from an RGB-D camera are used to detect and classify objects in 2D; the detections are then projected onto the 3D point cloud.
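The projection step is standard pinhole back-projection; a small Python sketch under assumed intrinsics (fx, fy, cx, cy) and a millimetre depth image:

    import numpy as np

    def backproject(u, v, depth_m, fx, fy, cx, cy):
        # Pinhole back-projection into the camera frame; fx, fy, cx, cy come
        # from the RGB-D camera's intrinsic calibration (assumed known).
        x = (u - cx) * depth_m / fx
        y = (v - cy) * depth_m / fy
        return np.array([x, y, depth_m])

    def detection_to_3d(bbox, depth_image, fx, fy, cx, cy, depth_scale=0.001):
        # bbox = (x_min, y_min, x_max, y_max) from the 2D detector; Kinect
        # depth is typically uint16 millimetres, hence the assumed 0.001 scale.
        u = (bbox[0] + bbox[2]) // 2
        v = (bbox[1] + bbox[3]) // 2
        z = float(depth_image[v, u]) * depth_scale
        return backproject(u, v, z, fx, fy, cx, cy)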
Project 4 - Udacity Robotics Software Engineer Nanodegree Program
Traditional Flappy Bird translated into a body-responsive form: the user stands in front of the Kinect module and acts as the bird, so whenever the user jumps, the bird flaps on the game screen. The jump gesture is detected using the Kinect scripts provided for Unity3D.
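A hypothetical Python sketch of the jump-detection idea (the actual project does this with Kinect scripts inside Unity3D; the joint choice and threshold here are assumptions):

    class JumpDetector:
        def __init__(self, threshold_m=0.12):
            self.baseline = None
            self.threshold = threshold_m  # assumed rise that counts as a jump

        def update(self, head_y):
            # head_y: vertical position (metres) of the head joint each frame.
            if self.baseline is None:
                self.baseline = head_y
            # Slowly track standing height so slouching doesn't drift it.
            self.baseline = 0.99 * self.baseline + 0.01 * head_y
            return head_y - self.baseline > self.threshold  # True -> flap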
A guide on how to get skeleton tracking to work with the Xbox 360 Kinect on a modern Linux distribution.
Drawing Robot using Arduino, a Serial Interface and OpenCV/OpenNI with Kinect
Background removal tool (chromakey-like) using an RGB-D camera
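Depth makes the matte much simpler than chroma keying; a minimal Python/OpenCV sketch assuming a color frame aligned to a millimetre depth frame, with an assumed 1.5 m cutoff:

    import cv2
    import numpy as np

    def remove_background(color, depth, max_mm=1500):
        # color: HxWx3 BGR frame aligned to depth; depth: HxW uint16
        # millimetres. Keep pixels with a valid return nearer than max_mm.
        mask = ((depth > 0) & (depth < max_mm)).astype(np.uint8) * 255
        # Clean the matte edges a little before compositing.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        return cv2.bitwise_and(color, color, mask=mask)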
My BEng project for University