ionvision / depth-camera-web-demo


Depth camera capture in HTML5

(demo animation: hands_interaction.gif)

The moving boxes using hands (or a sheet of paper) demo shows a live depth-captured mesh interacting with scene objects: it combines rendering of a 3D world with depth-captured hands (or other objects) and Bullet physics. Run live demo.

(demo animation: backgroundremoval.gif)

Simple background removal, implemented as a flood fill that spreads from the background color to similarly colored pixels. It works only with simple backgrounds, e.g. the room walls in the demo GIF. Check the tutorial article and run live demo.
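The idea can be sketched as a plain flood fill over RGBA pixel data (e.g. from a canvas `ImageData` buffer). This is an illustrative sketch, not the demo's actual code; the function name, seed pixel, and tolerance are assumptions.

```javascript
// Sketch of background removal by flood fill (illustrative, not the
// demo's implementation). Starting from a seed pixel assumed to be
// background, spread to 4-connected neighbors whose color is within
// `tolerance` of the seed color, and make those pixels transparent.
function removeBackground(data, width, height, seedX, seedY, tolerance) {
  const idx = (x, y) => (y * width + x) * 4;
  const seed = idx(seedX, seedY);
  const [sr, sg, sb] = [data[seed], data[seed + 1], data[seed + 2]];
  const similar = (i) =>
    Math.abs(data[i] - sr) <= tolerance &&
    Math.abs(data[i + 1] - sg) <= tolerance &&
    Math.abs(data[i + 2] - sb) <= tolerance;

  const visited = new Uint8Array(width * height);
  const stack = [[seedX, seedY]];
  while (stack.length) {
    const [x, y] = stack.pop();
    if (x < 0 || y < 0 || x >= width || y >= height) continue;
    const p = y * width + x;
    if (visited[p]) continue;
    visited[p] = 1;
    const i = p * 4;
    if (!similar(i)) continue;
    data[i + 3] = 0; // transparent: treated as background
    stack.push([x + 1, y], [x - 1, y], [x, y + 1], [x, y - 1]);
  }
  return data;
}
```

This is why the approach only works with simple backgrounds: any foreground pixel whose color is within the tolerance of the wall color, and connected to it, gets flooded away too.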

(demo animation: typing_in_the_air.gif)

The typing in the air tutorial shows how to use the depth stream and WebGL transform feedback to do simple gesture recognition. Check the tutorial article and run live demo.
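Once transform feedback has written per-pixel results back into a buffer the CPU can read, the gesture decision itself can be a simple threshold. The sketch below covers only that last step and is a guess at the shape of it: the function name and the "enough pixels close to a virtual plane means a key press" rule are illustrative, not the tutorial's actual code.

```javascript
// Illustrative last step of a transform-feedback gesture pipeline
// (an assumption, not the tutorial's code): per-pixel depth values
// have been read back from the GPU; a "press" is detected when enough
// pixels fall in a thin band in front of a virtual keyboard plane.
function detectPress(depths, planeDepth, band, minPixels) {
  let count = 0;
  for (const d of depths) {
    // A depth of 0 typically means "no data" for that pixel.
    if (d > 0 && d >= planeDepth - band && d <= planeDepth) count++;
  }
  return count >= minPixels;
}
```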

(demo animation: https://github.com/01org/depthcamera-pointcloud-web-demo/raw/master/recording.gif)

The 3D point cloud rendering demo shows how to render depth and color video, synchronized, on the GPU. Check the tutorial article and run live demo.
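The core of point-cloud rendering is deprojecting each depth pixel into a 3D camera-space point using the camera intrinsics. Below is a minimal CPU-side sketch of the standard pinhole model; the demo does the equivalent math in a shader, and the function and parameter names here are illustrative.

```javascript
// Deproject depth pixel (u, v) with depth d (in meters) to a 3D
// camera-space point via the pinhole model: focal lengths fx, fy and
// principal point cx, cy. A sketch for illustration; the demo performs
// this on the GPU per vertex.
function deprojectPixel(u, v, d, { fx, fy, cx, cy }) {
  return {
    x: ((u - cx) * d) / fx,
    y: ((v - cy) * d) / fy,
    z: d,
  };
}
```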

(demo animation: how_the_demo_looks.gif)

The HTML5 Depth Capture tutorial shows how to access the depth stream. Check the tutorial article and run live demo.
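Accessing the stream boils down to enumerating media devices and requesting the depth one via `getUserMedia`. A hedged sketch, assuming the depth sensor shows up as a separate video input whose label mentions "depth" (labels vary by camera and platform; the helper names are illustrative, not the tutorial's code):

```javascript
// Sketch of opening a depth stream (assumption: the depth sensor is a
// separate video input whose label contains "depth").
async function openDepthStream() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const depthDevice = pickDepthDevice(devices);
  if (!depthDevice) throw new Error("No depth camera found");
  return navigator.mediaDevices.getUserMedia({
    video: { deviceId: { exact: depthDevice.deviceId } },
  });
}

// Pure helper: find the first video input whose label suggests a
// depth sensor.
function pickDepthDevice(devices) {
  return devices.find(
    (d) => d.kind === "videoinput" && /depth/i.test(d.label)
  );
}
```

Note that device labels are only populated after the page has been granted camera permission, so a real page typically requests a plain video stream first.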

To capture and manipulate depth camera stream in HTML5, you'll need:

  • Chrome browser version 62 or later (an official release; no additional extensions needed),
  • an Intel® RealSense™ 3D camera plugged into a USB 3.0 port:
    • SR300 (and related cameras like Razer Stargazer or Creative BlasterX Senz3D) or R200,
  • a Windows, Linux, or ChromeOS PC.

These are the constraints of the current implementation. The plan is to support other depth cameras, as well as macOS and Android, too.

Articles related to the demos:

About

License: Other

Languages

JavaScript 95.5%, HTML 4.5%