georgkreimer / Gesture-Recognition-101-CoreML-ARKit

Simple project to recognize hands in real time. 👋 Serves as an example for building your own object recognizer.

Home Page: https://medium.com/p/7f8c09b461a1


Hand Gesture Recognition

This simple sample project recognizes hands in real time. 👋 It serves as a basic example for recognizing your own objects. Suitable for AR 🤓. Written for the tutorial "Create Your Own Object Recognizer".

GIF: a fist and a spread hand appearing and disappearing from the screen, being recognized in real time on an iPhone

Demo Video - on YouTube

Tech: iOS 11, ARKit, Core ML, iPhone 7 Plus, Xcode 9.1, Swift 4.0

Notes:

This demonstrates basic Object Recognition (for spread hand 🖐, fist 👊, and no hands ❎). It serves as a building block for object detection, localization, gesture recognition, and hand tracking.

Disclaimer:

The sample model provided here was captured in 1 hour and is biased to one human hand 👋🏼. It's intended as a placeholder for your own models. (See Tutorial)


Steps Taken (Overview)

Here's an overview of the steps taken. (You can also view my commit history to see the steps involved.)

  1. Build an intuition by playing with Google's Teachable Machine.
  2. Build a dataset.
  3. Create a Core ML model using Microsoft's CustomVision.ai.
  4. Run the model in real time with ARKit.
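The last step, running the model on live ARKit frames, can be sketched roughly as below using the Vision framework. The model class name `HandModel`, the queue label, and the print-based result handling are assumptions for illustration, not taken from this repo:

```swift
import ARKit
import Vision

// Sketch: classify every ARKit camera frame with a Core ML model.
// `HandModel` stands in for the .mlmodel class exported from CustomVision.ai.
final class HandGestureClassifier: NSObject, ARSessionDelegate {
    // Run Vision work off the main thread so rendering stays smooth.
    private let visionQueue = DispatchQueue(label: "com.example.hand-gesture.vision")

    private lazy var request: VNCoreMLRequest = {
        // Wrap the Core ML model for use with Vision.
        let model = try! VNCoreMLModel(for: HandModel().model)
        let request = VNCoreMLRequest(model: model) { request, _ in
            // Take the top classification, e.g. "spread", "fist", or "none".
            guard let best = (request.results as? [VNClassificationObservation])?.first
            else { return }
            DispatchQueue.main.async {
                print("\(best.identifier) – confidence \(best.confidence)")
            }
        }
        request.imageCropAndScaleOption = .centerCrop
        return request
    }()

    // ARSessionDelegate callback, invoked for each new camera frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let pixelBuffer = frame.capturedImage
        visionQueue.async {
            let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer)
            try? handler.perform([self.request])
        }
    }
}
```

In practice you would set this object as the `ARSession`'s delegate (e.g. `sceneView.session.delegate = classifier`) and throttle classification rather than running it on literally every frame.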

Full Tutorial here

P.S. A few well-selected images are sufficient for CustomVision.ai. For the sample model here, I did 3 rounds of data collection (adding 63, 38, and 21 images per round). Alternating classes during data collection also appeared to work better than gathering all the images for one class at once.

image of dataset

License

MIT Open Source License. 🧞 Use as you wish. Have fun! 😝

