The objective of this project is to classify handmade sign language gestures into their corresponding letters in real time and use them as input.
ASL Alphabet
- An image dataset of letters from the American Sign Language, organized into 29 folders, one per class.
- The training set contains 87,000 images of 200x200 pixels. Of the 29 classes, 26 are for the letters A-Z and 3 are for SPACE, DELETE, and NOTHING.
- Dataset: Kaggle or Drive
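Since the 29 classes are defined by the dataset's folder layout, a label mapping can be sketched as below. The folder names for the three non-letter classes (`space`, `del`, `nothing`) are assumptions based on the dataset description, not taken from the notebook:

```python
import string

# 29 class labels: the 26 letters A-Z plus three control classes.
# The exact non-letter folder names are an assumption.
CLASS_NAMES = list(string.ascii_uppercase) + ["space", "del", "nothing"]

def label_to_index(label: str) -> int:
    """Map a class folder name to its integer class index."""
    return CLASS_NAMES.index(label)

def index_to_label(index: int) -> str:
    """Map a predicted class index back to its label."""
    return CLASS_NAMES[index]
```

With this mapping, a model's argmax output can be decoded directly, e.g. `index_to_label(0)` gives `"A"`.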
The final code, Sign Language Detection V7.ipynb,
can be found in the code directory, and the final model, sign_lang_detect_model.h5,
can be found in the models directory.
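For real-time use, each captured frame has to be brought to the dataset's 200x200 resolution and scaled before being fed to the saved model. The helper below is a minimal sketch using a NumPy nearest-neighbour resize; the actual notebook may use OpenCV or Keras utilities instead:

```python
import numpy as np

IMG_SIZE = 200  # the dataset's image resolution

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Resize a frame to 200x200 (nearest-neighbour), scale pixel
    values to [0, 1], and add a batch axis for model.predict()."""
    h, w = frame.shape[:2]
    rows = np.arange(IMG_SIZE) * h // IMG_SIZE
    cols = np.arange(IMG_SIZE) * w // IMG_SIZE
    resized = frame[rows][:, cols]
    return resized.astype("float32")[None, ...] / 255.0

# Usage sketch (assumes TensorFlow/Keras is installed and the
# model file is present):
# from tensorflow.keras.models import load_model
# model = load_model("models/sign_lang_detect_model.h5")
# probs = model.predict(preprocess(frame))[0]
```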
The architecture of the model is given below:
We obtain an image classification accuracy of 98.37%.