Repositories under the signlanguagerecognition topic:
Signapse is an open source software tool for helping everyday people learn sign language for free!
Signfy is a Video Chat app that incorporates sign language translation to bridge the communication gap between the deaf and hearing communities.
Bachelor's thesis at the Wroclaw University of Science and Technology.
This web-based app detects and interprets sign language into English words in real time, helping speech-impaired individuals communicate with others more easily.
An SSD detector with a MobileNet backbone, trained via transfer learning, for detecting and recognizing sign gestures. The aim is a web app that recognizes ASL from user input in real time and generates English text.
An application that translates sign language into audio and text.
A model that classifies Vietnamese sign language using the motion history image (MHI) algorithm and a CNN.
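The MHI update rule behind such a model is simple enough to sketch in a few lines of plain Python (a toy illustration, not the repository's code; `tau` is an assumed decay horizon in frames): pixels where motion is detected in the current frame are set to `tau`, and all other pixels decay toward zero, so recent motion appears brighter than older motion.

```python
def update_mhi(mhi, motion_mask, tau=30):
    """One motion-history-image update step: pixels where motion was
    detected are set to tau; all other pixels decay by 1, floored at 0."""
    return [[tau if moved else max(h - 1, 0)
             for h, moved in zip(h_row, m_row)]
            for h_row, m_row in zip(mhi, motion_mask)]

# Toy example: a 1x4 image with motion sweeping left to right over 3 frames.
mhi = [[0, 0, 0, 0]]
masks = [[[1, 0, 0, 0]], [[0, 1, 0, 0]], [[0, 0, 1, 0]]]
for mask in masks:
    mhi = update_mhi(mhi, mask, tau=30)

print(mhi[0])  # recent motion is brighter: [28, 29, 30, 0]
```

The resulting single grayscale image summarizes where and how recently motion occurred, which is what the CNN then classifies.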
Detects ten American Sign Language gestures: Okay, Peace, Thumbs Up, Thumbs Down, Call Me, Stop, I Love You, Hello, No, and Smile.
American Sign Language Recognition
This program uses gesture detection to identify common ASL gestures as well as the letters of the alphabet, translating them into sentences.
This repo contains the code for sign-language-recognition as part of our final year project.
Major Project in Final Year B.Tech (IT). Live Stream Sign Language Detection using Deep Learning.
Sign Language Translator using Graph Convolutional Networks (GCN)
Packages needed for this project: OpenCV, TensorFlow, PyEnchant, MediaPipe, Keras, NumPy, gTTS, and Tkinter.
Teaching computers to understand sign language! This project uses image processing to recognize hand signs, making technology more inclusive and accessible.
Sign_languagues_recognition
Shady Elkholy's graduation project, Arab Open University, Computer science
GestureGo facilitates bidirectional communication between people with hearing or speech impairments and others, narrowing the communication gap and allowing everyone to understand and be understood.
Pakistan Sign Language Recognition for Word and Sentence Level Signings
Task 1: Sign Language Classification using machine learning at SYNC INTERN'S.
A model that recognizes ten different everyday KSL signs present in images, using machine learning or deep learning algorithms. The dataset was collected specifically for a Zindi competition by Task Mate. Almost all of the hands belong to people of color, in an effort to address bias in sign language datasets.
We trained a CNN (convolutional neural network) to predict sign language; pretrained models are also used in this model.
ASL Letter Classification: Using a CNN to classify American Sign Language (ASL) letter images with high accuracy.
Real-Time Communication System Powered by AI for Specially Abled
This project is aimed at detecting American Sign Language (ASL) alphabets in real-time using computer vision. The system utilizes OpenCV for image processing, MediaPipe for hand detection, and a Random Forest classifier from scikit-learn for alphabet recognition.
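As a rough illustration of the landmark-to-feature step such a pipeline typically performs before the Random Forest classifier (a hypothetical sketch, not the repository's code; the 21-point hand layout, wrist at index 0 and index fingertip at index 8, follows MediaPipe's hand model):

```python
def landmarks_to_features(landmarks):
    """Convert 21 (x, y) hand landmarks into a translation- and
    scale-invariant feature vector: subtract the wrist (landmark 0),
    then divide by the largest coordinate magnitude."""
    wx, wy = landmarks[0]
    rel = [(x - wx, y - wy) for x, y in landmarks]
    scale = max(max(abs(x), abs(y)) for x, y in rel) or 1.0
    return [v / scale for xy in rel for v in xy]

# Toy hand: wrist at (0.5, 0.5), index fingertip raised, rest at the wrist.
hand = [(0.5, 0.5)] * 21
hand[8] = (0.75, 0.25)  # index fingertip in MediaPipe's indexing
feats = landmarks_to_features(hand)
print(feats[16], feats[17])  # fingertip relative to wrist -> 1.0 -1.0
```

Normalizing this way makes the 42-value feature vector independent of where the hand sits in the frame and how large it appears, which is what lets a simple classifier like a Random Forest generalize across users and camera distances.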
Sign language detection using the TensorFlow API and OpenCV.
Sign language gesture recognition is done in two ways. In the first, individual letters are detected from sign language and assembled into words, which are then converted to speech. The second recognizes gestures that represent whole words.
Code for the demo of the VGT-NL dictionary at Dag Van De Wetenschap 2023 and other events.
Using YOLOv8 to train on a custom dataset for sign language recognition
This project develops a model for recognizing sign language hand gestures using deep learning: it uses the Sign MNIST dataset to train a convolutional neural network (CNN) that classifies the gestures.
"Sign Language Recognition" is a project that employs Long Short-Term Memory (LSTM) neural networks to accurately recognize and interpret sign language gestures. The project encompasses the creation of a robust dataset containing over 50 words in sign language, providing a diverse range of gestures for training and testing the model.
Sanket is a real-time sign language recognition application. It's designed to recognize a variety of sign language gestures, making communication easier for those who use sign language.