mon95 / Sign-Language-and-Static-gesture-recognition-using-sklearn

A machine learning pipeline that performs hand localization and static-gesture recognition, built using the scikit-learn and scikit-image libraries

Home Page: https://medium.com/free-code-camp/weekend-projects-sign-language-and-static-gesture-recognition-using-scikit-learn-60813d600e79



Using the main function to train and test.

mon95 opened this issue · comments

Firstly, apologies for having put out code that is not ready to use as-is (i.e., not a one-click solution). Since I have received a lot of queries on how to run the main method, I've decided to share the steps here.

The code here was never meant to be a one-click solution for static gesture recognition; it was shared so that the methods used to train the models would be available. That said, if you want to run it directly rather than use the methods in an independent program, you can try the following:

  1. Ignore lines 505 to 508. Change lines 509, 512, and 513 appropriately to include the correct list(s) of users (as per the data you have downloaded).

  2. Add a line:

```python
gs = GestureRecognizer('/path/to/dataset/')  # provide the correct path here
```

This constructor is different from the one we've used elsewhere. We used the other one because we had already trained the models. With this constructor, the models will be trained when you call the train method.

Then use the following as it is:

```python
gs.train(user_tr)
gs.save_model(name="your-model-name.pkl.gz", version="0.0.1", author="ss")
print "The GestureRecognizer is saved to disk"  # Python 2 syntax, as in the repo
```

Once training finishes (it may take a long time), your model will be saved to disk. From then on, you can simply load the model and use it to test (i.e., detect gestures) via the recognize_gesture() method.

For this:

```python
gs = GestureRecognizer.load_model(name="your-model-name.pkl.gz")  # automatic dict unpacking
gs.recognize_gesture()
```
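To make the save/load step above concrete, here is a minimal, self-contained sketch of the gzip + pickle persistence pattern that the `.pkl.gz` naming suggests `save_model` / `load_model` use. The stand-in classifier, bundle keys, and filename are assumptions for illustration only, not the project's exact format:

```python
import gzip
import pickle

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a stand-in classifier (the real pipeline trains its own models).
X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Save: pickle the model plus metadata into a gzip-compressed file,
# mirroring the ".pkl.gz" extension used by save_model.
with gzip.open("my-model.pkl.gz", "wb") as f:
    pickle.dump({"model": clf, "version": "0.0.1", "author": "ss"}, f)

# Load: read the dict back and unpack it, as load_model appears to do.
with gzip.open("my-model.pkl.gz", "rb") as f:
    bundle = pickle.load(f)

restored = bundle["model"]
print(restored.predict(X[:1]))
```

The advantage of bundling the estimator in a dict is that metadata (version, author) travels with the model file, which is presumably why `load_model` does the "automatic dict unpacking" mentioned above.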

Hope this helps!

Do you now have more datasets available?

Can you show an example of how to use recognize_gesture()?

I have trained on the ASL MNIST dataset from Kaggle and got 95 percent accuracy, but when I tried it with your images I didn't get proper output. Can you please help me resolve this?

Hi Karthick, could you share the link to the asl mnist dataset?

I'm getting an error at
le = loadClassifier('label_en (1).pkl') in the pipeline_final.ipynb file.
How do I resolve it?