Mohamed Fazil (fazildgr8)

Company: Robotics Engineer at Hello Robot Inc.

Location: San Francisco

Home Page: http://mohamedfazil.com

Mohamed Fazil's repositories

ros_autonomous_slam

A ROS package that uses the Navigation Stack to autonomously explore an unknown environment with the help of GMapping, constructing a map of the explored area. A path-planning algorithm from the Navigation Stack is then used on the newly generated map to reach the goal. The Gazebo simulator is used to simulate the TurtleBot3 Waffle Pi robot. Various algorithms have been integrated for autonomously exploring the region and building the map with the help of the 360-degree LiDAR sensor. Different environments can be swapped in the launch files to generate their maps.

Language: Python | License: MIT | Stargazers: 197 | Issues: 5 | Issues: 14
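As an aside, the frontier-seeking step that autonomous exploration of this kind relies on can be sketched in a few lines. This is a toy illustration, not code from the repository: it finds "frontier" cells (free cells bordering unknown space) in a small occupancy grid, using the ROS `nav_msgs/OccupancyGrid` value convention (-1 unknown, 0 free, 100 occupied).

```python
import numpy as np

def frontier_cells(grid):
    """Return (row, col) indices of free cells bordering unknown space.

    grid follows the ROS OccupancyGrid convention:
    -1 = unknown, 0 = free, 100 = occupied.
    """
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != 0:          # only free cells can be frontiers
                continue
            # 4-connected neighbours; any unknown neighbour makes a frontier
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == -1:
                    frontiers.append((r, c))
                    break
    return frontiers

# Toy 3x4 map: left half explored, right half unknown
grid = np.array([[0,   0, -1, -1],
                 [0,   0, -1, -1],
                 [0, 100, -1, -1]])
print(frontier_cells(grid))  # -> [(0, 1), (1, 1)]
```

A real explorer would cluster these cells and send a goal near the chosen frontier to the Navigation Stack.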

realsense_explorer_bot

An autonomous ground-exploration mobile robot with a 3-DOF manipulator and an Intel RealSense D435i camera, mounted on a tracked skid-steer drive base. The robot is capable of mapping spaces, exploration through RRT, SLAM, and 3D pose estimation of objects around it. This is a custom robot with a self-built URDF model, and it uses ROS's Navigation Stack.

Language: CMake | License: MIT | Stargazers: 89 | Issues: 1 | Issues: 0

realsense_bot

This is a ROS package for an Intel RealSense D435i with a 3-DOF manipulator robot that can be used for indoor mapping and localization of objects in the world frame, with the added advantage of the robot's dexterity. The 3-DOF manipulator is a self-built custom robot, and the URDF with the depth sensor is included. The package covers rosserial communication with Arduino nodes (or I2C with the Jetson Nano) to control the robot's joint states, along with the PCL pipelines required for autonomous mapping, localization, and tracking of objects in real time.

Language: Python | Stargazers: 9 | Issues: 1 | Issues: 0

AppliedDeepLearning

This repository consists of a set of Jupyter notebooks, each applying a different deep learning method. Each notebook walks through the work hierarchically, from scratch to the final results visualization. The methods include multilayer perceptrons, CNNs, GANs, autoencoders, and sequential and non-sequential deep learning models, applied to image classification, time-series prediction, recommendation systems, anomaly detection, and data analysis.

Language: Jupyter Notebook | Stargazers: 8 | Issues: 1 | Issues: 0

touchlessClockin_project

This is a computer-vision-based project developed amidst the pandemic, when touchless systems were required everywhere. It uses face recognition and deep learning to identify employees/members of an institution so they can check in or check out of the system without touching it. It features voice interaction and a sophisticated interface built with OpenCV. It is also integrated with Google's Firebase, a cloud-managed schemaless database that stores all the user data, time-stamped images for security, and face descriptors. A UI for the administrator, built on the Node-RED platform, can be used to monitor user check-in activity.

Language: Python | License: MIT | Stargazers: 3 | Issues: 1 | Issues: 0

virtual_pen_MNIST

This is a Python program that uses deep learning and image processing to create a virtual pen: the user hovers a tip of a configured colour in front of the webcam to write digits. A deep learning model trained on MNIST recognizes the digits. It uses Keras for deep learning and OpenCV for image processing.

Language: Python | License: MIT | Stargazers: 3 | Issues: 1 | Issues: 0
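To illustrate the idea, here is a minimal sketch of the two image-processing steps the description implies: thresholding the frame for the configured pen-tip colour, then locating the tip. The HSV range and helper names are assumptions for the example; the repository itself uses OpenCV (`cv2.inRange`), which the NumPy code below merely imitates.

```python
import numpy as np

def color_mask(hsv, lower, upper):
    """Boolean mask of pixels whose HSV values fall within [lower, upper]
    (a NumPy stand-in for cv2.inRange)."""
    return np.all((hsv >= np.asarray(lower)) & (hsv <= np.asarray(upper)), axis=-1)

def tip_position(mask):
    """Centroid (x, y) of the masked pixels -- where the virtual pen draws."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None              # pen tip not visible in this frame
    return (int(xs.mean()), int(ys.mean()))

# Toy 4x4 'frame' with a blue-ish pen tip at column 2, rows 1-2
frame = np.zeros((4, 4, 3), dtype=int)
frame[1, 2] = frame[2, 2] = (110, 200, 200)       # HSV of the tip colour
mask = color_mask(frame, (100, 150, 50), (130, 255, 255))
print(tip_position(mask))  # -> (2, 1)
```

Tracking this centroid frame to frame yields the stroke that is later cropped and fed to the MNIST classifier.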

myotron_wrist_control

This project proposes and delivers a novel approach to training and testing a Convolutional Neural Network (CNN) model for muscle-synergy-controlled prosthetic hands. The project focuses on providing precise control and real-time testing of prosthetic hand control for below-elbow amputees with independent control over the prosthetic fingers. Multiple EMG sensors placed on the forearm are used to control the prosthetic hand through the trained model. The CNN extracts features from raw EMG signals without the manual feature engineering that traditional methods perform on raw data. Furthermore, the trained model will be evaluated in real time within a Virtual Reality environment developed using the MuJoCo physics engine and the HTC Vive VR headset. The developed algorithm will be tested on ten healthy participants, and their data will be analyzed to show the performance of the presented controller.

Language: Jupyter Notebook | Stargazers: 1 | Issues: 1 | Issues: 0
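The windowing step that feeds raw EMG into such a CNN (in place of hand-crafted features) can be sketched as follows. The window and step sizes here are illustrative assumptions, not the project's actual parameters.

```python
import numpy as np

def segment_emg(signal, window, step):
    """Slice a multi-channel EMG recording into overlapping CNN input windows.

    signal: (n_samples, n_channels) array of raw EMG
    returns: (n_windows, window, n_channels) array
    """
    starts = range(0, signal.shape[0] - window + 1, step)
    return np.stack([signal[s:s + window] for s in starts])

emg = np.random.randn(1000, 8)            # 1000 samples from 8 electrodes
windows = segment_emg(emg, window=200, step=50)
print(windows.shape)  # -> (17, 200, 8)
```

Each window becomes one training example, letting the convolution layers learn the temporal features directly from the signal.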

PathPlanning

Commonly used path planning algorithms with animations.

Language: Python | License: MIT | Stargazers: 1 | Issues: 0 | Issues: 0
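One of the most common planners in such collections is A* on a grid. A minimal sketch (not taken from the repository):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; cells: 0 = free, 1 = obstacle."""
    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0, start, [start])]   # (f, g, node, path so far)
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(open_set,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc), path + [(nr, nc)]))
    return None                                  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(path)  # -> [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

The heuristic never overestimates the true cost here, so the first time the goal is popped the path is optimal.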

pattern_recognition

This repository contains various Jupyter notebooks I wrote while working on the MNIST dataset for my Pattern Recognition course. It uses different learning methods such as Support Vector Machines, neural networks, generative models, probabilistic graphical models, and linear discriminant functions. Most of the code uses Keras and TensorFlow.

Language: Jupyter Notebook | License: MIT | Stargazers: 1 | Issues: 1 | Issues: 0

PythonRobotics

Python sample codes for robotics algorithms.

Language: Jupyter Notebook | License: MIT | Stargazers: 1 | Issues: 0 | Issues: 0

VR_communication_mujoco200

This is the development repo for Virtual Reality rendering of the MuJoCo physics environment with the help of the OpenVR SDK and HTC Vive HMD hardware. On top of it, PubSub socket-based communication is introduced using the ZMQ library. Using this PubSub channel, any application outside MuJoCo can operate actuators inside the MuJoCo environment by simply publishing joint positions to the topic to which MuJoCo has subscribed.

Language: C | Stargazers: 0 | Issues: 1 | Issues: 0
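The joint-command traffic on such a PubSub channel can be illustrated without the sockets themselves. The topic name and JSON payload below are assumptions made for the sketch, not the repo's actual wire format; ZMQ would carry these as a multipart [topic, payload] message.

```python
import json

TOPIC = b"joint_cmd"   # hypothetical topic name, not taken from the repo

def encode_joint_msg(positions):
    """Serialize joint positions into a topic-prefixed PubSub frame,
    in the style of a ZMQ multipart [topic, payload] message."""
    payload = json.dumps({"qpos": positions}).encode()
    return [TOPIC, payload]

def decode_joint_msg(frames):
    """Inverse of encode_joint_msg: recover the joint positions on the
    subscriber (MuJoCo) side."""
    topic, payload = frames
    assert topic == TOPIC
    return json.loads(payload)["qpos"]

frames = encode_joint_msg([0.0, 1.57, -0.5])
print(decode_joint_msg(frames))  # -> [0.0, 1.57, -0.5]
```

With real sockets, the publisher would `send_multipart(frames)` and the MuJoCo side would subscribe to `TOPIC` and apply the decoded positions to its actuators each simulation step.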

wrist_control_CNN_AWEAR

This is the cumulative repository for the research project "Deep Learning Approach to Robotic Prosthetic Wrist Control using EMG Signals", done in the AWEAR Lab. It consists of all the data-processing pipeline code, the custom data-preprocessing library built for this project, and all the time-series CNN training Jupyter notebooks using data collected within the AWEAR Lab, University at Buffalo.

Language: Jupyter Notebook | Stargazers: 0 | Issues: 1 | Issues: 0

HomographyEstimation

A Python 2 tool for robust homography estimation via RANSAC.

Language: Python | License: MIT | Stargazers: 0 | Issues: 0 | Issues: 0

TouchlessClockin_A2IL

A touchless clock-in system: a web-based application using face recognition with deep learning, designed and deployed end to end. The web application was built in Angular and deployed through the Google Cloud Console, with a Python Flask backend that also runs on a GCP cloud server. It uses the Dlib deep learning library in Python and manages an unstructured cloud database through MongoDB.

Language: Python | Stargazers: 0 | Issues: 1 | Issues: 0

UB_gym_notifier

A Python-based web content monitoring app that notifies UB students of available gym session slots. UB gyms require prior booking of time slots, but getting one is tedious because, with reduced capacity, everything fills up by the start of the week. This app notifies you through the notify.run web client whenever a free slot opens up during the week, giving you an advantage over other people in booking slots. The Python script is deployed on a GCP Debian cloud machine and runs every 15 minutes.

Language: Jupyter Notebook | License: MIT | Stargazers: 0 | Issues: 0 | Issues: 0
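The change-detection core of such a monitor can be sketched as below. The "fully booked" marker string is a hypothetical placeholder, since the real page text isn't shown here; the actual script would fetch the page each cycle and push a notify.run message when this check fires.

```python
import hashlib

def snapshot(html):
    """Stable fingerprint of the page body, used to detect content changes."""
    return hashlib.sha256(html.encode()).hexdigest()

def slot_opened(prev_hash, html, busy_marker="No sessions available"):
    """True when the page changed AND the 'fully booked' marker is gone.

    busy_marker is a hypothetical string; the real page text may differ.
    """
    return snapshot(html) != prev_hash and busy_marker not in html

booked = "<p>No sessions available</p>"
free_page = "<p>Book now: Tue 7-8am</p>"
prev = snapshot(booked)
print(slot_opened(prev, booked))     # -> False (nothing changed)
print(slot_opened(prev, free_page))  # -> True  (a slot appeared)
```

Comparing hashes rather than full page bodies keeps the state stored between 15-minute runs tiny.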