Hands|On - Human-computer interface glove

Hands|On is a smart glove that allows you to interact with ANYTHING using hand gestures!

https://youtu.be/7kXrZtdo39k?t=8

The Hands|On glove captures the wearer's hand and finger movements. The data is collected by a variety of sensors on the glove and sent to a microcontroller, which processes it into the real-time state of the hand. Developers can build all kinds of applications on top of that hand-state information. For example, it could be used to:

- Translate sign language using an ML model. See the YouTube video above!
- Play video games and navigate virtual reality naturally with your hands.
- Manipulate 3D CAD models with your hands.
- Control robots.
- Perform remote surgery.
- Use your hand as a MIDI controller and make music.
- Use your hand as a computer mouse.
- Use any surface as a computer keyboard by 'swyping' on it.

The possibilities are endless! It's up to you, creative developers out there, to code great applications!

Most speaking people in the world don't know sign language. This creates a huge barrier between the deaf & mute community and the rest of the world, limiting everyday communication and the flow of opportunities across it. Today, for a deaf or mute person to get a job or do business, their company needs to hire an interpreter, and many companies prefer to hire a speaking employee so that the cost of an interpreter can be avoided.

This project is a serious effort to tackle the technical problems of sign language translation so that we can break down the communication barrier that exists between deaf & mute communities and the rest of the world. The requirements have been thought through carefully to ensure ease of use and adoption of this glove, so that people from deaf & mute communities can interact with the rest of the world and make their mark on it!

Hardware

The hardware on Hands|On should be capable of sensing and collecting all the necessary data about the hand's movements while being lightweight and portable. With these needs in mind, the hardware was designed as follows:

Sensors:

  • Flex sensors: these rad sensors (from the days of the Nintendo Power Glove) are thin strips of film-like material whose resistance changes with bending. We use 9 of these to measure the bending of the important finger joints; a sketch of how a raw reading converts to resistance follows this list.
  • 9-DOF IMU: measures the orientation of the hand in space.
  • Capacitive touch sensors: a fancy name for pieces of copper tape stuck to a wire. When the copper tape is touched, its capacitance is altered, producing a measurable response at the microcontroller.
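
A flex sensor is typically read as one half of a voltage divider feeding an analog pin (see the schematic under Drawings). Below is a minimal Python sketch of that conversion; the supply voltage, fixed resistor value, and ADC resolution are assumptions for illustration, not the project's actual values.

    # Convert a raw ADC reading from a flex sensor (wired as the top
    # half of a voltage divider) into a resistance value.
    VCC = 3.3        # assumed supply voltage (volts)
    R_FIXED = 22000  # assumed fixed resistor to ground (ohms)
    ADC_MAX = 1023   # assumed 10-bit ADC resolution

    def flex_resistance(raw):
        """Resistance of the flex sensor given a raw ADC reading."""
        v_out = raw * VCC / ADC_MAX
        # Divider: VCC -- R_flex -- [ADC pin] -- R_FIXED -- GND
        return R_FIXED * (VCC - v_out) / v_out

    print(flex_resistance(512))  # mid-scale reading -> roughly R_FIXED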

Data handling:

  • Teensy microcontroller: the Teensy is a very lightweight, small form-factor microcontroller with the power and capabilities of an Arduino. The best part is that it has 20 analog inputs, which is exactly what we need to plug all our flex sensors into. Also, many of these inputs are easily configurable as 'touch' inputs, so setting up our capacitive touch sensors in code was easy :)
  • HC-06 Bluetooth module: no one wants to be tethered to a computer by a USB cable all day. Unless you're into that. Bluetooth sends the data to the PC and makes the device portable.
  • Rechargeable battery.

Software

Hands|On's software should translate signs with high accuracy, and allow the user to train the machine with their own hand gestures. It should also capture and process data very fast, and have an intuitive interface. Below is the software design:

  • Serial communication: the Arduino code on the Teensy transmits the sensor data, after minimal processing, over a serial port (virtual COM port) to the computer. The data is picked up using the PySerial library in Python (a minimal reading loop is sketched after this list).
  • Real-time 3D hand rendering: the parsed and processed sensor data is used to show a live 3D animation of the user's hand on the screen. This is done using PyOpenGL. The animated hand's orientation changes and its fingers bend in sync with the user's hand!
  • Machine learning: hand signs and gestures are captured by saving the sensor data to a file. Classification is done with a support vector machine (SVM), via the fantastic Scikit-learn library (see the training sketch below).
  • Text to speech: the letters output by the machine learning algorithm are sent to a text-to-speech engine (Pyttsx) so that you can hear what is being signed! (A snippet follows below.)
  • Qt interface: the GUI is designed using PyQt5 because Qt is slick and extremely cross-platform (https://doc.qt.io/qt-5/supported-platforms.html). Qt Creator made it intuitive to get our GUI up and running quickly (a bare-bones window is sketched below).
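
As a rough illustration of the serial step, here is a minimal PySerial reading loop. It assumes the Teensy prints one comma-separated line of sensor values per sample; the port name and baud rate are placeholders, not the project's actual settings.

    import serial  # PySerial

    with serial.Serial('/dev/ttyACM0', 115200, timeout=1) as port:
        while True:
            line = port.readline().decode('ascii', errors='ignore').strip()
            if not line:
                continue  # timed out or empty line
            # One sample: the flex, IMU, and touch readings as floats
            values = [float(v) for v in line.split(',')]
            print(values)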
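
The classification step could look something like this sketch: load saved sensor samples, fit an SVM with Scikit-learn, and predict a letter for a new sample. The file name and feature layout are assumptions for illustration.

    import numpy as np
    from sklearn.svm import SVC

    # Hypothetical file: one row per saved sample, sensor features
    # followed by a numeric letter label in the last column.
    data = np.loadtxt('gestures.csv', delimiter=',')
    X, y = data[:, :-1], data[:, -1]

    clf = SVC(kernel='rbf', C=1.0)  # support vector classifier
    clf.fit(X, y)

    sample = X[0].reshape(1, -1)    # a single hand pose
    print(clf.predict(sample))      # predicted letter label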
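
Speaking the recognized letters takes only a few lines with Pyttsx (the same calls work in pyttsx3, its maintained successor):

    import pyttsx

    engine = pyttsx.init()
    engine.say('hello world')  # replace with the classifier's output
    engine.runAndWait()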
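
And a bare-bones PyQt5 window of the kind the GUI builds on; the window contents here are illustrative only, not the project's actual widgets.

    import sys
    from PyQt5.QtWidgets import QApplication, QLabel, QMainWindow

    app = QApplication(sys.argv)
    window = QMainWindow()
    window.setWindowTitle('Hands|On')
    window.setCentralWidget(QLabel('live hand view goes here'))
    window.show()
    sys.exit(app.exec_())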

Drawings

1. Flex Sensor Connection Schematic

CapstoneFinalReport.2-22.pdf

Report from when we submitted this for our university final year engineering project. Contains great details and images of every aspect of our project, in a (relatively) short number of pages! :D

Adobe Portable Document Format - 1.25 MB - 09/29/2016 at 23:50

Teensy Pinout 1.png

Pinout diagram for the Teensy 3.2 microcontroller (FRONT)

Portable Network Graphics (PNG) - 372.75 kB - 09/29/2016 at 23:49

Teensy Pinout 2.png

Pinout diagram for the Teensy 3.2 microcontroller (BACK)

Portable Network Graphics (PNG) - 270.56 kB - 09/29/2016 at 23:49

  • 1 × Teensy 3.2 microcontroller Small form-factor microcontroller that is compatible with Arduino code. It has a ton of GPIO pins!
  • 1 × Adafruit BNO055 9-DOF IMU An IMU with onboard hardware for fusing the accelerometer and gyroscope data. Provides highly accurate angular position, but absolute position has a lot of drift (as expected). Communicates via I2C.
  • 9 × 2.2" Flex sensors Flexible strip sensors whose resistance is a function of the degree of bending. Used to detect the bending of each finger joint.
  • 7 × Copper tape (touch sensors) Copper tape, when a person touches it, acts as a touch sensor because there is a measurable change in capacitance.
  • 1 × Skin tight glove A glove that curves around the hand and bends around the finger joints perfectly. Skin tight is necessary for accurate flexion measurements.


  • New glove, new capacitive sensors, and Windows compatible!

    Bhavesh Kakwani, 10/20/2017 at 21:13

    Three huge updates in one day! I finally got busy on the project again, because I signed up for a tech fair event that's happening tomorrow in Brampton, Ontario. Never underestimate the power of a deadline to move your project forward!

    First, I took everything off of the old, ripped-up glove. Then I de-soldered the wires from the 'capacitive touch sensors' (pieces of copper tape) and got rid of both; the wires were too stiff and the copper tape was battered up. I replaced them with fresh pieces of copper tape and new stranded wire in place of the stiff single-core wire. Stranded wire is much more flexible, so there's far less chance of the copper tape snapping off the wire:

    Soldering stranded wires onto perf-board
    The flexible stranded wires are blue; the old stiff wires are red

    After all the new wire and copper tape was soldered on, it was time to test things out using the Hands|On GUI. After a small session of downloading the missing Python libraries on my Windows machine, I was able to run the application on Windows! See the GUI in action:

    Hands|On GUI on Windows
    The Hands|On GUI running on Windows

    And check out the new glove!:

    New Glove, No Rips
    Fresh new glove with no rips or exposed fingertips

