To recognize gestures of the hand - bending the fingers, touching palms, and interlacing fingers - these motions first have to be captured as sensor data. Systems like the Microsoft Kinect or various infrared cameras can track hand position, but a wearable that functions fully on its own must rely on other types of sensing. For accessibility, it also helps that hardware sensors are cheap and readily available.
What sensors should I use?
A number of sensors could work, such as tilt sensors, photocells, or capacitive touch fabric. I tested several sensors and setups before deciding to build the first prototype with photocells. But try various combinations of sensors! What gestures can be identified by combining a photocell reading with knowing which finger is bent? (A thumbs up? Or perhaps a (half) heart shape...)
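As a rough sketch of how per-finger readings might map to named gestures, here is a minimal example. The threshold value, gesture names, and the assumption that a covered photocell reads lower are all illustrative; real calibration depends on your sensors and glove.

```cpp
#include <array>
#include <string>

// One analog reading per finger: thumb, index, middle, ring, pinky.
// Assumption: a photocell sewn along a finger reads darker (lower)
// when the finger curls over it, so values below the threshold
// count as "bent". 300 is a placeholder; calibrate for your setup.
constexpr int kBentThreshold = 300;

std::array<bool, 5> fingersBent(const std::array<int, 5>& readings) {
    std::array<bool, 5> bent{};
    for (int i = 0; i < 5; ++i) {
        bent[i] = readings[i] < kBentThreshold;
    }
    return bent;
}

// Map a bent/straight pattern to a gesture name. Only a couple of
// illustrative patterns are handled; extend with your own.
std::string classifyGesture(const std::array<bool, 5>& bent) {
    bool thumbStraight = !bent[0];
    bool othersBent = bent[1] && bent[2] && bent[3] && bent[4];
    if (thumbStraight && othersBent) return "thumbs up";
    bool allStraight =
        !bent[0] && !bent[1] && !bent[2] && !bent[3] && !bent[4];
    if (allStraight) return "open palm";
    return "unknown";
}
```

On a microcontroller you would feed this from analog pin reads in the main loop; keeping the classification in a plain function like this makes it easy to test patterns off the glove first.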
Soft Circuits
I decided to build much of the design around the natural contours of the glove (in this case, a woven knit glove with a lot of stretch). Other projects tend to use 3D printing to encase the hand, while gloves like The Mimu Gloves create a complete circuit glove from scratch, sewing and assembling both the machine-sewn circuit and the glove pieces in the final steps. For my design, I chose easily found household materials so that this setup can be recreated easily and is as accessible as possible. Sewing the circuit into a glove also keeps it lightweight and allows for more flexibility when forming gestures and movements with the hand.