Our final goal is an autonomous device, but for the weekend the software development was divided into two blocks: server and client. The server software runs inside the Eye of Horus, while the client runs on a laptop computer.
Server
This part of the software runs on the VoCore module, a coin-sized Linux computer suitable for many applications. The module acts as a server running OpenWrt, a Linux distribution for embedded devices.
The server software is in charge of:
- Capturing video from the camera
- Streaming the data over WiFi using a lightweight webserver
Client
This part is in charge of:
- Receiving the video stream (a rough sketch of this step follows the list)
- Processing the images of the eye to detect the center of the pupil
- Providing a user interface to calibrate the system
- Controlling the mouse on the laptop computer according to the coordinates dictated by the eye
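As a rough illustration of the receiving step, the sketch below draws the incoming stream into a canvas so its pixels can be read from the browser. The stream URL, the frame size, and the use of an MJPEG endpoint fed into an image element are assumptions for illustration only; the actual client may receive the video differently.

```typescript
// Hypothetical sketch: pull frames from an MJPEG stream served by the device
// and expose their raw pixels for further processing.
const STREAM_URL = "http://192.168.1.1:8080/stream"; // assumed address of the Eye of Horus
const WIDTH = 640;
const HEIGHT = 480;

const stream = new Image();
stream.crossOrigin = "anonymous"; // assumes the server sends CORS headers
stream.src = STREAM_URL;

const canvas = document.createElement("canvas");
canvas.width = WIDTH;
canvas.height = HEIGHT;
const ctx = canvas.getContext("2d")!;

// Copy the current frame into the canvas and return it as RGBA pixel data,
// ready to be handed to the segmentation step.
function grabFrame(): ImageData {
  ctx.drawImage(stream, 0, 0, WIDTH, HEIGHT);
  return ctx.getImageData(0, 0, WIDTH, HEIGHT);
}
```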
The video stream is processed on the client in real time using HTML5. A segmentation library was developed from scratch to threshold the images and analyze their morphology, detecting the pupil and computing its center of mass.
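The core idea is simple: keep the pixels whose brightness falls inside the selected range (the pupil is usually the darkest region of the eye image) and take the centroid of the surviving pixels. The snippet below is a minimal sketch of that idea, not the actual library; the `lo` and `hi` parameters play the role of the range selectors mentioned further down.

```typescript
// Minimal sketch of thresholding + center of mass (not the actual library).
function pupilCenter(frame: ImageData, lo: number, hi: number): { x: number; y: number } | null {
  const { data, width, height } = frame;
  let sumX = 0;
  let sumY = 0;
  let count = 0;

  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      // Approximate luminance from the RGBA pixel.
      const lum = 0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2];
      if (lum >= lo && lum <= hi) {
        sumX += x;
        sumY += y;
        count++;
      }
    }
  }

  // Nothing inside the range: the calibration is off or the eye is closed.
  if (count === 0) return null;
  return { x: sumX / count, y: sumY / count };
}
```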
At this point, the user has a mouse controlled by their eye and can interact with any third-party software installed on the computer.
A demo of the client software recognizing the center of the pupil can be seen on our website. You can play with the range selectors to see how they affect the recognition, and reach optimal calibration when only the pupil is highlighted. The system takes the center of gravity of the highlighted area as the coordinates where the eye is pointing. Calibration is very important and may depend on the illumination; that is why the Eye of Horus has 4 LEDs illuminating the area of the eye. The pupil detection system has proven quite robust with the illumination provided by the device.
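To turn that center of gravity into cursor coordinates, some mapping from eye space to screen space is needed. The calibration procedure is not described in detail here, so the sketch below assumes a simple linear mapping built from two calibration points (the pupil center measured while looking at opposite corners of the screen), purely as an illustration.

```typescript
// Hypothetical linear calibration: map measured pupil centers to screen positions.
interface Point { x: number; y: number; }

function makeMapper(pupilTL: Point, pupilBR: Point, screenW: number, screenH: number) {
  return (pupil: Point): Point => ({
    x: ((pupil.x - pupilTL.x) / (pupilBR.x - pupilTL.x)) * screenW,
    y: ((pupil.y - pupilTL.y) / (pupilBR.y - pupilTL.y)) * screenH,
  });
}

// Usage: record the pupil center at each calibration target, then map new frames.
const toScreen = makeMapper({ x: 210, y: 140 }, { x: 430, y: 320 }, 1920, 1080);
const cursor = toScreen({ x: 320, y: 230 }); // roughly the middle of the screen
```

The resulting screen coordinates would then be handed to whatever part of the client actually moves the system cursor.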