At our first meeting, we defined the scope of the project: we eliminated some of the proposed specific objectives and set goals achievable within the time frame of this challenge, while never losing sight of the general purpose of the project, which is "controlling a house through EEG, using the motor imagery method, to improve the quality of life of people with disabilities".
The elimination of those objectives was due to their complexity and the limited number of people who make up Team Brainmotic. We decided to prioritize three key aspects of the project that will allow us to reach the main objective:
- The capture and processing of the EEG signal for pattern recognition.
- The user interface (UI).
- Control of some elements in the common areas (kitchen, bathroom, and bedroom) of the user's home.
Considering these aspects, we began to reshape the project and sketched a block diagram that explains it.
As shown in the figure, each of these aspects is assigned to a member of Team Brainmotic:
- Team EEG: led by the engineer Daniel Felipe Valencia, who is responsible for capturing the EEG signals, processing them, and recognizing their patterns.
- Team User Interface: led by the student María Camila Guarín, who is in charge of designing the user interface.
- Team Assistive Unit: led by the student Daniel Poveda, who is responsible for creating a generic unit that will be distributed throughout the common areas of the user's home.
The Brainmotic team is made up of people from the Universidad de San Buenaventura-Cali, Colombia. :)
To explain the behavior of the system and the order of the functional blocks that compose it, start with the block on the left, which represents a user wearing a helmet fitted with electrodes.
The system begins with the acquisition of the brain's bioelectric activity (EEG). The user wears a helmet with eight (8) electrodes connected to an ADS1299 (an analog-to-digital converter designed for this purpose). The ADS1299 is wired to the RPI2 and uses the SPI protocol to send the captured EEG signal. Once the signal reaches the RPI2, it is analyzed by the TensorFlow software embedded on the RPI2.
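To give an idea of what the RPI2 receives over SPI: the ADS1299 outputs each channel as a 24-bit two's-complement word. A minimal sketch of decoding those bytes into microvolts is shown below; the reference voltage and gain values are assumptions (the datasheet defaults), not our final configuration.

```python
# Sketch of decoding raw ADS1299 samples received over SPI on the RPI2.
# VREF and GAIN below are assumed values (4.5 V reference, PGA gain 24,
# per the ADS1299 datasheet defaults) -- adjust for the real setup.

VREF = 4.5   # reference voltage in volts (assumption)
GAIN = 24    # programmable gain (assumption)

def decode_sample(b2, b1, b0):
    """Convert three SPI bytes (MSB first) into a signed 24-bit integer."""
    raw = (b2 << 16) | (b1 << 8) | b0
    if raw & 0x800000:        # sign bit set -> negative two's-complement value
        raw -= 1 << 24
    return raw

def raw_to_microvolts(raw):
    """Scale a raw ADS1299 code to microvolts."""
    lsb = VREF / (GAIN * (2**23 - 1))   # volts per code
    return raw * lsb * 1e6
```

With these defaults, the full-scale positive code maps to 4.5 V / 24 = 187,500 µV.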
The block representing the RPI2 contains a block representing the UI, which the user will operate with his or her "brain". This UI will display a menu of the elements to be controlled, depending on the room where the assistive unit is installed. To exchange information between the assistive units and the RPI2, both use WiFi, implemented with an ESP8266 module. The data obtained from the assistive units are read by the RPI2 and displayed on the system's UI.
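Since motor imagery gives the user only a few reliable mental commands, one plausible way to drive such a menu is a scanning interface: one command advances the highlight, another confirms the selection. This is only our sketch of that idea, with placeholder option names, not the project's final UI code.

```python
# Hypothetical scanning menu for a low-command BCI: "advance" cycles
# through the room's controllable elements, "select" confirms one.
# Option names here are placeholders, not the project's identifiers.

class ScanningMenu:
    def __init__(self, options):
        self.options = options
        self.index = 0

    def advance(self):
        """Move the highlight to the next option (wraps around)."""
        self.index = (self.index + 1) % len(self.options)
        return self.options[self.index]

    def select(self):
        """Return the currently highlighted option."""
        return self.options[self.index]
```

For example, a bedroom menu could be `ScanningMenu(["light", "outlet", "back"])`, so two "advance" commands followed by "select" would choose `back`.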
Each assistive unit distributed around the house will have a PSoC microcontroller, a lux (light) sensor, a temperature sensor, and two circuits for switching a light bulb or an electrical outlet on and off.
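A status message an assistive unit might report back to the RPI2 could look like the sketch below: the two sensor readings plus the state of its two switching circuits. The field names and JSON format are our own placeholders, not a defined protocol.

```python
import json

# Hypothetical status message from an assistive unit: lux and temperature
# readings plus the on/off state of its two switching circuits.
# Field names are placeholders, not an agreed-upon protocol.

def build_status(unit_id, lux, temp_c, light_on, outlet_on):
    """Serialize an assistive unit's state as a JSON string."""
    return json.dumps({
        "unit": unit_id,
        "lux": lux,
        "temp_c": temp_c,
        "light": int(light_on),
        "outlet": int(outlet_on),
    })
```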
After many meetings, we improved the design of the system by proposing changes to the functional block diagram.
This scheme performs the same tasks as the first layout. We decided to group the EEG processing and the UI into a block called the "central unit". This central unit (CU) is distributed across two RPIs, dividing the tasks so that a single RPI is not overloaded with all the work. We decided to use the Raspbian Jessie Lite OS on both RPIs. For simplicity, we replaced the RPI's ESP8266 with a TP-LINK Ethernet modem, which helps the CU communicate with the assistive units; in addition, we will use the default router in the user's home to facilitate the connection of all modules (CU and assistive units).
To make the system work, Team Brainmotic, taking into account the three (3) aspects already mentioned, defined several goals for achieving the primary objective of the project. These goals are:
- Develop and evaluate the UI.
- Set the positions of the helmet's electrodes, acquire EEG signals, and define the patterns with which we will work.
- Develop an assistive unit that can be placed in any room of the user's home.
- Develop the communication between the RPI2, ThingSpeak, and the assistive units.
- Install ThingSpeak and MySQL on the RPI2.
- Install TensorFlow on the RPI3.
- Evaluate the system.
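On the RPI2–ThingSpeak–assistive-unit goal: ThingSpeak exposes a simple REST API where a GET request to its `/update` endpoint with a channel's write API key logs one data point. The sketch below only builds such a request URL; the API key and the mapping of sensors to ThingSpeak fields are placeholders for illustration.

```python
from urllib.parse import urlencode

# Sketch of building a ThingSpeak channel-update URL. The key and the
# field assignments (field1 = lux, field2 = temperature) are placeholder
# assumptions, not the project's actual channel layout.

THINGSPEAK_UPDATE = "https://api.thingspeak.com/update"

def build_update_url(api_key, **fields):
    """Build a ThingSpeak update URL from a write key and field values."""
    params = {"api_key": api_key}
    params.update(fields)
    return THINGSPEAK_UPDATE + "?" + urlencode(params)
```

For example, `build_update_url("MYKEY", field1=120, field2=22.5)` would log a lux reading of 120 and a temperature of 22.5 °C to the channel's first two fields.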