What's the problem?
Unfortunately, whilst the voltages for fist-clenched and non-fist-clenched (relaxed hand) are likely to be significantly different, the voltages for other hand actions (e.g. shaking hands, holding a pen, gripping a mouse) might not be so different from the voltages we get for fist-clenched! So first of all we need to see what kind of voltages we get from the EMG sensor for different hand positions.
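Just to make that concrete, here is the sort of Arduino-style sketch I have in mind for the Feather for this first bit of data gathering: stream the raw EMG readings over serial while trying different hand positions. The analog pin (A0), the 3.3 V reference and the sample rate are just assumptions for illustration, not the final design.

```cpp
// Rough data-gathering sketch: read the EMG sensor on A0 and stream the
// voltages over USB serial so we can compare relaxed hand, clenched fist,
// holding a pen, gripping a mouse, etc. Pin and reference voltage are assumed.

const int EMG_PIN = A0;          // assumed analog input wired to the EMG sensor
const float VREF = 3.3;          // assumed ADC reference voltage on the Feather

void setup() {
  Serial.begin(115200);
  analogReadResolution(12);      // 12-bit reads on SAMD-based Feathers (M0/M4)
}

void loop() {
  int raw = analogRead(EMG_PIN);
  float volts = raw * VREF / 4095.0;
  Serial.println(volts, 3);      // plain numbers, easy to plot or log later
  delay(10);                     // roughly 100 samples per second
}
```

Watching this output with the Arduino serial plotter (or logging it to a file) should show fairly quickly whether the different hand positions are actually separable by voltage.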
Additionally, different people will produce different voltages! So we need to try this for several people too. Well, it should be a fun experiment to collect that data, and it might be useful for others as well!
Training the HelpMe!
Now, of course, there is another thing we could do! We could get each HelpMe! to learn the correct fist-clenched voltages for each individual user. To be honest, I think this is the only approach, since there is no one-size-fits-all set of fist-clenched voltages. So we will have to do this with software on the Feather plus the smartphone app (to direct the user what to do, e.g. "clench your fist" for training purposes, and to instruct the Feather which code to run). We can save the results to the MicroSD on the Feather.
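To give a flavour of how that training step might look, here is a very rough sketch of the idea, with the analog pin, the SD chip-select pin and the file name all assumed for the example. In the real thing the smartphone app would drive the prompts; here they just go over serial.

```cpp
// Illustrative per-user training sketch (all pins and file names assumed):
// average a few seconds of readings with the hand relaxed, then clenched,
// put a decision threshold halfway between the two, and save it to MicroSD.

#include <SPI.h>
#include <SD.h>

const int EMG_PIN = A0;          // assumed EMG analog input
const int SD_CS   = 4;           // assumed MicroSD chip-select pin

// Average readings for whatever the hand is doing over the next `ms` milliseconds.
float sampleAverage(unsigned long ms) {
  unsigned long start = millis();
  unsigned long count = 0;
  float sum = 0;
  while (millis() - start < ms) {
    sum += analogRead(EMG_PIN);
    count++;
    delay(10);
  }
  return sum / count;
}

void setup() {
  Serial.begin(115200);
  while (!Serial) { }            // wait for the serial monitor
  if (!SD.begin(SD_CS)) {
    Serial.println("MicroSD not found");
    while (true) { }
  }

  Serial.println("Relax your hand...");
  float relaxed = sampleAverage(3000);

  Serial.println("Now clench your fist...");
  float clenched = sampleAverage(3000);

  // Simple midpoint threshold between the two states for this user.
  float threshold = (relaxed + clenched) / 2.0;

  File f = SD.open("calib.txt", FILE_WRITE);
  if (f) {
    f.println(threshold);
    f.close();
    Serial.print("Saved threshold: ");
    Serial.println(threshold);
  }
}

void loop() {
  // The main HelpMe! code would compare live readings against the saved
  // threshold to decide whether the fist is clenched.
}
```

A single midpoint threshold is obviously the simplest possible scheme; whether it is good enough will depend on what the data-gathering experiments above show.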
I guess everyone is thinking: isn't there some machine learning thing we could do instead? Yes, I did think of that. We could collect a large dataset of voltages for different hand positions. Maybe it is something to think about in the future.