Here are some more details on how the robot will interact with Google Spreadsheets/Forms/Scripts. First, the Android app will run something like a loop or an event handler that receives commands from, and sends data, sensory input, etc. back to, the Google spreadsheet, form, or app (any of these are possibilities; to start out I will probably use a spreadsheet to receive data). For starters, an acknowledge signal will be passed back up to Google. Images and sounds (maybe an audio stream) will go up too, and if I ever put other sensors like transducers on the robot, that data as well.

The robot can receive a set of commands such as: go left, right, forward, back, up, down, and take a picture (or I might have a mode where it streams pictures or video; we will see). I also plan on trying to let it listen with the speech recognition API and send the results of what it heard back up to the cloud brain.

More to come, including code examples, as soon as I can get a break from a giant project I have at work...
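In the meantime, just to give a flavor of what I mean, here is a minimal sketch of that command loop in plain Java. It assumes the spreadsheet is published to the web so the app can pull it down as CSV, and that the acknowledgment goes back up by submitting a linked Google Form. The URLs, the form field ID, and the exact command strings are placeholders, not the final design.

```java
// A rough sketch of the robot-side command loop: poll the published sheet for
// the latest command, act on it, and post an "ack" back through a Google Form.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class CommandLoop implements Runnable {

    // Placeholder endpoints -- substitute the real published-sheet CSV link
    // and the real Google Form "formResponse" URL and entry ID.
    private static final String COMMAND_CSV_URL =
            "https://docs.google.com/spreadsheets/d/<SHEET_ID>/export?format=csv";
    private static final String ACK_FORM_URL =
            "https://docs.google.com/forms/d/e/<FORM_ID>/formResponse";
    private static final String ACK_FIELD = "entry.1234567890";

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                String command = fetchLatestCommand();
                if (command != null) {
                    dispatch(command);
                    postAck("ack:" + command);
                }
                Thread.sleep(2000); // poll every couple of seconds
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } catch (Exception e) {
                e.printStackTrace(); // keep looping on transient network errors
            }
        }
    }

    // Read the published sheet as CSV and return the command in the last row.
    private String fetchLatestCommand() throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(COMMAND_CSV_URL).openConnection();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line, last = null;
            while ((line = in.readLine()) != null) {
                last = line;
            }
            return last == null ? null : last.split(",")[0].trim();
        } finally {
            conn.disconnect();
        }
    }

    // Map command strings from the sheet to robot actions (stubs for now).
    private void dispatch(String command) {
        switch (command.toLowerCase()) {
            case "left":    /* turn left */        break;
            case "right":   /* turn right */       break;
            case "forward": /* drive forward */    break;
            case "back":    /* drive backward */   break;
            case "up":      /* raise/tilt up */    break;
            case "down":    /* lower/tilt down */  break;
            case "picture": /* take a picture */   break;
            default:        /* unknown: ignore */  break;
        }
    }

    // Send an acknowledgment back up by submitting the linked Google Form.
    private void postAck(String message) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(ACK_FORM_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        String body = ACK_FIELD + "=" + URLEncoder.encode(message, "UTF-8");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes("UTF-8"));
        }
        conn.getResponseCode(); // forces the request to actually be sent
        conn.disconnect();
    }
}
```

On the phone this would get kicked off with something like `new Thread(new CommandLoop()).start()`, since network calls can't run on the Android main thread, and the app needs the INTERNET permission. The same pattern could later carry pictures, audio, and the speech recognition results back up to the cloud brain.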