
Software

A project log for Wearable GPS

Neoprene Google Maps-enabled headband with haptic GPS directions.

Andre • 10/16/2017 at 05:17

The Arduino software was pretty straightforward. I combined tutorials and example code for each of the components from various sources into the final sketch (available for download in the project zip). The Adafruit libraries for the Bluefruit ecosystem, as well as for their LSM303DLHC breakout, made light work of the respective features in the sketch. The boost converter needs no code at all; it works purely because of how it's arranged on the PCB.

Interfacing with the demultiplexer was also easy enough. Its four select lines A, B, C, and D are routed to A2, A1, A4, and A5 on the Feather, respectively, and its enable pin, which must be held low for the demux to work, is connected to A3. Driving the select lines with the binary value of the channel number connects one of the 16 motors to the 5V output of the boost converter. The sketch is a Frankenstein's monster of various example code and a few of my own algorithms, so I apologize if its readability is less than crystal clear.
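To make the binary addressing concrete, here is a minimal sketch of the demux logic (not the actual project code): the channel number's four bits are placed on the select lines, and the active-low enable is pulled down. The `DemuxState` struct and `selectMotor` name are mine, purely for illustration; on the Feather these levels would just become `digitalWrite` calls to A2, A1, A4, A5, and A3.

```cpp
#include <cassert>
#include <cstdint>

// Illustrative model of the demux addressing described above. Select lines
// A..D carry the channel number in binary (A is the least significant bit);
// the enable line is active-low. Wiring per the post: A -> A2, B -> A1,
// C -> A4, D -> A5, enable -> A3.
struct DemuxState {
    bool a, b, c, d;  // select-line levels, LSB first
    bool enable;      // LOW (false) enables the demux
};

// Compute the line levels that route channel `motor` (0-15) to the 5V rail.
DemuxState selectMotor(uint8_t motor) {
    DemuxState s;
    s.a = motor & 0x1;
    s.b = motor & 0x2;
    s.c = motor & 0x4;
    s.d = motor & 0x8;
    s.enable = false;  // pull enable low to activate the selected output
    return s;
}
```

Stepping `motor` from 0 to 15 walks through all sixteen outputs, which is exactly the "binary fashion" described above.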

Android is of course compatible with Bluetooth 4.0, but I'm more comfortable with Xcode than Android Studio, so while I'm still alpha testing I've only made an iPhone app. The app is a tab-based application with three sections. The first tab holds a Google Maps GMSMapView and a search bar, and this is where the bulk of user interaction takes place. A destination can be typed or dictated into the search bar; an HTTP request to Google Maps then returns a precise, coordinate-based route from the user's current location to the queried destination. The bearing to the next straight-line waypoint (for example, the closest intersection along the route) is sent to the WGPS headband. The WGPS combines this bearing with the orientation of the wearer's head, read from the LSM sensor, and pulses the motor closest to that bearing. Once the wearer comes within 15 meters of the current via-point, the app switches to the next one, and the process repeats until the final destination is reached.
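The two calculations in that loop can be sketched in a few lines. This is an assumption-laden illustration, not the shipped code: it assumes motor 0 points straight ahead with indices increasing clockwise every 22.5° (360°/16), and uses a standard haversine distance for the 15 m via-point check. The function names are hypothetical.

```cpp
#include <cassert>
#include <cmath>

// Given the bearing to the next via-point (from the app) and the wearer's
// head heading (from the LSM303), pick which of the 16 motors to pulse.
// Assumes motor 0 faces straight ahead, indices increasing clockwise.
int closestMotor(double routeBearingDeg, double headHeadingDeg) {
    // Bearing relative to where the wearer is facing, normalized to [0, 360)
    double rel = std::fmod(routeBearingDeg - headHeadingDeg, 360.0);
    if (rel < 0) rel += 360.0;
    // 360 / 16 motors = 22.5 degrees per motor; round to the nearest one
    return static_cast<int>(std::lround(rel / 22.5)) % 16;
}

// Great-circle distance in meters (haversine), for the "within 15 m of the
// via-point" test that advances the route.
double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
    const double kR = 6371000.0;  // mean Earth radius, meters
    const double kToRad = 3.14159265358979323846 / 180.0;
    double dLat = (lat2 - lat1) * kToRad;
    double dLon = (lon2 - lon1) * kToRad;
    double a = std::sin(dLat / 2) * std::sin(dLat / 2) +
               std::cos(lat1 * kToRad) * std::cos(lat2 * kToRad) *
               std::sin(dLon / 2) * std::sin(dLon / 2);
    return 2.0 * kR * std::asin(std::sqrt(a));
}
```

Note the modular arithmetic in `closestMotor`: a via-point at bearing 350° while the wearer faces 340° correctly maps to the forward motor rather than wrapping around the back of the head.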

Since the target audience for this device is the visually impaired, the app uses Siri's voice to give occasional status updates about the route, such as when big intersections are reached, and the estimated time remaining until arrival.

The second tab is simply a table of nearby BLE devices that match the Feather's UUID, used to detect and connect to the WGPS. The final tab has a circular slider from which any heading from 0° to 360° can be selected and beamed up to the device for debugging purposes.
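On the firmware side, a debug packet like that only needs a tiny parser. The wire format below is an assumption on my part (the real one is in the project zip): the heading sent over the Bluefruit UART as an ASCII integer, e.g. "270\n". The `parseHeading` function is hypothetical.

```cpp
#include <cassert>
#include <cstdlib>

// Parse an ASCII heading packet (e.g. "270\n") into an integer 0-359.
// Accepts 360 as an alias for 0, since the slider spans 0-360 degrees.
// Returns true and writes *headingOut on success, false on junk input.
bool parseHeading(const char* packet, int* headingOut) {
    char* end = nullptr;
    long value = std::strtol(packet, &end, 10);
    if (end == packet) return false;           // no digits at all
    if (value < 0 || value > 360) return false;
    *headingOut = static_cast<int>(value % 360);
    return true;
}
```

The parsed heading can then be fed straight into the same motor-selection path the navigation mode uses, which is what makes this tab handy for testing the headband without walking a route.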

The result is an intuitive and responsive interface for interpreting GPS directions that looks and feels very cool.

For those of you interested in making your own Swift app for use with the Adafruit Feather, here are some tutorials that I found very useful to get me going:

How to use the HM10: Connect your Arduino to an iPhone with Bluetooth

Create a Bluetooth LE App for iOS

Additionally, feel free to use any of my Swift code, downloadable from the Files section.
