The HTC Vive's position tracking system is a thing of absolute beauty. Instead of trying to describe it here, check out this description. All the credit for this design goes to Valve. They're generously licensing the technology to anyone, free of charge, and they've gone further to enable the community by designing and making available tracking hardware that can trivially be connected to a microcontroller. Big kudos to Alan Yates for designing this system, and to Valve for opening it up.
The first TS3633-CM1 parts went up for sale late last week and I was fortunate enough to get my hands on a 10-pack. They showed up on my doorstep this afternoon. Time to get building!
A couple of important considerations in using this sensor for determining position are 1) you need to measure the spacing of pulses with very high precision, and 2) you need to be able to read from a bunch of the sensors at once (5 are needed to get full 3D position and orientation). Of the various Arduino boards in my toolbox, the Due looked the best to me. It has an 84 MHz clock and can measure time at 42 MHz (~24 ns per timer tick). It also has a ton of digital IO pins, all of which support interrupts.
With that in mind, I've written a first proof-of-concept program to read a single sensor and report basic position data back out the serial port. The code is ugly, but it works great. The position information is incredibly stable, and the precision appears to be sub-centimeter even when the lighthouse is ~15 feet (4.5 m) away.
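For the curious, the gist of the approach (this is a stripped-down sketch, not the actual code, and the pin choice is arbitrary) is an interrupt on the sensor's envelope output, timestamped against a free-running 42 MHz timer:

```cpp
// Minimal sketch of interrupt-driven pulse timing on an Arduino Due.
// Assumes the TS3633-CM1 envelope output is wired to digital pin 2 (any
// interrupt-capable pin works). Timer Counter 0, channel 0 free-runs at
// MCK/2 = 42 MHz, so one tick is ~24 ns.

const int ENVELOPE_PIN = 2;

volatile uint32_t lastRise = 0;      // tick count at the most recent rising edge
volatile uint32_t riseSpacing = 0;   // ticks between the last two rising edges
volatile uint32_t pulseWidth = 0;    // ticks the envelope stayed high
volatile bool newPulse = false;

void setupTimer() {
  pmc_enable_periph_clk(ID_TC0);                        // clock the TC0 peripheral
  TC_Configure(TC0, 0, TC_CMR_TCCLKS_TIMER_CLOCK1 |     // MCK/2 = 42 MHz
                       TC_CMR_WAVE | TC_CMR_WAVSEL_UP); // free-running up-counter
  TC_Start(TC0, 0);
}

inline uint32_t timerTicks() {
  return TC0->TC_CHANNEL[0].TC_CV;                      // current 42 MHz tick count
}

void envelopeISR() {
  uint32_t now = timerTicks();
  if (digitalRead(ENVELOPE_PIN) == HIGH) {
    riseSpacing = now - lastRise;                       // time since previous pulse
    lastRise = now;
  } else {
    pulseWidth = now - lastRise;                        // length of this pulse
    newPulse = true;
  }
}

void setup() {
  Serial.begin(115200);
  pinMode(ENVELOPE_PIN, INPUT);
  setupTimer();
  attachInterrupt(ENVELOPE_PIN, envelopeISR, CHANGE);   // on the Due, any pin works
}

void loop() {
  if (newPulse) {
    noInterrupts();                                     // grab a consistent snapshot
    uint32_t spacing = riseSpacing;
    uint32_t width = pulseWidth;
    newPulse = false;
    interrupts();
    // Long pulses are lighthouse sync flashes; the short sweep pulse's offset
    // from the preceding sync is what encodes the angle.
    Serial.print(spacing);
    Serial.print('\t');
    Serial.println(width);
  }
}
```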
Discussions
Boo Ya!
I'm a little stuck right now on implementing the algorithms for getting location. Some of the other examples I've seen in the press use 2 lighthouses and triangulate between them. I really want to get positioning working with a single lighthouse and build from there. (When triangulating with 2 lighthouses, you can't calculate a position in the likely case that one of them is obscured. Solving position from a single lighthouse gives you either redundancy or a larger area in which you can track an object.)
Fundamentally, the problem is the same as the perspective-n-point problem (https://en.wikipedia.org/wiki/Perspective-n-Point). EPnP is the gold standard algorithm for solving it. There's a nice implementation that uses OpenCV, so my thinking is to first get that code working with data I capture and process offline. Then, once I've got something working, start the job of porting the code to a microcontroller.
That whole "getting the code working" appears to be the hard part. The implementation using OpenCV obviously expects that you're using a camera. But, the lighthouse isn't quite the same as a camera. It let's us know (at high precision) the angular position to each sensor. I think it's valid to consider the two angles (inclination and azimuth) as x and y coordinates of an image that is rather fish-eyed. OpenCV should be able to handle that fairly well. The next step is to get the "camera" intrinsic and extrinsic parameters for the lighthouse. (Getting these is where I'm at right now). OpenCV can auto-generate these using one of two image types. One is a circle, and the other is a chessboard. I think the chessboard could be simulated fairly easily given a sufficient number of sensors (I'm not sure I have enough TS3633-CM1 modules-- maybe) in a grid pattern.
Once you have the intrinsic and extrinsic parameters for the lighthouse "camera", you should be able to fairly easily run EPnP, which calculates the pose of the tracked object relative to the camera. It's then easy to flip that transform around to get the camera position relative to the tracked object.
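The EPnP step itself should then be just a couple of OpenCV calls. Something like this (untested, and it assumes the unit-focal-length image points from the sketch above):

```cpp
// Sketch of the EPnP call. objectPoints are the known sensor locations on the
// tracked object, in the object's own frame; imagePoints are the corresponding
// unit-focal-length points, in the same order.
#include <opencv2/calib3d.hpp>
#include <vector>

bool solveLighthousePose(const std::vector<cv::Point3d>& objectPoints,
                         const std::vector<cv::Point2d>& imagePoints,
                         cv::Mat& R, cv::Mat& t) {
  cv::Mat K = cv::Mat::eye(3, 3, CV_64F);         // identity intrinsics
  cv::Mat dist = cv::Mat::zeros(5, 1, CV_64F);    // no distortion in the ideal model

  cv::Mat rvec, tvec;
  if (!cv::solvePnP(objectPoints, imagePoints, K, dist, rvec, tvec,
                    false, cv::SOLVEPNP_EPNP))
    return false;

  cv::Rodrigues(rvec, R);  // R, t map object-frame points into the lighthouse frame
  t = tvec;
  return true;
}

// Flipping the transform around (lighthouse pose in the object's frame):
// R' = R^T, t' = -R^T * t.
void invertPose(const cv::Mat& R, const cv::Mat& t, cv::Mat& Rinv, cv::Mat& tinv) {
  Rinv = R.t();
  tinv = -Rinv * t;
}
```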
Getting to this point will be kind of a "magical point" I think, in that the rest is really about squeezing the code onto an appropriate microcontroller.
There are, of course, plenty of other ways of doing this. And if this doesn't work well enough, I think the next step is to add in and make heavy use of a Kalman Filter to get better positions out. If you're interested in jumping in, I'll take all the help I can get!
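For reference, OpenCV's cv::KalmanFilter would probably be the quickest way to prototype that filtering step before worrying about porting it. A minimal constant-velocity setup might look like this (the noise values are placeholders I'd have to tune):

```cpp
// Constant-velocity Kalman filter for smoothing the solved position, using
// OpenCV's cv::KalmanFilter. State is [x y z vx vy vz]; only position is measured.
#include <opencv2/video/tracking.hpp>

cv::KalmanFilter makePositionFilter(float dt) {
  cv::KalmanFilter kf(6, 3, 0, CV_32F);
  cv::setIdentity(kf.transitionMatrix);
  for (int i = 0; i < 3; ++i)
    kf.transitionMatrix.at<float>(i, i + 3) = dt;          // x += vx * dt, etc.
  kf.measurementMatrix = cv::Mat::zeros(3, 6, CV_32F);
  for (int i = 0; i < 3; ++i)
    kf.measurementMatrix.at<float>(i, i) = 1.0f;           // observe position only
  cv::setIdentity(kf.processNoiseCov, cv::Scalar::all(1e-4));     // placeholder
  cv::setIdentity(kf.measurementNoiseCov, cv::Scalar::all(1e-2)); // placeholder
  cv::setIdentity(kf.errorCovPost, cv::Scalar::all(1.0));
  return kf;
}

// Each cycle: kf.predict(); then kf.correct(measurement) with the EPnP position.
```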
--Mike
let's do this!
By leveraging the work cnlohr is doing right now, I'm hopeful this project will make a nice leap forward soon. The localization problem is basically the same regardless of whether it's implemented on a laptop or a microcontroller. Props to him as well for an MIT-licensed solution with few external dependencies.