How does it work / What I've done so far
I put together a quick demo video (linked at the top of the post) just to document the current state of my prototype.
I'm very early in the process, and honestly, I've kind of cheated a bunch just to get something up and running and feel out the concept. Most of what I've done has just been connecting pieces together using off-the-shelf hardware/software. Right now, the prototype basically just proves out the concept of rendering the real-time position of a drone inside a Unity game and getting all the "piping" set up to move data to the right place. Currently, the information flow is all one-directional, from the drone to the PC.
On the hardware side, I'm using Bitcraze's Crazyflie drone with its Lighthouse positioning deck and SteamVR base stations for estimating the drone's 3D position. State estimation is pretty hard, but thanks to all the hard work done by the Crazyflie open source community, this just kind of works out of the box and in real time (i.e. one of the big reasons why it kind of feels like cheating lol). Communication between the Crazyflie and the PC goes over the Crazyradio dongle.
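If you're curious what pulling that telemetry off the drone looks like, here's a rough, untested sketch using Bitcraze's cflib Python library. I'm assuming the default radio URI and the standard stateEstimate log variable names; double-check them against your firmware/cfclient.

```python
# Untested sketch: stream the Crazyflie's onboard state estimate over the radio.
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M/E7E7E7E7E7'  # default Crazyradio address

def main():
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        # Ask the firmware to stream its onboard state estimate at 20 Hz
        log_conf = LogConfig(name='pose', period_in_ms=50)
        for var in ('stateEstimate.x', 'stateEstimate.y', 'stateEstimate.z',
                    'stateEstimate.roll', 'stateEstimate.pitch', 'stateEstimate.yaw'):
            log_conf.add_variable(var, 'float')

        with SyncLogger(scf, log_conf) as logger:
            for timestamp, data, _ in logger:
                # data is a dict like {'stateEstimate.x': 0.12, ...}
                print(timestamp, data)

if __name__ == '__main__':
    main()
```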
On the software side, I'm using ROS to handle all the intermediate messaging and, obviously, Unity for the user interface, game logic, and visualization.
Challenges I've run into so far
Getting the state estimate data from the Crazyflie into Unity was somewhat interesting to figure out. Basically, the Crazyflie computes its 6DoF pose (position and orientation) onboard, then transmits this telemetry over radio to the PC. On the PC, I wrote a simple ROS publisher node that listens for these messages and then publishes them onto a ROS network. To get the data into Unity, I'm using Unity's ROS-TCP-Connector package (and ROS-TCP-Endpoint), which essentially just forwards the messages from the ROS network into Unity. Inside Unity, I wrote a simple script tied to a GameObject representing the drone that takes the data, transforms it into Unity's coordinate frame, and uses it to set the GameObject's position. Overall, it's just a lot of forwarding of information (with some annoying coordinate frame transforms along the way), roughly like the sketch below.
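Here's the general shape of that publisher node, assuming rospy / ROS 1; the topic name, frame_id, and wiring are placeholders for the sake of the example rather than my exact code.

```python
# Rough sketch: wrap the cflib log callback into a PoseStamped for ROS.
import math

import rospy
from geometry_msgs.msg import PoseStamped
from tf.transformations import quaternion_from_euler


class CrazyfliePosePublisher:
    def __init__(self):
        self.pub = rospy.Publisher('/crazyflie/pose', PoseStamped, queue_size=10)

    def log_callback(self, timestamp, data, logconf):
        """Called by cflib every time a new state-estimate packet arrives."""
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'world'

        # Position straight from the onboard estimator (metres)
        msg.pose.position.x = data['stateEstimate.x']
        msg.pose.position.y = data['stateEstimate.y']
        msg.pose.position.z = data['stateEstimate.z']

        # Orientation: the estimator reports roll/pitch/yaw in degrees;
        # convert to a quaternion (double-check sign conventions on your setup)
        q = quaternion_from_euler(math.radians(data['stateEstimate.roll']),
                                  math.radians(data['stateEstimate.pitch']),
                                  math.radians(data['stateEstimate.yaw']))
        (msg.pose.orientation.x, msg.pose.orientation.y,
         msg.pose.orientation.z, msg.pose.orientation.w) = q

        self.pub.publish(msg)


if __name__ == '__main__':
    rospy.init_node('crazyflie_pose_publisher')
    node = CrazyfliePosePublisher()
    # ...register node.log_callback on the LogConfig from the earlier snippet
    # (log_conf.data_received_cb.add_callback(node.log_callback)), then:
    rospy.spin()
```

From there, ROS-TCP-Endpoint just forwards whatever lands on that topic over TCP to Unity, where ROS-TCP-Connector deserializes it for the drone's GameObject script.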
Another important piece of the puzzle (as far as rendering the drone inside a 3D virtual replica of my room goes) was building the room model and calibrating it to my actual room. I can go into more detail for sure, but at a high level I basically just picked a point in my room to be the origin in both the physical and virtual rooms, put the Crazyflie there (aligned with the axes I picked for the origin), and used the cfclient tool to center the base station position estimates there. My process was pretty rough as a first pass, and it will very likely have to improve, especially as I move in the mixed reality direction and start rendering virtual objects on a live camera feed.
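Since those coordinate frame transforms keep coming up, here's the gist of them sketched in Python. This is purely illustrative: the real conversion lives in my Unity C# script, the numbers are made up, and I believe ROS-TCP-Connector's ROSGeometry helpers can do the frame flip for you on the Unity side anyway.

```python
# Illustrative only: ROS is right-handed (x forward, y left, z up), Unity is
# left-handed (x right, y up, z forward), plus the kind of rigid yaw/offset
# correction a rough room calibration adds on top.
import math

def ros_to_unity(p):
    """Map a ROS point (x forward, y left, z up) to Unity (x right, y up, z forward)."""
    x, y, z = p
    return (-y, z, x)

def apply_room_calibration(p, yaw_deg=0.0, offset=(0.0, 0.0, 0.0)):
    """Rotate about the up axis and translate so the measured origin matches the virtual one."""
    x, y, z = p
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    xr, yr = c * x - s * y, s * x + c * y
    return (xr + offset[0], yr + offset[1], z + offset[2])

# Example: drone reported at (1.0, 0.5, 0.3) m in the lighthouse/ROS frame
print(ros_to_unity(apply_room_calibration((1.0, 0.5, 0.3))))
```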
What's next?
Tactically, the next few steps are to add the FPV view into the game (streaming video data from the drone and rendering it in Unity), which involves more data forwarding (and calibration). In addition, I need to add input controls so you can actually fly the drone. The bigger goals in store are building out proper gameplay, integrating autonomy (and figuring out where it makes sense), and maybe exploring what VR functionality might look like instead of just a flat display on a PC monitor.
Thanks for reading through this whole update! If you made it this far, I would really love to hear any feedback or questions on this or anything else. It would most likely help me figure out what the next steps should be, and I'd be super interested to learn if there are other cool directions I could take this project!