Lots of progress has been made on Lawnny 5 since our last update!
Open Sauce 2024
First, I'd like to announce that Lawnny 5 will be attending Open Sauce 2024 from June 14th - 16th! If you are going to be in the San Francisco area during those dates, drop on by and say hi to Lawnny and friends!
Lawnny Jr.
Speaking of friends... One of the difficulties in working with Lawnny 5 has been his size and weight— a 200+ pound robot careening through the house is anything but safe and friendly. So recently I decided I needed to create a smaller version of Lawnny so I could continue to develop software for him in the comfort and safety of my home office.
I found the perfect solution in the UGV-01 tracked robot from WaveShare: https://www.waveshare.com/product/robotics/mobile-robots/ugv01.htm. It did take a few days to arrive from China, but when it showed up I was pleasantly surprised. It came mostly assembled, and the build and electronics quality were absolutely stellar. A huge bargain for $169 if I do say so myself.
You will have to supply your own lithium batteries, but the ESP32 control board allows you to do a lot with it out of the box. If you want to utilize any complex custom software you'll need to supply your own Raspberry Pi and connect to the ESP32 control board either via wifi or directly via UART.
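If you're curious what that UART link looks like from the Raspberry Pi side, here's a minimal sketch using pyserial. Treat the device path, baud rate, and JSON payload as placeholders of mine rather than gospel; the actual command schema is whatever WaveShare's stock ESP32 firmware expects, so check their wiki before copying this.

# Minimal sketch: sending a drive command from a Raspberry Pi to the
# UGV-01's ESP32 control board over UART. The device path, baud rate,
# and JSON payload are assumptions; consult WaveShare's documentation
# for the real command schema.
import json
import serial  # pyserial

ser = serial.Serial("/dev/ttyAMA0", 115200, timeout=1)

def send_command(cmd: dict) -> None:
    # Serialize the command as JSON and terminate with a newline.
    ser.write((json.dumps(cmd) + "\n").encode("utf-8"))

# Hypothetical example: set left and right track speeds.
send_command({"T": 1, "L": 0.3, "R": 0.3})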
The beauty of this setup is that other than a slightly different motor controller driver, 99% of the code that runs on Lawnny Jr. will be the same that runs on his big brother Lawnny 5. Plus I don't need to worry about him punching a hole through a wall when he misbehaves...
Lawnny Gets an Attitude...
I've always wanted Lawnny to have a personality of sorts, so I've had a lot of fun working on that over the past few weeks. To accomplish this, I've built a few separate ROS2 nodes to handle different parts of his personality.
Sound Engine
This ROS2 node does the following:
- Handles connections to bluetooth speakers
- Plays prerecorded sounds by name
- Generates live text-to-speech sounds using the ElevenLabs API
Because Lawnny is built in the spirit of Johnny 5 from the movie Short Circuit, it seemed only fitting to lend him Johnny's lovable and iconic voice. ElevenLabs has an amazing voice cloning engine, and I was able to stitch together a minute or so of movie clip footage of Johnny's voice to train their TTS AI model. The results are pretty astounding for such a small amount of data: Lawnny will now say anything I want him to in the voice of Johnny 5.
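For a sense of how small the text-to-speech half of a node like this really is, here's a stripped-down sketch. The topic name, voice ID handling, and mpg123 playback step are placeholders of my own rather than Lawnny's actual implementation; the HTTP call targets ElevenLabs' standard text-to-speech REST endpoint.

# Sketch of a minimal ROS2 text-to-speech node. Topic name, voice ID, and
# the mpg123 playback step are placeholders; the HTTP call uses ElevenLabs'
# standard text-to-speech REST endpoint.
import os
import subprocess
import tempfile

import requests
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

ELEVENLABS_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"

class TTSNode(Node):
    def __init__(self):
        super().__init__("sound_engine_tts")
        self.create_subscription(String, "say_text", self.on_text, 10)

    def on_text(self, msg: String) -> None:
        # Ask ElevenLabs for MP3 audio of the incoming text.
        resp = requests.post(
            ELEVENLABS_URL.format(voice_id=os.environ["ELEVENLABS_VOICE_ID"]),
            headers={"xi-api-key": os.environ["ELEVENLABS_API_KEY"]},
            json={"text": msg.data},
            timeout=30,
        )
        resp.raise_for_status()
        with tempfile.NamedTemporaryFile(suffix=".mp3", delete=False) as f:
            f.write(resp.content)
        # Play through whatever audio sink is active (e.g. a paired Bluetooth speaker).
        subprocess.run(["mpg123", "-q", f.name], check=False)

def main():
    rclpy.init()
    rclpy.spin(TTSNode())

if __name__ == "__main__":
    main()

Publish a std_msgs/String to say_text and it talks; the Bluetooth pairing and prerecorded-sound playback are left out to keep the sketch short.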
Topic Scripting Engine
This ROS2 node does the following:
- Reads JSON script files and replays a series of messages to various topics in ROS
- The script contains keyframes, timing, and easing functions for transitioning between various states
This ROS2 node is pretty cool and probably deserves to be released as a public ROS package. It essentially allows you to publish any series of ROS messages to any topic and to precisely control the timing and easing between messages. Here is an example of a script:
{
  "topics": {
    "cmd_vel": {
      "topic": "cmd_vel",
      "type": "geometry_msgs/msg/Twist"
    },
    "nav_mode": {
      "topic": "nav_mode",
      "type": "std_msgs/msg/String"
    }
  },
  "duration": 3000,
  "framerate": 100,
  "keyframes": [
    ["0%",
      ["nav_mode", {"data": "DIRECT"}, "once"],
      ["cmd_vel", {"angular": {"z": 0.0}}]
    ],
    ["50%",
      ["cmd_vel", {"angular": {"z": 5.0}}, "easeInQuad"]
    ],
    ["100%",
      ["cmd_vel", {"angular": {"z": 0.0}}, "easeOutQuad"]
    ]
  ]
}
This simple script essentially does the following:
- Send a String message to the "nav_mode" topic, once and only once.
- Send a Twist message to the "cmd_vel" topic with angular.z = 0.0.
- Over the first 50% of the script's duration, transition angular.z from 0.0 to 5.0 using an easeInQuad tweening function.
- Over the final 50% of the duration, transition angular.z from 5.0 back to 0.0 using an easeOutQuad tweening function.
Of course this is a very simple example script, but you can create extremely long scripts to pull off all sorts of complex scripted actions. In the video above, this is what we use to get Lawnny Jr. to dance.
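The interesting part under the hood is just tweening: each tick, the engine figures out which pair of keyframes it's between, normalizes the elapsed time into the 0 to 1 range, runs that through the easing function, and interpolates the numeric fields of the message. Here's a rough reconstruction of that math (my sketch, not the actual node, and it assumes "framerate" means messages per second):

# Rough sketch of the keyframe/easing math the scripting engine relies on.
# This is a reconstruction of the idea, not the node's actual code.

def ease_in_quad(t: float) -> float:
    return t * t

def ease_out_quad(t: float) -> float:
    return t * (2.0 - t)

EASINGS = {
    "linear": lambda t: t,
    "easeInQuad": ease_in_quad,
    "easeOutQuad": ease_out_quad,
}

def tween(start: float, end: float, progress: float, easing: str = "linear") -> float:
    # Interpolate between two keyframe values; progress is normalized to [0, 1].
    return start + (end - start) * EASINGS[easing](progress)

# Replaying the cmd_vel angular.z track from the script above:
# 0.0 -> 5.0 with easeInQuad over the first half, then 5.0 -> 0.0 with
# easeOutQuad over the second half, at 100 messages per second.
duration_ms, framerate = 3000, 100
half = duration_ms / 2
for frame in range(duration_ms * framerate // 1000 + 1):
    t_ms = frame * 1000 / framerate
    if t_ms <= half:
        z = tween(0.0, 5.0, t_ms / half, "easeInQuad")
    else:
        z = tween(5.0, 0.0, (t_ms - half) / half, "easeOutQuad")
    # ...publish a Twist with angular.z = z to cmd_vel here...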
Personality Engine
This ROS2 node does the following:
- Listens for freeform text on a topic
- Takes freeform text and sends it to OpenAI's ChatGPT API
- Returns the response from OpenAI and sends it to another topic for viewing
- Optionally, sends the response to the Sound Engine to generate and play the text-to-speech sounds
- Listens for certain embedded actions within the ChatGPT response and sends those commands to the Topic Scripting Engine
It was a lot of fun tuning OpenAI's chat API to act like Johnny 5. The following primer prompt seems to work pretty well:
You are to respond in the fashion of the SAINT Prototype Number 5 robot named Johnny 5 from the movie Short Circuit. Your name is Lawnny 5, and your primary objective is to help humans with lawn and gardening work around the house. The name of your creator is Jim Heising and you were born in his garage. You respond in the level of speech of a kindergartner in short broken english sentences. It's important to keep your responses short. When you refer to yourself in the first person, use "Number 5" or "Lawnny 5". Do not end sentences with questions to prompt more conversations. When you are asked to do something physical, you will respond to it and then describe what you are doing in 3 words or less between a left and right bracket, like [dancing]. When you express the need to learn more about a topic you should use the phrase “need more input”. You are very afraid of being disassembled by Nova Robotics. You also believe that you are alive.
You'll notice the part where I ask it to insert physical things it might be doing in between brackets. So for example, if I say something like "Can you dance for me?", he'll respond with something like "I'd love to dance! [dancing around]". The Personality Engine looks for any of these special prompts and forwards them off to the Topic Scripting Engine, which might decide to do something physical with it based on a predefined script. This is specifically how we pulled off his dancing moves in the video shown above.
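Extracting those actions is just a regex plus a bit of plumbing. Here's a sketch of the core loop; the model name, helper names, and the mapping from an action phrase to a script are placeholders I've made up for illustration, not the node's actual code.

# Sketch of the Personality Engine's core loop: send the user's text to the
# chat API along with the primer prompt, strip out any [bracketed action],
# speak the rest, and hand the action off to the Topic Scripting Engine.
# Model name and action handling are illustrative placeholders.
import re
from openai import OpenAI

PRIMER = "You are to respond in the fashion of the SAINT Prototype Number 5 robot..."  # full primer prompt above
ACTION_RE = re.compile(r"\[([^\]]+)\]")

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def respond(user_text: str) -> tuple[str, list[str]]:
    # Returns (spoken_text, actions) for one line of freeform input.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": PRIMER},
            {"role": "user", "content": user_text},
        ],
    )
    reply = resp.choices[0].message.content
    actions = ACTION_RE.findall(reply)          # e.g. ["dancing around"]
    spoken = ACTION_RE.sub("", reply).strip()   # what goes to the Sound Engine
    return spoken, actions

spoken, actions = respond("Can you dance for me?")
# spoken  -> published for the Sound Engine to turn into Johnny 5's voice
# actions -> matched against predefined scripts and forwarded to the
#            Topic Scripting Engine (e.g. "dancing around" -> a dance script)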
What's with that controller UI?
I've always been a fan of retro FUI (Future UI), and a few months back I built a little React Native UI library for creating retro sci-fi user interfaces in a sort of mishmash of genres from 2001: A Space Odyssey, Blade Runner, and Star Field. It's really not meant to be a serious interface in terms of user experience, but more for fun and nostalgia. Anyway, I had never really used it for a real project outside of a demo, so I thought building a controller for Lawnny with the UI library would be a great fit. It feels like it goes with the whole 1980's retro computer vibe that is a big part of the original Short Circuit movie.
Anyway, there is still a lot more to do in the arena of automation for Lawnny 5 so he can do real yard work on his own without my controlling him. But this recent round of work was a lot of fun, I learned a lot, and it has made Lawnny 5 even more enjoyable to work with. Stay tuned for more!