Raspberry Pi Zero 3D Printed Video Streaming Robot
For the Zerobot robot there are different instructions and files spread over Hackaday, Github and Thingiverse which may lead to some confusion. This project log is meant as a short guide on how to get started with building the robot.
Where do I start?
- Raspberry Pi Zero W
- 2x ICR18650 lithium cell 2600mAh
- Raspberry camera module
- Zero camera adapter cable
- Mini DC dual motor controller
- DC gear motors
- ADS1115 ADC board
- TP4056 USB charger
- MT3608 boost converter
- Raspberry CPU heatsink
- Micro SD card (8GB or more)
- 2x LED
- BC337 transistor (or any other NPN)
- 11.5 x 6mm switch
- 4x M3x10 screws and nuts
The robot spins / doesn't drive straight
The motors might be reversed. You can simply swap the two wires to fix this.
I can't connect to the Zerobot in my browser
The Raspberry Pi itself with the SD card image running on it is able to display the user interface in your browser. There is no additional hardware needed, so this can't be a hardware problem. Check if you are using the right IP and port and if you inserted the correct WiFi settings in the wpa_supplicant.conf file.
I see the user interface but no camera stream
Check if your camera is connected properly. Does it work on a regular Raspbian install?
Zerobot and Zerobot Pro - What's the difference?
The "pro" version is the second revision of the robot I built in 2017, which includes various hardware and software changes. Regardless of the hardware, the "pro" software and SD images are backwards compatible. I'd recommend building the latest version. New features like the voltage sensor and LEDs are of course optional.
Can I install the software myself?
If you don't want to use the provided SD image, you can of course follow this guide to install the required software: https://hackaday.io/project/25092/instructions
You should only do this if you are experienced with the Raspberry Pi. The most recent code is available on Github: https://github.com/CoretechR/ZeroBot
All new features: More battery power, a charging port, battery voltage sensing, headlights, camera mode, safe shutdown, new UI
The new software should work on all existing robots.
When I designed the ZeroBot last year, I wanted to have something that "just works". So after implementing the most basic features I put the parts on Thingiverse and wrote instructions here on Hackaday. Since then the robot has become quite popular on Thingiverse with 2800+ downloads and a few people already printed their own versions of it. Because I felt like there were some important features missing, I finally made a new version of the robot.
The ZeroBot Pro has some useful, additional features:
If you are interested in building the robot, you can head over here for the instructions: https://hackaday.io/project/25092/instructions
The 3D files are hosted on Thingiverse: https://www.thingiverse.com/thing:2800717
Download the SD card image: https://drive.google.com/file/d/163jyooQXnsuQmMcEBInR_YCLP5lNt7ZE/view?usp=sharing
After flashing the image to an 8GB SD card, open the file "wpa_supplicant.conf" on your PC and enter your WiFi settings.
After a few people ran into problems with the tutorial, I decided to create a less complicated solution. You can now download an SD card image for the robot, so there is no need for complicated installs and command line tinkering. The only thing left is getting the Pi into your network:
If you don't want the robot to be restricted to your home network, you can easily configure it to work as a wireless access point. This is described in the tutorial.
EDIT 29.7.: Even easier setup - the stream IP is now selected automatically
The goal for this project was to build a small robot that could be controlled wirelessly, with a video feed being sent back to the user. Most of my previous projects involved Arduinos, and while they are quite capable and easy to program, simple microcontrollers are very limited when it comes to processing power. Especially when a camera is involved, there is no way around a Raspberry Pi. The Raspberry Pi Zero W is the ideal hardware for a project like this: it is cheap, small, has built-in WiFi, and offers enough processing power and I/O ports.
Because I had barely ever worked with a Raspberry Pi, I first had to find out how to program it and what software/language to use. Fortunately, the Raspberry can be set up to work without ever plugging in a keyboard or monitor, using a VNC connection from a remote computer instead. For this, the files on the boot partition of the SD card need to be modified to allow SSH access and to connect to a WiFi network without further configuration.
The next step was to get a local website running. This was surprisingly easy using Apache, which creates and hosts a sample page after installing it.
To control the robot, data would have to be sent back from the user to the Raspberry. After some failed attempts with Python I decided to use Node.js, which features a socket.io library. With the library it is rather easy to create a web socket, where data can be sent to and from the Pi. In this case it would be two values for speed and direction going to the Raspberry and some basic telemetry being sent back to the user to monitor e.g. the CPU temperature.
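The mixing from the two joystick values to individual motor commands can be sketched as a small pure function. This is an illustration only, with hypothetical names, not the project's actual code:

```javascript
// Hypothetical sketch: convert joystick (speed, direction) values in
// [-1, 1] into left/right motor commands in [-255, 255], the kind of
// mapping a socket.io event handler on the Pi might perform.
function mixTankDrive(speed, direction) {
  // Basic differential (tank) mixing
  let left = speed + direction;
  let right = speed - direction;
  // Clamp to [-1, 1] so full speed plus full turn doesn't overflow
  left = Math.max(-1, Math.min(1, left));
  right = Math.max(-1, Math.min(1, right));
  // Scale to an 8-bit PWM range
  return { left: Math.round(left * 255), right: Math.round(right * 255) };
}
```

With this mixing, mixTankDrive(1, 0) drives both motors forward at full speed, while mixTankDrive(0, 1) turns the robot on the spot.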
For the user interface I wanted to have a screen with just the camera image in the center and an analog control stick at the side of it. While searching the web I found this great javascript example by Seb Lee-Delisle: http://seb.ly/2011/04/multi-touch-game-controller-in-javascripthtml5-for-ipad/ which even works for multitouch devices. I modified it to work with a mouse as well and integrated the socket communication.
I first thought about using an Arduino to communicate with the motor controller, but this would have ruined the simplicity of the project. In fact, there is a nice Node.js library for accessing the I/O pins: https://www.npmjs.com/package/pigpio. I soldered four pins to the PWM motor controller, and using the library, the motors would already turn from the javascript input.
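With the pigpio library, each GPIO pin gets a PWM duty cycle from 0 to 255, so a signed motor command has to be split across the two input pins of one motor channel. A minimal sketch of that split; the pin roles and the function name are my assumptions, not the project's actual wiring:

```javascript
// Illustrative helper: split a signed speed (-255..255) into PWM duty
// cycles for the "forward" and "reverse" input pins of one channel of
// a dual motor controller. With pigpio, each duty would be passed to
// a Gpio object's pwmWrite().
function speedToDuties(speed) {
  const s = Math.max(-255, Math.min(255, Math.round(speed)));
  return {
    forward: s > 0 ? s : 0,  // duty for the forward-direction pin
    reverse: s < 0 ? -s : 0, // duty for the reverse-direction pin
  };
}
```

Only one of the two pins is ever driven at a time, which is what most simple dual H-bridge boards expect.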
After I finally got a camera adapter cable for the Pi Zero W, I started working on the stream. I used this tutorial to get the mjpg streamer running: https://www.youtube.com/watch?v=ix0ishA585o. The latency is surprisingly low at just 0.2-0.3s with a resolution of 640x480 pixels. The stream was then included in the existing HTML page.
With most of the software work done, I decided to make a quick prototype using an Asuro robot. This is an ancient robot kit from a time before the Arduino existed. I hooked up the motors to the controller and secured the rest of the parts with painter's tape on the robot's chassis:
After the successful prototype I arranged the components in Fusion 360 to find a nice shape for the design. From my previous project (http://coretechrobotics.blogspot.com/2015/12/attiny-canbot.html) I knew that I would use a half-shell design again and make 3D printed parts.
The parts were printed in regular PLA on my Prusa i3 Hephestos. The wheels are designed to have tires made with flexible filament (in my case NinjaFlex) for better grip. For printing the shells, support material is necessary. Simplify3D worked well with this and made the supports easy to remove.
After printing the parts and doing some minor reworking, I assembled the robot. Most components are glued inside the housing. This may not be the most professional approach, but I wanted to avoid screws and tight tolerances. Only the two shells are connected with four hex socket screws. The corresponding nuts are glued in on the opposing shell. This makes it easy to access the internals of the robot.
For...
DISCLAIMER: This is not a comprehensive step-by-step tutorial. Some previous experience with electronics / Raspberry Pi is required. I am not responsible for any damage done to your hardware.
I am also providing an easier alternative to this setup process using a SD card image: https://hackaday.io/project/25092/log/62102-easy-setup-using-sd-image
https://www.raspberrypi.org/documentation/installation/installing-images/
This tutorial is based on Raspbian Jessie 4/2017
Personally, I used Win32DiskImager for Windows to write the image to the SD card. You can also use this program for backing up the SD card to a .img file.
IMPORTANT: Do not boot the Raspberry Pi yet!
Access the Raspberry via your Wifi network with VNC:
Put an empty file named "SSH" in the boot partition on the SD card.
Create a new file "wpa_supplicant.conf" with the following content and move it to the boot partition as well:
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
network={
ssid="wifi name"
psk="wifi password"
}
During the first boot, this file is automatically moved to its place in the Raspberry's file system.
After booting, you have to find the Raspberry's IP address using the router's menu or a WiFi scanner app.
Use Putty or a similar program to connect to this address with your PC.
After logging in with the default details you can run
sudo raspi-config
In the interfacing options enable Camera and VNC
In the advanced options, expand the file system and set the resolution to something like 1280x720.
Now you can connect to the Raspberry's GUI via a VNC viewer: https://www.realvnc.com/download/viewer/
Use the same IP and login as for Putty and you should be good to go.
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install apache2 node.js npm
git clone https://github.com/CoretechR/ZeroBot Desktop/touchUI
cd Desktop/touchUI
sudo npm install express
sudo npm install socket.io
sudo npm install pi-gpio
sudo npm install pigpio
Run the app.js script using:
cd Desktop/touchUI
sudo node app.js
You can make the node.js script start on boot by adding these lines to /etc/rc.local before "exit 0":
cd /home/pi/Desktop/touchUI
sudo node app.js&
cd
The HTML file can easily be edited while the node script is running, because it is sent out when a host (re)connects.
Such a well thought out and well designed project. It all seems to work so perfectly, combining multiple skills to make it work.
Is there any config file? Forward and backwards works fine, but my steering works the wrong way.
You can either change the wiring or look for the files touch.html or app.js in the folder touchUI on the desktop. It's probably easiest to just swap the cables for the left and right motor.
I found the touch.html file and added
x = 0 - x;
to the function tankDrive(x, y). Now the bot works fine!
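The fix negates the steering axis before the tank-drive mixing takes place. As a standalone sketch (only the negation line is from the comment; the rest of the body is illustrative, and the real tankDrive may differ):

```javascript
// Sketch of the commenter's fix: invert the steering axis before
// mixing. Everything except the "x = 0 - x" line is illustrative.
function tankDrive(x, y) {
  x = 0 - x; // reverse steering direction (the added line)
  const left = y + x;
  const right = y - x;
  return { left, right };
}
```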
Works very well, even over the internet. Impressed by how responsive the websocket/mjpg combo is and how intuitive the browser-based controls are. I used two batteries, added a charging port, and it still fits just fine. Thanks for the guide!
That's great to hear! If you want you could upload a picture of your robot to Thingiverse:
Hey! I finally made it! Works pretty well and the video stream is pretty responsive.
One thing though... it's quite difficult to control, especially because it turns so easily... One slight movement to the right/left, and it starts spinning. Is there any way to adjust the sensitivity?
Well done so far! The motors are not really made for slow movement. I adjusted the dead-zone so that the robot starts driving immediately instead of the motors just beeping. You can reverse this by removing these lines of code from touch.html:
if(leftMot > 0) leftMot += 90;
if(leftMot < 0) leftMot -= 90;
if(rightMot > 0) rightMot += 90;
if(rightMot < 0) rightMot -= 90;
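Those four lines push any non-zero motor command past the low end of the PWM range where the motors stall. The same idea as a standalone function (the name is illustrative; the offset of 90 comes from the snippet above):

```javascript
// Illustrative dead-zone compensation: push any non-zero motor command
// past the PWM range where the motors only whine, by adding a fixed
// offset in the command's direction.
function compensateDeadZone(value, offset = 90) {
  if (value > 0) return value + offset;
  if (value < 0) return value - offset;
  return 0; // zero stays zero, so the robot can still stop
}
```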
A charging port would be nice and will definitely be included if I'm going to make a second version. Right now you can easily modify the original CAD files (link is on this page) to include any charging port you need.
Thanks. I tried that, but removing the dead zone does not completely solve the quick-spin issue. I have to figure out a way to make turning slower when there is not enough forward speed.
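One way to do what the commenter describes, sketched as a hypothetical helper, is to scale the turn command by the amount of forward speed, so a sideways nudge at standstill only produces a gentle rotation. The names and the default factor are assumptions, not anything from the project:

```javascript
// Illustrative: limit the turn rate at low forward speed so a small
// sideways joystick movement doesn't spin the robot in place.
// speed and turn are in [-1, 1]; minTurn is the remaining turn
// authority at standstill.
function scaleTurn(speed, turn, minTurn = 0.4) {
  const factor = minTurn + (1 - minTurn) * Math.abs(speed);
  return turn * factor;
}
```

At full forward speed the turn command passes through unchanged; at standstill it is reduced to 40% in this sketch.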
By the way, I tried adding a micro-USB port on the back, but soldering it was a pain as it kept desoldering itself. I ended up creating a dock, just like a Roomba. It is much easier to charge it this way, and it is possible to leave it powered on and ready to move.
By the way, it would be convenient to have an opening in the back for a female microusb port, in order to charge the robot without having to open the cover.
Got it all up and running, but I can't get it to reverse. Any ideas?
I'm planning to use the code and all the instructions of this great project to hack an RC car featuring two motors, each driving the two wheels on one side. Of course it would be better if it could go in reverse. I'm interested in the answer to your request!
Have a nice day
Sorry for the late response. The Hackaday.io feed seems to show me anything but comments on my projects. Have you tried measuring the voltages on the GPIO pins (with a multimeter) when driving forward/backward? It should read 0-3.3 V.
Where did people get the NinjaFlex tires printed?
Try sculpteo for flexible filament, i.materialise, or even 3dhubs.com for lower price
Sorry, now I see I have an error:
pi@raspberrypi:~/Desktop/touchUI $ sudo node app.js
2017-06-13 09:47:26 initInitialise: Can't lock /var/run/pigpio.pid
/home/pi/Desktop/touchUI/node_modules/pigpio/pigpio.js:11
pigpio.gpioInitialise();
^
Error: pigpio error -1 in gpioInitialise
at initializePigpio (/home/pi/Desktop/touchUI/node_modules/pigpio/pigpio.js:11:12)
at new Gpio (/home/pi/Desktop/touchUI/node_modules/pigpio/pigpio.js:25:3)
at Object.<anonymous> (/home/pi/Desktop/touchUI/app.js:9:8)
at Module._compile (module.js:569:30)
at Object.Module._extensions..js (module.js:580:10)
at Module.load (module.js:503:32)
at tryModuleLoad (module.js:466:12)
at Function.Module._load (module.js:458:3)
at Function.Module.runMain (module.js:605:10)
at startup (bootstrap_node.js:158:16)
It is possible that the autostart script is already running while you are trying to start it manually. Did you remove these lines for testing from rc.local?
cd /home/pi/Desktop/touchUI
sudo node app.js&
cd
Try removing those lines, then cd into the touchUI folder and run app.js.
Anyway I'm not much of an expert on Raspberry Pi or Linux so I don't really know if I can help you with this.
When I install the packages it looks like this:
pi@raspberrypi:~/Desktop/touchUI $ sudo npm install express
npm WARN saveError ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN enoent ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN pi No description
npm WARN pi No repository field.
npm WARN pi No README data
npm WARN pi No license field.
+ express@4.15.3
added 42 packages in 47.102s
pi@raspberrypi:~/Desktop/touchUI $ sudo npm install socket.io
npm WARN deprecated isarray@2.0.1: Just use Array.isArray directly
> uws@0.14.5 install /home/pi/node_modules/uws
> node-gyp rebuild > build_log.txt 2>&1 || exit 0
npm WARN saveError ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN enoent ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN pi No description
npm WARN pi No repository field.
npm WARN pi No README data
npm WARN pi No license field.
+ socket.io@2.0.3
added 36 packages in 361.082s
pi@raspberrypi:~/Desktop/touchUI $ sudo npm install pi-gpio
npm WARN saveError ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN enoent ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN pi No description
npm WARN pi No repository field.
npm WARN pi No README data
npm WARN pi No license field.
+ pi-gpio@0.0.8
added 1 package in 24.732s
pi@raspberrypi:~/Desktop/touchUI $ sudo npm install websocket
> websocket@1.0.24 install /home/pi/node_modules/websocket
> (node-gyp rebuild 2> builderror.log) || (exit 0)
make: Entering directory '/home/pi/node_modules/websocket/build'
CXX(target) Release/obj.target/bufferutil/src/bufferutil.o
SOLINK_MODULE(target) Release/obj.target/bufferutil.node
COPY Release/bufferutil.node
CXX(target) Release/obj.target/validation/src/validation.o
SOLINK_MODULE(target) Release/obj.target/validation.node
COPY Release/validation.node
make: Leaving directory '/home/pi/node_modules/websocket/build'
npm WARN saveError ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN enoent ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN pi No description
npm WARN pi No repository field.
npm WARN pi No README data
npm WARN pi No license field.
+ websocket@1.0.24
added 4 packages in 84.744s
pi@raspberrypi:~/Desktop/touchUI $ sudo npm install pigpio
> pigpio@0.6.0 install /home/pi/node_modules/pigpio
> node-gyp rebuild
gyp WARN EACCES user "root" does not have permission to access the dev dir "/root/.node-gyp/8.1.0"
gyp WARN EACCES attempting to reinstall using temporary dev dir "/home/pi/node_modules/pigpio/.node-gyp"
make: Entering directory '/home/pi/node_modules/pigpio/build'
CXX(target) Release/obj.target/pigpio/src/pigpio.o
SOLINK_MODULE(target) Release/obj.target/pigpio.node
COPY Release/pigpio.node
make: Leaving directory '/home/pi/node_modules/pigpio/build'
npm WARN saveError ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN enoent ENOENT: no such file or directory, open '/home/pi/package.json'
npm WARN pi No description
npm WARN pi No repository field.
npm WARN pi No README data
npm WARN pi No license field.
+ pigpio@0.6.0
updated 1 package in 70.828s
And now I have only a black background and the camera live picture.
I'm sorry but I can't do more than google your error messages. I don't know what the problem is. Maybe you have to start from a fresh install.
I get exactly the same error message (Raspberry Pi B, debian wheezy). I should try it on a fresh install.
Thank you for your answer @Ole Madsen.
Now when I enter my "ip:3000" I see the live picture from the camera.
I don't see the text in the left corner, and when I click with the mouse I don't see the rings.
Sorry for replying so late.
Have you solved the problem yet?
You have to connect to the port that contains the html/javascript (probably 9000). If you connect to the camera (3000), you will only see the stream.
Hey, no, I haven't solved my problem yet.
Which port should I use to drive the robot?
Port 9000 is the port for the camera... I get the page to set up the camera...
Which port can I use to drive the robot?
Thank you for your help.
The project is very good, I love it!
I was wrong about the ports at first: according to my tutorial and files, the port for controls and camera is 3000 (it is defined in app.js). If you see the camera and a black background, that means you already have a running node.js script. Maybe you are missing a library. Are you starting the node.js script from the command line or is it already set to autostart? In the command line, are there any warnings when starting the script?
I have the following problem:
Can anyone tell me where the problem is?
pi@raspberrypi:~/Desktop/touchUI $ sudo npm install socket
npm WARN engine socket@0.0.1: wanted: {"node":">= 0.6.0 < 0.7.0"} (current: {"node":"0.10.29","npm":"1.4.21"})
/
> microtime@0.2.0 install /home/pi/Desktop/touchUI/node_modules/socket/node_modules/microtime
> node-waf configure build
sh: 1: node-waf: not found
npm WARN This failure might be due to the use of legacy binary "node"
npm WARN For further explanations, please read
/usr/share/doc/nodejs/README.Debian
npm ERR! microtime@0.2.0 install: `node-waf configure build`
npm ERR! Exit status 127
npm ERR!
npm ERR! Failed at the microtime@0.2.0 install script.
npm ERR! This is most likely a problem with the microtime package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR! node-waf configure build
npm ERR! You can get their info via:
npm ERR! npm owner ls microtime
npm ERR! There is likely additional logging output above.
npm ERR! System Linux 4.9.24+
npm ERR! command "/usr/bin/nodejs" "/usr/bin/npm" "install" "socket"
npm ERR! cwd /home/pi/Desktop/touchUI
npm ERR! node -v v0.10.29
npm ERR! npm -v 1.4.21
npm ERR! code ELIFECYCLE
npm ERR!
npm ERR! Additional logging details can be found in:
npm ERR! /home/pi/Desktop/touchUI/npm-debug.log
npm ERR! not ok code 0
hi, I think it is a very cool project! But I also got the same problem.
But I found out it was probably a typo in the guide and an issue with the Node.js version. So here is an updated guide that should work :-)
cd ~
wget https://nodejs.org/dist/latest/node-v8.1.0-linux-armv6l.tar.gz
cd /usr/local
sudo tar xzvf ~/node-v8.1.0-linux-armv6l.tar.gz --strip=1
cd ~
sudo apt-get install apache2 npm git
git clone https://github.com/CoretechR/ZeroBot Desktop/touchUI
cd Desktop/touchUI
sudo npm install express
sudo npm install socket.io
sudo npm install pi-gpio
sudo npm install websocket
Also install pigpio as described here: https://github.com/fivdi/pigpio#installation
Then run
sudo npm install pigpio
If you run it off your own router, please change touch.html: replace http://10.0.0.1:9000 with your own IP, e.g. http://192.168.1.22:9000.
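A way to avoid hard-coding the address at all (the later SD images reportedly select the stream IP automatically) is to build the stream URL from the hostname the browser already used to load the page. A sketch with a hypothetical function name; the ?action=stream path is mjpg-streamer's standard stream endpoint:

```javascript
// Illustrative: derive the MJPEG stream URL from the hostname the
// browser used to load the page, instead of a hard-coded IP.
// In touch.html this could be called as
// buildStreamUrl(window.location.hostname).
function buildStreamUrl(hostname, port = 9000) {
  return 'http://' + hostname + ':' + port + '/?action=stream';
}
```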
Thanks a lot for your solution. I will try to update the instructions.
Yes, but probably at a later time. OpenCV would be a nice addition.
Would you consider making the Fusion 360 Models available, so they can be modified easily?
No problem, a link to the Fusion 360 model is now on the project page.
Hi Max, can I supply 5V to both the Raspberry and the DC driver?
The 5V should not come from the same source. The noise from the motors can potentially harm the Raspberry. Try to use two voltage sources; both devices can handle 5V.
How do we use a single 5V source for both? Some kind of caps or something on the motors?
This is bloody marvellous matey!
The design is fantastic, and the use of common parts marks this a fantastic project.
I need to get this printed somewhere!
Can this be made to be controlled from anywhere? Connect it to a router with port-forwarding?
Yes, connecting it to a router is much easier than the access point mode. Port forwarding should not be a problem from there.
First, thank you for this tutorial. It will allow some interesting projects in the future.
Then, I have a problem with app.js: when I try to connect to "My IP":3000 I get the message:
ERR_CONNECTION_REFUSED
That was corrected by adding:
cd Desktop/touchUI
sudo npm init -y
sudo npm install express
to Madsen's comment.
Here again, thank you. This code was adapted in a short time for one of my previous projects.