STM32F103-Robot
🎉 A quadruped crawling robot based on STM32F103C8T6 🎉
Project Introduction
The robot uses an STM32F103C8T6 as the main controller, combined with an ESP32-CAM and MQTT, to provide real-time image feedback and remote control.
Two control schemes are provided: cloud control and local control. The mobile app, written with Flet, communicates over the MQTT protocol with an EMQX broker installed on the server. Commands sent through the cloud are received by the ESP32-CAM and passed to the microcontroller over USART2; the microcontroller then drives the modules so the robot performs the corresponding action. The video stream captured by the ESP32-CAM is uploaded to the server in real time, received by a Node.js server deployed there, and presented on a web page, completing the cloud-control loop.
Local control is implemented with Microdot: after connecting to the hotspot broadcast by the ESP32-CAM, the user opens a fixed IP address to reach the local control panel. Clicking a button on the web page controls the robot; the ESP32-CAM forwards each command to the microcontroller over the serial port, and the image stream from the ESP32-CAM is shown in real time at the top of the panel over Wi-Fi.
The repository contains all the code needed for this project, along with the robot's 3D modeling files.
Project Design Process
The structural design of the quadruped crawling robot was done in the open-source software FreeCAD. Grooves and screw mounting holes sized for the MG90S servo are designed into the legs, a battery compartment is reserved in the middle of the body, and screw holes at key positions make the structure sturdier and easier to assemble later.
After modeling, the model is sliced and the result sent to a 3D printer: the infill setting is 0.1 mm, the layer height 0.2 mm, and the nozzle temperature 210 °C, printing 1.75 mm PLA filament. Screw holes are reserved on the power board for nylon standoffs; each functional module is built on a universal perfboard and stacked above the body on the standoffs, making the modules easy to mount, remove, and debug.
The power module of the quadruped crawling robot uses TI's TPS5430; the design references an electronics-competition module, a TPS5430 positive/negative output power module. Three supply rails on the power board feed the servos, the main controller, and the functional modules.
After the robot powers on, it first initializes the functional modules and related peripherals, as well as FreeRTOS, and then hands scheduling over to FreeRTOS. Until a command arrives on USART2, the microcontroller runs the default task, in which the robot stays stopped and waits for user commands. When USART2 receives a command, the microcontroller enters the serial interrupt handler, decodes the instruction there, and executes the corresponding task.
When USART2 receives a message from the ESP32-CAM, the microcontroller runs the USART2 interrupt handler to evaluate it. An array in the code serves as a receive buffer: HAL library functions store the incoming bytes in it, and the buffer contents are then matched against the known commands.
The code first checks whether the requested task is already active; if so, the command is skipped to keep the microcontroller from crashing through repeated activation of the same task. Otherwise, the robot's action task is triggered through event bits, and once triggered it executes the behavior selected by the task flag.
After a command is matched, the corresponding code runs: the current task status is shown on the OLED, and then the matching action function executes. The task repeats after activation until a new instruction interrupts it.
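The dispatch logic above can be sketched in Python (the actual firmware is C with the HAL library and FreeRTOS; the command strings and the `Dispatcher` class here are illustrative assumptions, not project code):

```python
# Simplified Python model of the USART2 command-dispatch logic.
# The real firmware is C (STM32 HAL + FreeRTOS); command strings and
# the Dispatcher class are illustrative assumptions, not project code.

TASKS = {"forward", "back", "left", "right", "stop"}

class Dispatcher:
    def __init__(self):
        self.active_task = None   # task currently running

    def on_uart_command(self, buffer: bytes):
        """Mimic the USART2 interrupt: match the RX buffer against
        known commands and update the task flag for new tasks."""
        cmd = buffer.decode(errors="ignore").strip()
        if cmd not in TASKS:
            return None           # unknown data is ignored
        if cmd == self.active_task:
            return None           # already active: skip re-activation
        self.active_task = cmd    # firmware: set a FreeRTOS event bit
        return cmd
```

In the firmware, the last step corresponds to setting an event-group bit that wakes the action task.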
⚠ Note: The project also includes STM32 drivers for the OLED12864 display, the HC-SR04 ultrasonic module, and the PCA9685 servo driver. Their working principles can be read from the project code and are not elaborated here; download the code to explore them yourself.
Microdot is a very small web framework designed for resource-constrained systems such as microcontrollers and embedded devices, and it runs on MicroPython. In this design, Microdot and the ESP32-CAM's Wi-Fi support are used to build a small web server on the ESP32-CAM, enabling interaction between the user and the robot.
After power-on, the ESP32-CAM runs its program on its own: it initializes the camera, then starts an AP hotspot and waits for a user to connect. Once connected to the hotspot, the user opens 192.168.4.1 in a browser to reach the mode-selection page. The front-end pages are stored as files on the ESP32-CAM, so the whole connection process runs entirely locally.
Taking local control as an example: selecting "local control" sends a GET request to the ESP32-CAM, which returns the local control panel page. The panel shows the live image stream from the ESP32-CAM, with the robot's control buttons at the bottom: forward, backward, turn left, and turn right. When a button is pressed, a prompt appears in its center, telling the user the button's current function and state.
Each button press also sends a command to the ESP32-CAM as a GET request. On receiving it, the ESP32-CAM writes the command straight out of its serial port, which is wired to USART2 of the STM32. This gives the user control over the robot.
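The button-to-serial mapping can be illustrated with a small sketch. The URL paths and command bytes below are assumptions; on the device, the Microdot route handlers on the ESP32-CAM would write these bytes to the UART:

```python
# Illustrative mapping from control-panel GET paths to the byte
# strings forwarded to the STM32's USART2. Paths and payloads are
# assumptions; real handlers would call uart.write() on the result.

ROUTE_TO_COMMAND = {
    "/forward": b"forward\n",
    "/back":    b"back\n",
    "/left":    b"left\n",
    "/right":   b"right\n",
}

def handle_request(path):
    """Return the serial payload for a control request, or None (404)."""
    return ROUTE_TO_COMMAND.get(path)
```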
If the user selects Wi-Fi provisioning instead, the choice is likewise sent as a GET request, and the ESP32-CAM returns the provisioning page, where the user enters the Wi-Fi name and password. Because the ESP32-CAM only supports 2.4 GHz, users should make sure the network they choose is on the 2.4 GHz band.
After submitting the information, the user is redirected to a confirmation page. This gives a chance to make corrections: if something was entered wrong, the user can go back and change it; if everything is correct, clicking the confirmation button submits it to the ESP32-CAM.
On receiving the Wi-Fi credentials, the ESP32-CAM runs the code to join that Wi-Fi network, disconnecting from the user as it connects, and then connects to the MQTT server configured in the code. This completes local provisioning and the cloud connection.
Flet is a Python-based application framework, inspired by Google's Flutter, that lets developers build web, desktop, and mobile applications in Python. It is lightweight and cross-platform, reusing one codebase across targets; building on Python keeps applications easy to maintain, and the community provides rich controls for building user interfaces.
The app is mainly divided into two pages: "Instructions" and "Control". The "Instructions" page gives users a brief guide to the app, including the robot's control method and the Wi-Fi provisioning process. The "Control" page provides the control interface and displays the video stream obtained from the cloud.
When the user selects the "Control" button, the app connects to the EMQX server using the IP address, port, and subscribed topic set in the program. The video on the page is displayed through a WebView control that embeds the image-stream web page.
When a button is pressed, the app publishes the command to the EMQX platform over the network. Since the ESP32-CAM and the app are both connected to the server and subscribed to the same topic, the ESP32-CAM receives the messages the app sends. It then forwards each command to the STM32 over the serial port, triggering the serial interrupt and running the corresponding task. This completes the control path from the user, through the cloud service, to the robot.
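The publish step can be sketched as a pure function; the topic name and payload format are assumptions (the repository's code defines the real ones), and the result would be handed to an MQTT client's publish call:

```python
# Hypothetical sketch of packaging a button press into an MQTT publish.
# Topic and payload format are assumptions, not the project's actual
# values; the app would pass the result to its MQTT client.

CONTROL_TOPIC = "robot/control"   # assumed topic shared by app and ESP32-CAM

def make_publish(button: str):
    """Map a UI button to an MQTT (topic, payload) pair."""
    allowed = {"forward", "back", "left", "right", "stop"}
    if button not in allowed:
        raise ValueError(f"unknown button: {button}")
    return CONTROL_TOPIC, button.encode()
```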
Robot Gait Planning
The robot's movement can be divided into three actions; taking forward motion as an example:
- Action 1: The robot holds its left and right forelimbs parallel to the body, with the hind limbs at 135° to the body and all four feet perpendicular to the ground, ready for the next step.
- Action 2: The left front and right hind feet are raised 45° together, the left front and right hind limbs then swing forward 45° together, and finally those feet are lowered together.
- Action 3: The left and right hind feet are raised 45° together; the hind limbs swing forward 45°, then back 45°, and finally the hind feet are lowered together.
By repeating this action group continuously, the robot's limbs alternate to move it forward; the backward gait is a mirror image of the forward one.
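The gait above can be expressed as keyframes. This is a minimal sketch under stated assumptions: the leg names, the lift/swing angle representation, and the keyframe format are all invented for illustration, and on the real robot each frame would be written to the PCA9685 servo driver:

```python
# Minimal keyframe player for the forward gait. Angle values follow
# the 45-degree lifts/swings described in the text; leg names and the
# frame format are assumptions, not the project's actual data.

READY = {"LF_lift": 0, "RH_lift": 0, "LF_swing": 0, "RH_swing": 0}

ACTION_2 = [                          # Action 2 from the list above
    {"LF_lift": 45, "RH_lift": 45},   # raise left-front and right-hind feet
    {"LF_swing": 45, "RH_swing": 45}, # swing those limbs forward
    {"LF_lift": 0, "RH_lift": 0},     # lower the feet
]

def play(pose, keyframes):
    """Apply keyframes in order, returning the final pose."""
    pose = dict(pose)
    for frame in keyframes:
        pose.update(frame)            # real code: write angles to PCA9685
    return pose
```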
The robot's rotation can likewise be divided into three actions; taking a left turn as an example:
- Action 1: The robot holds its left and right forelimbs parallel to the body, with the hind limbs at 135° to the body and all four feet perpendicular to the ground, ready for the next step.
- Action 2: The left hind and right front feet are raised 45°, those limbs swing 45° counterclockwise, and the feet are lowered; then the left front and right hind feet are raised 45°, those limbs swing 45° counterclockwise, and the feet are lowered.
- Action 3: The robot rotates its left hind, right front, and right hind limbs 45° clockwise together, completing one turn.
The right-turn action mirrors the left turn: the rotation direction of each joint is simply reversed.
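Mirroring a turn can be sketched as flipping the sign of each joint rotation. The (joint, signed degrees, counterclockwise-positive) command format below is an assumption for illustration:

```python
# Right turn derived from the left turn by reversing each joint's
# rotation direction. The (joint, signed degrees) format, with
# counterclockwise positive, is an assumed representation.

LEFT_TURN = [("LH", +45), ("RF", +45), ("LH", -45), ("RF", -45)]

def mirror_turn(sequence):
    """Reverse every joint rotation to turn the opposite way."""
    return [(joint, -angle) for joint, angle in sequence]

RIGHT_TURN = mirror_turn(LEFT_TURN)
```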
Project Construction and Deployment
This project uses EMQX as the MQTT server and the 1Panel Linux operations panel to install applications quickly. Find the app marketplace in 1Panel, search for EMQX, and click install; the server automatically pulls the image from the Docker registry and deploys it. Then open the corresponding port in the Alibaba Cloud server management console; access via domain-name resolution is also an option. Once installation finishes, open the domain in a browser to reach the EMQX control panel.
Similarly, using 1Panel, first upload the code from the "CAM_Server" folder of this repository to the server's file directory. Then, in the panel menu, go to Website -> Runtime Environment, select "Node.js", and click "Create runtime environment".
Remember to open the corresponding port on the firewall of the cloud server/operations panel! 😊
The STM32 firmware is developed with the HAL library: STM32CubeMX generates the base project, and development continues in CLion. There are plenty of environment-setup tutorials online, so they are not repeated here. After cloning this repository, simply open the "HAL_C8T6" folder in CLion.
Download the Arduino Lab for MicroPython IDE, open it, and connect the ESP32-CAM to the computer. Select the corresponding serial port in the IDE and flash the files from the "ESP32-CAM" folder of this repository to the ESP32-CAM.
In 'main.py', you need to change the server information to your own.
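The exact variable names depend on the code in 'main.py'; a typical placeholder block looks like this (all names and values are illustrative and must be replaced with your own):

```python
# Placeholder server settings -- replace with your own values.
# Variable names here are illustrative, not necessarily those in main.py.
MQTT_BROKER = "your.server.example.com"  # EMQX host or IP
MQTT_PORT   = 1883                        # default unencrypted MQTT port
MQTT_TOPIC  = "robot/control"             # topic shared with the app
WIFI_SSID   = "your-2.4GHz-network"       # ESP32-CAM is 2.4 GHz only
WIFI_PASS   = "your-password"
```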
Because Flet is Python-based, the app is developed in the PyCharm IDE; see Flet's official documentation for environment setup. As before, after cloning this repository, simply open the "Fletapp" folder in PyCharm to use it.
In 'mqtt.py', you need to change the server information to your own.
During development, the compiled APK did not work properly, so we switched to the SDK version from the dev branch.
After the software is ready, connect each module to the STM32. At this point, you should be able to build this little robot~ 😎
Project Summary
- The power module design still has flaws: although it works and outputs the intended voltages, the design considered only functionality, not safety. Exposed PCB traces can short-circuit on accidental contact, threatening the power module and even the whole system. Fuses should be added to prevent burnout from the high currents of a short circuit, and reverse-polarity/anti-backflow diodes are needed to protect the power chip.
- Servo selection and the robot's structure and posture: because the initial design did not account for the battery's weight, the MG90S servos struggle to support the robot in practice. An unstable center of gravity during movement keeps both the forward and backward gaits from following a straight line. Large tolerances in the 3D printer may also have skewed some printed parts, further unbalancing the center of gravity and the load on each leg.
- App optimization: although the app covers the basic functions, several issues remain. Cached data must be cleared after each use, otherwise the screen stays white on the next launch and the app is unusable. The control interface is not adaptive, so opening it on a tablet leaves large blank areas, and pressing a button makes the area above it refresh along with the page, which disrupts use.
Reference Material
- flet-dev/flet
- Alidong/PCA9685_STM32HAL
- mokhwasomssi/stm32_hal_ssd1306
- shariltumin/esp32-cam-micropython-2022
- STM32 introductory tutorial, 2023 edition (detailed explanations, Chinese subtitles)
- Wei Dongshan's FreeRTOS tutorial series, Chapter 8: Event Groups
- STM32 series (HAL library): distance measurement on the F103C8T6 with the HC-SR04 ultrasonic module
And my previous projects as well: