I have now managed to conduct an official second test. Of course, I had tested the robot many times before, but I never had enough time for a complete test with camera and setup. The current test was performed with an optimized navigation algorithm. As mentioned before, the control unit is described in VHDL. The whole system runs on a TEI0003 development board equipped with a small but powerful Cyclone 10 FPGA.
The robot's entire on-board computer is based on this small FPGA board; there is no microcontroller built into the vehicle. Apart from the JavaScript-based web application, which was explained in the last article, this project does not include any software at all. This is not because I dislike software programming, but because I am an absolute FPGA fan.
The attached video shows the new test, which was performed in three runs. In the first run, the robot was placed in a small room without objects; the second room contained one obstacle and the third room two.
If you watch closely, you can see that the robot reacts to different situations with different behavior. When it detects an obstacle, the vehicle compares the distances measured by the two ultrasonic sensors and turns in the direction with the greater distance. The following changes were implemented in the navigation algorithm:
- The robot no longer uses a constant angle of rotation. This means that if the vehicle detects an obstacle with one of the two ultrasonic sensors, it does not repeatedly perform the same movement with a fixed rotation angle. Instead, a state machine loads data from an array that stores several possible maneuvers. The state machine of the navigation algorithm includes a counter that increments the address of the array by 1 after each maneuver, so that different rotation angles are used to avoid an obstacle.
signal rom_addr: integer range 0 to 4 := 0;
-- The array type must be declared before the constant can use it:
type rotation_rom_array is array (0 to 4) of integer range 2 to 6;
constant rotation_rom: rotation_rom_array := (2, 3, 4, 5, 6);
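As a sketch of how the counter could advance through this array (the clock and handshake signal names, such as maneuver_done, are my assumptions, not the actual design):

```vhdl
-- Hypothetical process: advance the ROM address after each maneuver,
-- wrapping around so the rotation angles repeat in a cycle.
process(clk)
begin
    if rising_edge(clk) then
        if maneuver_done = '1' then      -- assumed handshake from motor control
            if rom_addr = 4 then
                rom_addr <= 0;           -- wrap around to the first entry
            else
                rom_addr <= rom_addr + 1;
            end if;
        end if;
    end if;
end process;

-- The rotation value for the next avoidance maneuver is then simply
-- read from the constant array: rotation_rom(rom_addr)
```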
- If the robot gets stuck, it drives a small predefined distance backwards, which usually lets the vehicle escape from the situation.
- When the circuit in the FPGA detects that the robot is hopelessly stuck, the fan and motors are deactivated and the robot waits for the owner's help.
- In the second generation of the robot, the web application should also inform the user about the current state of the vehicle (working, stuck, etc.).
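Putting the behaviors above together, the navigation logic can be pictured as a small state machine. This is only an illustrative sketch; all state, signal, and constant names here (obstacle_detected, dist_left, dist_right, stuck, retry_count, MAX_RETRIES, and so on) are my assumptions, not the actual design:

```vhdl
-- Hypothetical sketch of the navigation state machine described above.
type nav_state_t is (DRIVE, AVOID, BACK_UP, GIVE_UP);
signal nav_state : nav_state_t := DRIVE;

process(clk)
begin
    if rising_edge(clk) then
        case nav_state is
            when DRIVE =>
                if obstacle_detected = '1' then
                    -- Turn toward the side whose ultrasonic sensor
                    -- reports the greater distance.
                    if dist_left > dist_right then
                        turn_left <= '1';
                    else
                        turn_left <= '0';
                    end if;
                    nav_state <= AVOID;
                end if;
            when AVOID =>
                -- Rotate by rotation_rom(rom_addr); if the maneuver
                -- fails, back up instead.
                if stuck = '1' then
                    nav_state <= BACK_UP;
                elsif maneuver_done = '1' then
                    nav_state <= DRIVE;
                end if;
            when BACK_UP =>
                -- Reverse a small predefined distance, then try again;
                -- give up after too many failed attempts.
                if retry_count = MAX_RETRIES then
                    nav_state <= GIVE_UP;   -- deactivate fan and motors
                elsif backed_up = '1' then
                    nav_state <= AVOID;
                end if;
            when GIVE_UP =>
                null;                       -- wait for the owner's help
        end case;
    end if;
end process;
```

A state report derived from nav_state could also be what the planned second-generation web application displays to the user.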