There's a new version of the image in the Internet Archive. I tested the navigation for autonomous mode in various environments and tried to optimize it. It's a little less intuitive and perceptive indoors, but it should work well on various surfaces and in large spaces.
The difficulty comes from the interplay between the camera and the ultrasonic sensor. I'm trying to use the camera only for obstacle detection and the ultrasonic sensor for overall navigation, but tweaking the priorities and how the robot perceives obstacles has proven challenging.
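To give an idea of the priority scheme, here's a minimal sketch of how the camera and ultrasonic readings could be arbitrated. The function name, thresholds, and action labels are all illustrative assumptions, not the actual code running on the robot:

```python
# Hypothetical arbitration between the two sensors: the camera only
# flags obstacles, while the ultrasonic range drives navigation.
# All thresholds are assumed values for illustration.

def choose_action(camera_obstacle: bool, ultrasonic_cm: float) -> str:
    STOP_CM = 20.0   # hard-stop distance (assumed)
    SLOW_CM = 60.0   # distance at which to start slowing (assumed)

    # Camera gets top priority: a detected obstacle always wins.
    if camera_obstacle:
        return "avoid"
    # Otherwise fall back to range-based navigation.
    if ultrasonic_cm < STOP_CM:
        return "stop"
    if ultrasonic_cm < SLOW_CM:
        return "slow"
    return "cruise"

print(choose_action(False, 150.0))  # cruise
print(choose_action(True, 150.0))   # avoid
```

The tricky part described above is exactly these thresholds and the ordering of the checks: give the camera too much authority and the robot stalls on false positives indoors; too little and it clips obstacles the ultrasonic cone misses.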
Overall it's a good compromise, and I will continue to optimize. Maybe there's even a way to make the robot detect which environment it is in and tune its settings accordingly.
Ideally you'd train a neural network for that, but I feel that might be a little out of scope for the Raspberry Pi 4B I'm using for this build. However, we might be able to game the system a bit over time and approximate something that would otherwise use a neural network for navigation.
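One cheap way to approximate that on a Pi 4B is a hand-tuned heuristic classifier plus a lookup table of parameter profiles, instead of a learned model. Everything below is a hypothetical sketch: the profile values, thresholds, and the idea of classifying from recent ultrasonic readings are my assumptions for illustration:

```python
# Rule-based stand-in for a learned environment classifier.
# Intuition (assumed): cluttered indoor spaces produce short, jumpy
# ultrasonic readings; open outdoor spaces produce long, stable ones.
from statistics import pstdev

# Hypothetical per-environment tuning profiles.
PROFILES = {
    "indoor":  {"max_speed": 0.3, "stop_cm": 25.0},
    "outdoor": {"max_speed": 0.7, "stop_cm": 40.0},
}

def classify_environment(recent_ranges_cm: list[float]) -> str:
    mean = sum(recent_ranges_cm) / len(recent_ranges_cm)
    spread = pstdev(recent_ranges_cm)
    # Short average range or high variance suggests a cluttered room.
    if mean < 150.0 or spread > 60.0:
        return "indoor"
    return "outdoor"

def tuned_settings(recent_ranges_cm: list[float]) -> dict:
    return PROFILES[classify_environment(recent_ranges_cm)]

print(classify_environment([300.0, 310.0, 290.0, 305.0]))  # outdoor
print(classify_environment([80.0, 40.0, 120.0, 60.0]))     # indoor
```

Over time you could add more profiles and sharper rules, which is the "game the system" idea: it never becomes a neural network, but the robot's behavior starts to approximate one.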
Quackieduckie