Before diving into physical prototyping, I decided to explore a virtual approach to check the feasibility of my project. My goal was to assess whether the system could work at all without building an actual prototype.
To begin, I employed PolyCam to conduct a 3D scan of the mobility scooter and its surroundings. This provided me with a digital representation of the scooter, which served as the foundation for subsequent virtual experiments.

Utilizing Blender, I integrated a 3D model of a connector into the scene of the 3D scanned scooter. The connector comprised two elements: the magnetic power connector on top and a black square at the bottom, acting as an indicator. This virtual setup allowed me to simulate the interactions between the robot and the connector.

To evaluate the robot's ability to locate the connector, I rendered image sequences from viewpoints the robot might have, giving me a range of perspectives to assess the system's performance. I also varied the light settings within Blender to emulate different lighting conditions, so the detection would be tested comprehensively.
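Something along these lines (not the exact script I used) can automate such renders from Blender's Python console. The object names "Camera" and "Light", the orbit radius, and the light energies are placeholders to adapt to the actual scene:

```python
import bpy
import math

# Hypothetical object names; adjust to match the actual .blend file.
camera = bpy.data.objects["Camera"]
light = bpy.data.objects["Light"]
scene = bpy.context.scene
scene.camera = camera

# Assumes the camera has a "Track To" constraint aimed at the connector,
# so it stays pointed at the target while orbiting around it.
for i, angle_deg in enumerate(range(0, 360, 45)):
    angle = math.radians(angle_deg)
    # Orbit the camera around the connector (assumed to sit at the origin).
    camera.location = (2.0 * math.cos(angle), 2.0 * math.sin(angle), 0.6)
    for j, energy in enumerate((200, 800, 2000)):  # rough bright/normal/dim guesses, in watts
        light.data.energy = energy
        scene.render.filepath = f"//renders/view_{i}_light_{j}.png"
        bpy.ops.render.render(write_still=True)
```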

Next, I used OpenCV with Python to build a blob detector, which reliably identified the square indicator in the rendered images. Once detected, the square served as a reference point for aligning the robot with the connector: its position relative to the center of the image tells the robot how far off-axis it is, and its apparent size indicates how close the robot is to the connector.
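As a rough sketch of this approach (not my exact code), the detector can be configured with OpenCV's SimpleBlobDetector to pick out dark, square-ish blobs; the area limits and file name below are illustrative guesses that would need tuning against the rendered frames:

```python
import cv2


def find_indicator(image_path):
    """Locate the dark square indicator in a rendered frame and report
    its offset from the image centre and its apparent size."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Look for dark blobs of roughly the indicator's expected size.
    params = cv2.SimpleBlobDetector_Params()
    params.filterByColor = True
    params.blobColor = 0              # dark blob on a lighter background
    params.filterByArea = True
    params.minArea = 100              # tune to the indicator's pixel size
    params.maxArea = 50000
    params.filterByCircularity = False
    detector = cv2.SimpleBlobDetector_create(params)

    keypoints = detector.detect(gray)
    if not keypoints:
        return None

    # Take the largest blob as the indicator.
    kp = max(keypoints, key=lambda k: k.size)
    h, w = gray.shape
    offset_x = kp.pt[0] - w / 2       # >0: indicator is right of centre
    offset_y = kp.pt[1] - h / 2       # >0: indicator is below centre
    return offset_x, offset_y, kp.size  # size grows as the robot gets closer


if __name__ == "__main__":
    result = find_indicator("renders/view_0_light_0.png")
    if result:
        dx, dy, size = result
        print(f"offset: ({dx:.1f}, {dy:.1f}) px, apparent size: {size:.1f} px")
    else:
        print("Indicator not found")
```

The offsets could then drive the robot's steering (turn until the indicator is centred) while the blob size serves as a crude distance estimate.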

The successful virtual prototyping of the robot's navigation components gave me confidence that a camera and OpenCV are a feasible basis for its navigation.
By combining virtual simulations with image processing, I gained valuable insight into the project's chances of success. These preliminary experiments provide a solid foundation for further development and bring me closer to my ultimate goal.