Since this is a glassless augmented reality system, I'm facing some lighting problems related to the interaction between the camera and the projector.
The first problem is simple: the camera needs a lot of ambient light to capture a usable image, while the projector works better in darker environments. Finding the best compromise between the two can be tricky. With my first setup I didn't have much choice because of my cheap projector; now I'm moving to a better one.
A possible solution is to use a non-visible light source, for example a torch made of IR LEDs. That would not interfere with the projected image, but would still be "visible" to the camera sensor. The problem is that we would lose all color information in the image. As the simulation in the next image shows, it would be much harder to identify the balls against the cloth without the contrast of their colors.
As indicated by the red arrow, there are also visual elements that are easily confused with a ball (in this example, the light reflecting off my hand). That is another detection that becomes harder without color information.
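To make the point concrete, here is a rough sketch (not the project's actual code) of why color helps: with an RGB camera, the balls can be separated from the cloth by hue and saturation, and bright-but-colorless blobs like the reflection on my hand can be rejected. The cloth hue range and all thresholds below are made-up placeholders; under IR-only illumination the saturation cue disappears entirely.

```python
import cv2
import numpy as np

def find_ball_candidates(frame_bgr):
    """Return a binary mask of likely ball pixels. Thresholds are placeholders."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]

    # Cloth: assume a green felt around hue ~60 (OpenCV hue range is 0-179).
    cloth = (h > 40) & (h < 85) & (s > 60)

    # Ball candidates: well-saturated pixels, i.e. things with a color of their own.
    colored = s > 80

    # Specular reflections (like the one on my hand): bright but nearly colorless.
    # This is exactly the cue that is lost with an IR-only image.
    glare = (v > 200) & (s < 60)

    mask = (colored & ~cloth & ~glare).astype(np.uint8) * 255
    # Clean up small speckles before looking for ball-sized blobs.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask
```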
Another problem is that the projector's output lands inside the area captured by the camera. The captured image therefore includes all the drawings made by the projector, and they may interfere with the computer vision. For example, the software may confuse a projected aiming line with the cue. I'm dealing with this by turning down the brightness of the projected image, so it only creates "translucent" drawings over the table cloth that I can easily filter out in software.
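A minimal sketch of that filtering idea, assuming a reference shot of the empty table is kept: because the projected graphics are dim, they only change the cloth brightness slightly, so a generous difference threshold drops them while real objects (balls, cue, hands) still stand out. The threshold value and the reference-capture step are assumptions, not the project's exact method.

```python
import cv2

# Placeholder: the largest brightness change the faint projection is allowed to cause.
OVERLAY_TOLERANCE = 35

def mask_real_objects(frame_bgr, empty_table_bgr):
    """Binary mask of pixels that differ from the empty table by more than
    the translucent projected overlay could explain."""
    frame = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    empty = cv2.cvtColor(empty_table_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(frame, empty)
    # Anything below the tolerance is treated as projection (or noise) and discarded.
    _, mask = cv2.threshold(diff, OVERLAY_TOLERANCE, 255, cv2.THRESH_BINARY)
    return cv2.medianBlur(mask, 5)
```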