The goal of this project is to make a regular display work like a window into the virtual world. It does this by tracking the user's head with a camera and rendering the scene with an off-axis (asymmetric) projection that matches the user's perspective of the display. The effect becomes more convincing at higher framerates, so performance is critical. Possible uses include entertainment, productivity, or even decoration.
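The README doesn't include code, but the perspective correction it describes is typically done with an asymmetric view frustum: the screen edges, offset by the head position, are projected onto the near plane. Below is a minimal NumPy sketch of that idea; the function name and the screen-centered, metres-based coordinate convention are my own, not taken from the project.

```python
import numpy as np

def off_axis_frustum(eye, half_w, half_h, near, far):
    """Asymmetric view frustum for a screen centered at the origin.

    eye:    head position (x, y, z) in screen units, z > 0 in front
            of the screen.
    half_w, half_h: half the physical screen width/height.
    Returns (left, right, bottom, top) at the near plane (the values
    glFrustum would take) plus the 4x4 projection matrix.
    """
    ex, ey, ez = eye
    # Project the screen edges, offset by the eye, onto the near plane.
    s = near / ez
    left, right = (-half_w - ex) * s, (half_w - ex) * s
    bottom, top = (-half_h - ey) * s, (half_h - ey) * s

    # Standard OpenGL-style asymmetric perspective matrix.
    m = np.zeros((4, 4))
    m[0, 0] = 2 * near / (right - left)
    m[1, 1] = 2 * near / (top - bottom)
    m[0, 2] = (right + left) / (right - left)
    m[1, 2] = (top + bottom) / (top - bottom)
    m[2, 2] = -(far + near) / (far - near)
    m[2, 3] = -2 * far * near / (far - near)
    m[3, 2] = -1.0
    return (left, right, bottom, top), m
```

With the head centered in front of the screen this reduces to an ordinary symmetric frustum; as the head moves right, the frustum shifts left, which produces the "window" parallax.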
The project is written in Python, using OpenCV for image manipulation, Mediapipe for face tracking, and Pygame and OpenGL for 3D rendering. Face tracking works at a wide range of distances, from a few inches to at least six feet away, and at angles you probably wouldn't use a display at. The effect is best for producing small amounts of inward depth; scenes that "pop out" of the screen or recede far into the distance suffer greatly from the lack of stereo depth.
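Mediapipe's face detectors report faces in normalized image coordinates, so the tracker output still has to be turned into a 3D head position before it can drive the projection. One common approach, sketched below with a simple pinhole-camera model, estimates distance from the apparent face size and back-projects the face center; the function, parameter names, and the assumed physical face width are illustrative, not the project's actual code.

```python
def head_position_from_face(cx, cy, face_width_norm,
                            frame_w_px, frame_h_px,
                            focal_px, real_face_width_m=0.15):
    """Estimate head position (metres, camera-relative) from a face box.

    cx, cy:          face-box center in normalized [0, 1] coordinates
    face_width_norm: face-box width as a fraction of the frame width
    focal_px:        camera focal length in pixels (from calibration)
    real_face_width_m: assumed physical face width (~0.15 m, a guess)
    """
    face_px = face_width_norm * frame_w_px
    # Pinhole model: distance = focal_length * real_size / pixel_size.
    z = focal_px * real_face_width_m / face_px
    # Back-project the offset from the principal point (frame center).
    x = (cx - 0.5) * frame_w_px * z / focal_px
    y = (0.5 - cy) * frame_h_px * z / focal_px  # +y points up
    return x, y, z
```

The distance estimate is only as good as the face-width assumption, which is one reason the README's long-term goal of a per-user, per-camera calibration step matters.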
A long-term goal is to turn this into a software tool that lets any display and camera combination be calibrated for use as this type of virtual window.