I have diplopia, which means I see everything double, preventing me from seeing real life in 3D. This project tries to solve this.
As the cameras need to look at the same scene, they have to be exactly parallel. I fixed them on a wooden stick.
As the headset cover will need to be removed and put back in place quite often, I glued tiny magnets on the cameras and the stick.
I wanted a headset with a USB connection to the smartphone. It also turns out my smartphone almost fits in the Samsung Gear VR, so I went for it. I found one second hand for around $20. I had to trim the corners and it worked. Then I had to replace all the internal electronics, incompatible and useless to me, with a plain micro USB cable.
Once everything is put together, it looks like this.
The smartphone sits on the same USB hub as the Raspberry Pi on the right, and both are connected to the left one, which acts as the server for everything.
For video streaming I use https://github.com/hurdaskellir/h264-live-player. It serves a webpage where you can watch the live video. Just run Chrome on the phone and open the webpage served by the Raspberry Pi. At first you have to find its IP on the USB Ethernet network, via ssh or another method, and that's it.
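Finding the Pi on the USB network can be done in a couple of ways. This is a minimal sketch, assuming the virtual interface is named usb0, mDNS (avahi) is running on the Pi with the default hostname, and the streaming server listens on port 8080; all of these are assumptions, so check your own setup.

```shell
# Option 1: resolve the Pi by its mDNS name (assumes avahi is
# running on the Pi and the default "raspberrypi" hostname):
ping -c 1 raspberrypi.local

# Option 2: list neighbours already seen on the USB Ethernet link
# (interface name usb0 is an assumption; check with `ip link`):
ip neigh show dev usb0

# Then, in Chrome on the phone, open the streaming page, e.g.:
#   http://192.168.7.1:8080/
# (address and port are examples; the port depends on how the
# h264-live-player server script is started)
```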
I measured the delay to be lower than 250 ms by taking a picture of both a stopwatch and the smartphone displaying a live video of it. Let's hope I don't get seasick when I use this in front of my eyes... ;-)
The phone I have is a Sony Xperia Z5 Premium with a 4K display, so the resolution should be nice when viewed through the VR headset.
I am trying to use only what I already have on my shelves, to keep the cost as low as possible and because I do not want to wait for spare parts ordered from Asia, with delivery times of around a month.
Here is the idea: I want to use two Raspberry Pi Zeros linked together by a USB cable, one being a USB gadget connected to the other. The video stream between them will be carried entirely over USB virtual network interfaces and routed with iptables. The USB hub will also host an Android smartphone in USB tethering mode, so that it joins the same network and can display the images streamed by both Raspberry Pis.
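The gadget-mode link and the routing described above can be sketched as follows. This is not my exact configuration, just a minimal sketch: the dwc2/g_ether setup is the standard way to get an Ethernet gadget on a Pi Zero, but the interface names (usb0, usb1) and the 192.168.7.x addresses are assumptions for illustration.

```shell
# --- On the gadget Pi Zero (plugged into the server as a USB device) ---
# Enable the USB device controller and the Ethernet gadget driver:
echo "dtoverlay=dwc2" | sudo tee -a /boot/config.txt
echo "dwc2"   | sudo tee -a /etc/modules
echo "g_ether" | sudo tee -a /etc/modules
# After a reboot, give its end of the virtual link a static address:
sudo ip addr add 192.168.7.2/24 dev usb0

# --- On the server Pi (the USB host on the left) ---
# The gadget Pi and the tethering phone each appear as a usb interface
# (names are assumptions; check with `ip link`):
sudo ip addr add 192.168.7.1/24 dev usb0   # link to the gadget Pi
# Allow the server to route the video stream between the two links:
sudo sysctl -w net.ipv4.ip_forward=1
sudo iptables -A FORWARD -i usb0 -o usb1 -j ACCEPT
sudo iptables -A FORWARD -i usb1 -o usb0 -j ACCEPT
```

With forwarding in place, the phone can reach both Pis through the server, so a single browser page can show both camera streams.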
Thank you :-) At the moment, I am playing with WebGL and the gamepad to interact with the web browser on the smartphone and align the two images to match my eyesight. The picture I posted shows a Gear VR with two Raspberry Pi Zeros with PiCams. The two Zeros are configured as RNDIS USB devices on my computer, and everything is displayed on a web page... It lags and is not very efficient. I haven't tried to align anything physically yet, as it was only a first proof of concept. It will require some more precise adjustments. Maybe in the end I will just use an RPi 3 with a 5.5" display (is 4K possible?) and two USB webcams, so that everything is fluid and real time.
Interesting project! How are you going to transmit the video to the smartphone? Maybe after you find the right image shift offset, you can get a prism with the right parameters and make eyeglasses from it.