First idea with the Raspberry Pi camera
We can leave a Raspberry Pi running for quite a long time, and there is plenty of existing code for using the camera. The camera module cannot directly take a picture with a 15 min exposure time. But if, during these 15 min, we stack all the pictures taken every 5 seconds, we can get a similar effect.
Each pixel is coded with 3 bytes for red, green and blue. If our buffer is made of 16 or 32 bit types, we can sum the pixel values of each 5 second image and then, every 15 min, divide the accumulated values by the number of images.
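As a rough sketch of this accumulation, assuming the Python picamera library and numpy (the resolution and interval below are illustrative choices, not values from the project):

```python
import time
import numpy as np
import picamera
import picamera.array

WIDTH, HEIGHT = 1280, 720          # example resolution, an assumption
INTERVAL_S = 5                     # one frame every 5 seconds
FRAMES = (15 * 60) // INTERVAL_S   # 180 frames per 15 min cycle

with picamera.PiCamera(resolution=(WIDTH, HEIGHT)) as camera:
    time.sleep(2)  # give the sensor time to settle

    # 32 bit accumulator; 180 frames at 255 max sums to 45900 at most,
    # so even a 16 bit buffer would be enough
    acc = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint32)

    for _ in range(FRAMES):
        with picamera.array.PiRGBArray(camera) as frame:
            camera.capture(frame, format='rgb')
            acc += frame.array     # sum the 8 bit RGB values per pixel
        time.sleep(INTERVAL_S)

    # the straightforward average: divide by the number of stacked frames
    average = (acc // FRAMES).astype(np.uint8)
```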
However, there is still a problem with averaging the pixel values. Imagine a night scene that is almost dark. If a car passes by, it lights up part of the picture. On a real long exposure picture, the lit area would appear bright or even saturated. In the average, it appears grey: the mean of one bright short moment and all the other dark images.
What we could do instead is normalize the image in such a way that the maximum summed value corresponds to the maximum 8 bit value (255).
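A minimal sketch of that normalization, reusing the `acc` buffer from the snippet above: scale the whole sum by its own maximum instead of dividing by the frame count.

```python
# map the maximum summed value to 255 instead of averaging, so a
# briefly lit area stays bright rather than being diluted to grey
peak = acc.max()
if peak > 0:
    normalized = (acc.astype(np.float32) * (255.0 / peak)).astype(np.uint8)
else:
    normalized = acc.astype(np.uint8)  # entirely black stack
```

One trade-off of this approach: the scale depends on the brightest pixel of each 15 min stack, so successive timelapse frames will not necessarily share the same exposure.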
VERY impressive! This is exactly what I'm trying to achieve; however, my results with raspistill, etc. were far inferior...
I checked your git repo (https://github.com/pierre-muth/TimeJourney), but the code includes some ADC components, which I don't have. Could you help me create a timelapse similar to yours, with RPi / Picam hardware components + code only, please?