Have you ever tried to measure time with a microcontroller? It's quite good for orders of magnitude, or for comparing several durations, but when it comes to precisely keeping track of time, it's another story entirely. Let's have a look at the solutions available (in an Arduino context).
You can use millis().
No, it's a joke.
You can use a dedicated timer and its interrupts. That's the first step in the right direction. Most (if not all, I don't know!) microcontrollers have hardware timers. On the ATmega328P (Arduino Uno, Nano, etc.) you have three timers, two 8-bit (Timers 0 and 2) and one 16-bit (Timer 1). Timer 0 is used by millis(), Timer 1 by the Servo library, if my brain serves me well (in French that would be a pun! Servo / cerveau), and Timer 2 by Tone. PWM uses one of the three, depending on the pin you want to use.
A 16-bit timer seems to be a good choice to keep track of time. You can use the prescaler to make it tick every second, or you can run it at its fastest speed, count overflows, and at regular intervals change the overflow value to get a round count of one second. By doing this, you soon discover that your resonator doesn't run at 16 MHz, but at some 16 MHz-ish speed. OK, no problem, you just need to compensate by comparing against a known accurate time. But then you realize that the clock that was too fast ten minutes ago is now too slow. And we are talking about several seconds over ten minutes. Dead end. The main problem here is that microcontroller boards use ceramic resonators as their clock source, not quartz crystals. And their precision is far from good over the long term: we are talking about ±0.5% absolute precision and ±0.2% drift with temperature. 0.2% is 172 seconds per day, or 7 seconds per hour.
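For reference, here is a minimal sketch of that idea, assuming a 16 MHz ATmega328P and Timer 1 in CTC mode so that it fires exactly once per nominal second (16 MHz / 256 prescaler / 62500 counts = 1 Hz):

```cpp
// Minimal sketch: Timer 1 in CTC mode, one interrupt per nominal
// second on a 16 MHz ATmega328P (16e6 / 256 / 62500 = 1 Hz).
#include <avr/interrupt.h>

volatile uint32_t seconds = 0;

ISR(TIMER1_COMPA_vect) {
  seconds++;                            // one tick per second
}

void setup() {
  cli();
  TCCR1A = 0;                           // CTC mode (WGM12 set in TCCR1B)
  TCCR1B = (1 << WGM12) | (1 << CS12);  // prescaler 256
  OCR1A  = 62500 - 1;                   // top value: 62500 counts = 1 s
  TIMSK1 = (1 << OCIE1A);               // interrupt on compare match
  sei();
}

void loop() {}
```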
One side note: when using this kind of visual synchronization, you also quickly realize that most of the devices you think are precise are not that precise. On my smartphone the 59th second of every minute lasts only half a second; I believe that every minute it receives a reference tick from the network or the service provider. Between those reference ticks it relies only on its internal circuitry, which boils down to the resonator accuracy exposed above. My computer is not a good choice either: with a simple Processing sketch, I can visually measure quite a difference between the Processing update, which seems reliable, and the clock update on my taskbar.
The first thing that comes to mind when you've tried this is "use an RTC module! They are readily available for a few euros (or dollars if you count with them), and they are made for keeping track of time". Yes. That's what I did. Maybe I was unlucky, but the one I tried was far from being anything near reliable: several seconds of drift per hour. Maybe they (DS1307) can be calibrated, but the datasheet doesn't explain how.
On the ATmega328P, if you read the datasheet, you learn that the microcontroller has an internal 8 MHz oscillator that it can run on (that should be enough for a clock!), and the datasheet claims that it can be very accurately calibrated by the user. Promising! You can indeed calibrate this oscillator, and the difference is immediately visible. I stopped at "this value set on the calibration byte gives a clock that is too slow, but the value immediately above is too fast". I don't know if it's stable over the long term, because I couldn't get a frequency that is a multiple of a second. The calibration process I used, described hereafter, could make it a valid choice. I need to test it one day.
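As an illustration, here is a hypothetical sketch of poking that register, assuming the chip is fused to run on its internal oscillator (OSCCAL is the calibration register; higher values speed the oscillator up, and the steps are coarse, which is exactly the problem described above):

```cpp
// Hypothetical sketch: nudging the internal 8 MHz RC oscillator
// through OSCCAL. Print first, then change the value, since the
// UART timing itself depends on the oscillator speed.
#include <avr/io.h>

void setup() {
  Serial.begin(9600);
  Serial.print("Factory OSCCAL: 0x");
  Serial.println(OSCCAL, HEX);
  OSCCAL += 1;   // one step up: the clock now runs slightly faster
}

void loop() {}
```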
On the ATmega328P again, if you read the datasheet further, you also learn that Timer 2 can run asynchronously. That is, it can take its reference from the same clock as the CPU, but it can also use another one, like a pulse input on a specific pin, or an external crystal. If you correlate this with the previous point, it means that you could run the whole microcontroller on its internal clock at 8 MHz, and use a 32.768 kHz crystal to clock Timer 2, which you would then use as the reference for the clock. It still has a noticeable drift, but this time it can probably be calibrated. If watchmakers have known how to do this for 50 years on wristwatches, I believe it can be done on a modern microcontroller. Honestly I don't know if it works; I stopped at this calibration problem on the ATmega328P...
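A sketch of that setup, assuming a 32.768 kHz watch crystal on the TOSC1/TOSC2 pins (which is why the MCU itself has to run on its internal oscillator, selected by fuse): with a prescaler of 128, the 8-bit Timer 2 overflows exactly once per second.

```cpp
// Sketch: Timer 2 clocked asynchronously from a 32.768 kHz crystal,
// overflowing once per second (32768 / 128 / 256 = 1 Hz).
#include <avr/interrupt.h>

volatile uint32_t seconds = 0;

ISR(TIMER2_OVF_vect) {
  seconds++;                            // one overflow per second
}

void setup() {
  ASSR   = (1 << AS2);                  // clock Timer 2 from TOSC1/TOSC2
  TCNT2  = 0;
  TCCR2A = 0;
  TCCR2B = (1 << CS22) | (1 << CS20);   // prescaler 128
  // wait until the asynchronous registers have actually updated
  while (ASSR & ((1 << TCN2UB) | (1 << TCR2AUB) | (1 << TCR2BUB))) {}
  TIMSK2 = (1 << TOIE2);                // overflow interrupt
  sei();
}

void loop() {}
```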
But the exact same thing can be used on the ATtiny816! The main differences are that the ATtiny is a much more modern controller, which can run at 20 MHz on its internal oscillator at any voltage, and that it has an RTC timer made for exactly what we want: counting time. It should be straightforward. Yes, almost. Handling registers on the modern ATtiny is a bit different than on the ATmega. No big deal, the datasheet is quite clear. Except on one point: the naming of the bitshift constants for each register byte. Hardcoding works, but it's less portable. Configuring the timer is quite easy once you've read the datasheet. There are a few examples on the Internet, and I'm pleased to see that where Atmel had Application Notes covering specific points of working with their chips (both software and hardware), Microchip continues this tradition with Getting Started tutorials on how to use this or that part. There you learn the bare minimum to wake the RTC, and also that your crystal needs two load capacitors, which were not needed on the ATmega chip. You soon have a chip with a reliable timer that drifts by a few seconds per hour, but drifts uniformly. That leads to the last part:
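For the record, the RTC setup boils down to something like this sketch, following the datasheet and Microchip's Getting Started note (register and constant names as in the device headers; treat it as a starting point rather than gospel):

```cpp
// Sketch: ATtiny816 RTC clocked from an external 32.768 kHz crystal,
// overflowing once per second. Based on the datasheet and on
// Microchip's "Getting Started with RTC" tutorial.
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint32_t seconds = 0;

ISR(RTC_CNT_vect) {
  RTC.INTFLAGS = RTC_OVF_bm;            // acknowledge the overflow
  seconds++;
}

void setup() {
  // The crystal oscillator control register is under configuration
  // change protection, hence the protected write.
  _PROTECTED_WRITE(CLKCTRL.XOSC32KCTRLA, CLKCTRL_ENABLE_bm);

  while (RTC.STATUS > 0) {}             // wait for register synchronization
  RTC.CLKSEL  = RTC_CLKSEL_TOSC32K_gc;  // clock the RTC from the crystal
  RTC.PER     = 32768 - 1;              // overflow once per second
  RTC.INTCTRL = RTC_OVF_bm;             // interrupt on overflow
  RTC.CTRLA   = RTC_PRESCALER_DIV1_gc | RTC_RTCEN_bm;
  sei();
}

void loop() {}
```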
Calibration! The crystal I used on this project is the most accurate I could find at Mouser. It runs at 32.768 kHz and has a ±15 ppm tolerance (that is, less than 0.5 Hz of deviation from the nominal frequency, or about 1.3 seconds per day at worst). So, how can we calibrate it?
One solution is to set the time, let one day pass, and check the drift after 24 hours. That's slow, and it relies on the user being precise enough.
Another solution is to do the same thing using a computer. I wrote a short Processing program for that: send an arbitrary byte to start a counter on the clock, let ten minutes elapse, then send another byte to stop the counter and request its value, and compare it to the one in the Processing program. There are at least two drawbacks to that approach: if I make several clocks (which I plan to do), I have to leave my computer alone for long stretches, and more importantly, you can't control how USB schedules its packets, so in the end you're not controlling the accuracy of the calibration.
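For completeness, the clock side of that exchange fits in a few lines (the command bytes are hypothetical, and seconds is the counter maintained by the timer interrupt, as in the sketches above):

```cpp
// Minimal sketch of the serial protocol: one byte resets the counter,
// another reports it. 'seconds' is maintained by a timer ISR.
extern volatile uint32_t seconds;

void loop() {
  if (Serial.available()) {
    switch (Serial.read()) {
      case 'S':                         // start: reset the counter
        noInterrupts(); seconds = 0; interrupts();
        break;
      case 'E':                         // stop: report the elapsed count
        Serial.println(seconds);
        break;
    }
  }
}
```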
The solution I used is quite simple: use a GPS as the reference clock. GPS modules are readily available, they are quite cheap, and they are very accurate. Plus, most GPS chips and modules have a "1PPS" (one pulse per second) output. That's great: with this we can trigger an interrupt and have a reliable way of calibrating our system!
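A minimal sketch of the capture side, assuming the 1PPS output is wired to an interrupt-capable pin (the pin number is hypothetical) and a core like megaTinyCore that provides attachInterrupt():

```cpp
// On each GPS pulse, latch the free-running RTC counter so the main
// code can compare successive samples.
const uint8_t PPS_PIN = 2;              // assumption: PPS wired here

volatile uint16_t rtcSample = 0;
volatile bool sampleReady = false;

void onPps() {
  rtcSample = RTC.CNT;                  // snapshot the RTC counter
  sampleReady = true;
}

void setup() {
  pinMode(PPS_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(PPS_PIN), onPps, RISING);
}

void loop() {}
```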
The algorithm I came up with is quite simple, runs in 30 seconds, and is divided into two parts (a sketch of it follows the description):
On each tick of the GPS clock, the current RTC value is sampled. The current sample is compared with the previous one, and the cumulative drift is computed.
After ten seconds, the average drift per tick is computed, and this value is added to or subtracted from the counter top value.
The drift is measured again with the new counter top value, this time over 20 seconds. The cumulative drift is computed again, and two values are determined: one is a correction to apply at a regular interval, and the other is that interval.
With these values known, the main program can now count ticks, and every n ticks it will add or subtract the given correction to the counter. The values are of course saved to EEPROM, so a calibration will not be needed on the next startup.
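Here is a hedged sketch of that two-stage procedure, with hypothetical helper names: waitForPpsSample() is assumed to block until the PPS interrupt above has latched a fresh RTC.CNT value.

```cpp
// Two-stage calibration sketch. Stage 1 (10 s): fold the average
// per-second drift into the counter top value. Stage 2 (20 s):
// turn the residual drift into a periodic correction.
#include <EEPROM.h>

uint16_t waitForPpsSample();            // hypothetical helper, see above

int32_t measureDrift(uint8_t ticks) {
  int32_t cumulated = 0;
  uint16_t prev = waitForPpsSample();
  for (uint8_t i = 0; i < ticks; i++) {
    uint16_t now = waitForPpsSample();
    cumulated += (int16_t)(now - prev); // signed drift over one second
    prev = now;
  }
  return cumulated;
}

void calibrate() {
  // Stage 1: the average drift per tick goes into the top value.
  uint16_t top = (32768 - 1) + measureDrift(10) / 10;
  RTC.PER = top;

  // Stage 2: residual drift becomes "add 'correction' counts every
  // 'interval' seconds".
  int32_t residual = measureDrift(20);
  int8_t  correction = (residual > 0) ? 1 : -1;
  uint16_t interval = (residual != 0) ? (uint16_t)(20 / abs(residual)) : 0;

  // Persist so the next startup can skip calibration.
  EEPROM.put(0, top);
  EEPROM.put(sizeof(top), correction);
  EEPROM.put(sizeof(top) + sizeof(correction), interval);
}
```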
With this simple calibration, an accuracy of half a second of drift per day (about 6 ppm) is easily obtained, which is quite good, and as good as, or even better than, most watches and clocks available on the market.
There is still room for improvement. Better calibration can probably be achieved by simply letting it run a bit longer. Another thing is that crystal drift depends on ambient temperature. The ATtiny, like the ATmega, has a temperature sensor channel on its ADC. This could be used to monitor temperature and compensate the calibration accordingly.
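As a sketch (simplified; the datasheet also recommends tuning the sample length and init delay), reading that channel on the ATtiny816 and applying the factory calibration stored in the signature row could look like this:

```cpp
// Read the internal temperature sensor and convert to kelvin using
// the factory calibration values from the signature row, following
// the procedure given in the ATtiny816 datasheet.
#include <avr/io.h>

uint16_t readTemperatureK() {
  VREF.CTRLA   = VREF_ADC0REFSEL_1V1_gc;       // 1.1 V internal reference
  ADC0.CTRLC   = ADC_REFSEL_INTREF_gc | ADC_PRESC_DIV16_gc;
  ADC0.MUXPOS  = ADC_MUXPOS_TEMPSENSE_gc;      // select the temp sensor
  ADC0.CTRLA   = ADC_ENABLE_bm;

  ADC0.COMMAND = ADC_STCONV_bm;                // start one conversion
  while (!(ADC0.INTFLAGS & ADC_RESRDY_bm)) {}
  uint16_t raw = ADC0.RES;                     // reading clears RESRDY

  int8_t  offset = SIGROW.TEMPSENSE1;          // factory offset
  uint8_t gain   = SIGROW.TEMPSENSE0;          // factory gain
  uint32_t temp  = raw - offset;
  temp *= gain;
  temp += 0x80;                                // round to nearest
  temp >>= 8;
  return (uint16_t)temp;                       // temperature in kelvin
}
```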
Finally, there is one thing left to do: moving the GPS reference clock from its breadboard to another freeform object! And that will probably happen soon.
Discussions
I use DS3231 modules and they are very good. The DS1307 is old. https://hackaday.io/page/9293-a-quick-note-on-ds3231-rtc-modules
About your smartphone and the 59th second, I doubt that. It's probably a quirk of your smartphone's UI. Smartphones get time sync from the Internet or from the mobile carrier's phone signals. Free-running, their accuracy is on par with computer RTCs.
Yes, I've read that the DS3231 is better. I haven't tested it. They are expensive if you source genuine parts from a distributor.
About the smartphone, you may be right about their precision, I don't know. What I wrote is a supposition. It would make sense: you don't really need accuracy on a device that is always connected to the network. I have some old smartphones lying around; I'm going to let them run for a few days without any connection to a mobile carrier or the Internet.
The DS3231 modules for a couple of dollars from AliExpress work fine. As for phone synchronisation, it isn't as simple as polling every minute: Internet synchronisation uses NTP, which dynamically adapts to its sources. Finally, one problem with GPS is that you may need to acquire a satellite signal before you get time data. This will be harder indoors, but obviously phones can do it, with the help of good antennae.