Greetings Freaks -
This is a question about high-level "issues" of timekeeping. I am not talking about how time "ticks" are generated, or about how those ticks are converted into useful time quantities; rather, it is about long-term accuracy. Strictly speaking, it is not an AVR question, since it probably applies to microcontroller clocks in general, or maybe even to all clocks.
Let me frame this question with two assumptions:
1. Let's assume that we have a source of clock ticks close to some standard rate, and that these ticks are moderately adjustable, probably by adjusting the timer roll-over value (though the detail of how is not important; see the rough sketch after these assumptions).
2. Let's assume that we have intermittent access to some external, higher-precision time source. Maybe this is internet time, or perhaps GPS. But the assumption is that access is intermittent, so it cannot be used as THE internal time-base.
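To make assumption 1 concrete, here is a rough sketch (the names and numbers below are made up, and the exact mechanism is not the point):

```c
/*
 * Assumption 1, roughly: the tick rate is set by the timer roll-over
 * (reload) value, so nudging that value nudges the rate.
 * Hypothetical numbers only.
 */
#include <stdint.h>

#define F_TIMER 1000000UL   /* timer input clock, counts per second */

/* Ticks per second produced by a given roll-over value. */
static inline uint32_t ticks_per_second(uint16_t rollover)
{
    return (uint32_t)(F_TIMER / (rollover + 1u));
}
/* rollover = 9999 gives 100 ticks/s; 9998 or 10000 runs slightly fast or slow. */
```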
Now, the question: what sort of algorithm or process is commonly used to correct the internal time source against the external one? I assume that the internally accumulated time is not simply block-updated from the external source, because that would result either in time gaps or, if the internal time were set back, in intervals when the same time value is reported at two different moments. This leads me to suspect that the "better" strategy is to speed up or slow down the internal clock so that the reported time remains continuous. If that is the case, how is the decision made about how much to speed up or slow down? Or is some other method used? A sketch of the kind of slewing I am imagining follows below.
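For illustration only, here is the sort of thing I have in mind (not necessarily what is actually done in practice; all names and constants are hypothetical): each time the reference is available, measure the offset and trim the timer reload by a small, clamped fraction of it, so local time is never stepped, only slewed.

```c
/* Hypothetical sketch of slewing a local clock toward an external reference. */
#include <stdint.h>

#define NOMINAL_RELOAD  9999u   /* roll-over value giving the nominal tick rate */
#define SLEW_GAIN       8       /* apply only a fraction of the offset per sync */
#define MAX_TRIM        50      /* never pull the rate too far from nominal */

static volatile int64_t  local_ms;                     /* accumulated local time, ms */
static volatile uint16_t timer_reload = NOMINAL_RELOAD;

/* Called whenever the external reference (GPS, internet time, ...) is seen. */
void discipline_clock(int64_t reference_ms)
{
    /* Positive offset: the local clock is behind the reference. */
    int64_t offset = reference_ms - local_ms;

    int32_t trim = (int32_t)(offset / SLEW_GAIN);
    if (trim >  MAX_TRIM) trim =  MAX_TRIM;
    if (trim < -MAX_TRIM) trim = -MAX_TRIM;

    /*
     * Behind the reference -> shorten the reload so ticks come slightly
     * faster; ahead -> lengthen it.  Time stays continuous and monotonic.
     */
    timer_reload = (uint16_t)((int32_t)NOMINAL_RELOAD - trim);
}

/* Timer roll-over ISR in a real system: one tick of local time. */
void on_timer_tick(void)
{
    local_ms += 10;   /* e.g. a 10 ms tick at the nominal rate */
    /* reload the hardware timer with timer_reload here */
}
```

My (possibly wrong) understanding is that real disciplining schemes, such as NTP's, also estimate the oscillator's frequency error from the history of offsets rather than reacting only to the latest one, which is part of what I am asking about.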
Many thanks
Jim