Hello all,
I've implemented a fairly basic TC-based timestamping scheme on my SAME70. The aim is to timestamp I/O events within the SAME70 with microsecond precision, for use in further signal/event processing on an external processor. I've daisy-chained the TC0 Timer Counter channels (0, 1 and 2) to implement the timestamp, with the basic setup as follows:
- TC0_0 provides microsecond counting:
- Configure TC0_0 to use the PCK_6 peripheral clock.
- Configure the PCK_6 peripheral clock to use MCK with a prescale value of 150. This provides PCK_6 with a 1MHz clock.
- Configure TC0_0 in wave mode, with RC compare triggering.
- Configure TC0_0 with RA = 500 and RC = 1000. This results in an RC compare trigger every 1ms (1000 ticks @ 1MHz), using a waveform with 50% duty cycle.
- TC0_1 provides millisecond counting:
- Configure TC0_1 to use XC1 as its input.
- Configure TC0_1 in wave mode, with RC compare triggering.
- Configure TC0_1 with RA = 500 and RC = 1000. This results in an RC compare trigger every 1s (1000 ticks @ 1ms/tick), using a waveform with 50% duty cycle.
- TC0_2 provides second counting:
- Configure TC0_2 to use XC2 as its input.
- Configure TC0_2 in wave mode, with RC compare triggering.
- Configure TC0_2 with RA = 30 and RC = 60. This results in an RC compare trigger every 1min (60 ticks @ 1s/tick), using a waveform with 50% duty cycle.
- Note: unlike the previous two TC channels, the TC0_2 RC compare trigger is serviced by an interrupt which increments a minute counter. This functionality isn't relevant to this issue/question, so I won't detail it further.
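In outline, the per-channel setup above looks roughly like the following. This is a simplified sketch using the standard SAME70 CMSIS register/macro names (PMC peripheral clock enables, write-protect handling and the TC0_2 interrupt setup are omitted), not my exact code:

```c
/* PCK_6 = MCK / 150 = 150 MHz / 150 = 1 MHz (PRES field holds divider - 1) */
PMC->PMC_PCK[6] = PMC_PCK_CSS_MCK | PMC_PCK_PRES(150 - 1);
PMC->PMC_SCER   = PMC_SCER_PCK6;
while (!(PMC->PMC_SR & PMC_SR_PCKRDY6)) { }      /* wait for PCK6 ready */

/* TC0_0: microseconds. Clocked from PCK6 (TIMER_CLOCK1 on the SAME70),
 * wave mode, up-count to RC, 50% duty on TIOA0 via RA/RC compares. */
TC0->TC_CHANNEL[0].TC_CMR = TC_CMR_TCCLKS_TIMER_CLOCK1 | TC_CMR_WAVE |
                            TC_CMR_WAVSEL_UP_RC |
                            TC_CMR_ACPA_CLEAR | TC_CMR_ACPC_SET;
TC0->TC_CHANNEL[0].TC_RA = 500;
TC0->TC_CHANNEL[0].TC_RC = 1000;

/* TC0_1: milliseconds, clocked from XC1 (driven by TIOA0 via TC_BMR) */
TC0->TC_CHANNEL[1].TC_CMR = TC_CMR_TCCLKS_XC1 | TC_CMR_WAVE |
                            TC_CMR_WAVSEL_UP_RC |
                            TC_CMR_ACPA_CLEAR | TC_CMR_ACPC_SET;
TC0->TC_CHANNEL[1].TC_RA = 500;
TC0->TC_CHANNEL[1].TC_RC = 1000;

/* TC0_2: seconds, clocked from XC2 (driven by TIOA1 via TC_BMR) */
TC0->TC_CHANNEL[2].TC_CMR = TC_CMR_TCCLKS_XC2 | TC_CMR_WAVE |
                            TC_CMR_WAVSEL_UP_RC |
                            TC_CMR_ACPA_CLEAR | TC_CMR_ACPC_SET;
TC0->TC_CHANNEL[2].TC_RA = 30;
TC0->TC_CHANNEL[2].TC_RC = 60;
```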
I then configure the TC0 block mode so that the output triggers (TIOA) of TC0_0 and TC0_1 are used as the XC1 and XC2 inputs of TC0_1 and TC0_2 respectively. Note that in point 1 above I state that MCK is used as the PCK_6 source; I have tried a range of clock sources (e.g. PLLA, main clock, etc.) with the appropriate PCK_6 prescaler, but the inaccuracy/jitter issue appears to be independent of the selected source clock.
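Concretely, the chaining amounts to a single write to the TC0 block mode register (again a sketch with the standard CMSIS macro names):

```c
/* Route TIOA0 -> XC1 (clock for TC0_1) and TIOA1 -> XC2 (clock for TC0_2) */
TC0->TC_BMR = TC_BMR_TC1XC1S_TIOA0 | TC_BMR_TC2XC2S_TIOA1;

/* Enable each channel's clock, then start all three simultaneously */
TC0->TC_CHANNEL[0].TC_CCR = TC_CCR_CLKEN;
TC0->TC_CHANNEL[1].TC_CCR = TC_CCR_CLKEN;
TC0->TC_CHANNEL[2].TC_CCR = TC_CCR_CLKEN;
TC0->TC_BCR = TC_BCR_SYNC;   /* software trigger to all channels at once */
```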
I have a very simple test circuit in place to validate this timestamp design. I have a GPIO configured as an input, and I capture the timestamp in its corresponding rising/falling edge interrupt handler:
```c
static void PIOC_Interrupt_Handler(uint32_t id, uint32_t mask)
{
    BaseType_t higherPriorityTaskWoken = pdFALSE;

    timestamp = ((TC0->TC_CHANNEL[2].TC_CV & 0x3F) << 20) |
                ((TC0->TC_CHANNEL[1].TC_CV & 0x3FF) << 10) |
                (TC0->TC_CHANNEL[0].TC_CV & 0x3FF);

    // The timestamp value is then sent to an external processor
}
```
As shown above, I am using the FreeRTOS kernel, but I don't believe FreeRTOS is causing any issues (as outlined below). Note that 'timestamp' above is defined and instantiated elsewhere in the class. The timestamp value is a combination of the second, millisecond and microsecond TC values, which is then processed for measurement (e.g. frequency, duty cycle, etc.) on the external processor.
The issue: the recorded timestamp values exhibit a non-trivial amount of jitter, despite the GPIO being provided with a very stable, low frequency input waveform. I'm using very low frequency square wave inputs on the order of 10Hz - 1kHz, so I would expect the generation and handling of the interrupt, and the reading of the TC0 channel values, to complete much faster than the 1us resolution of TC0_0. Below is the disassembly of the 'timestamp' line above:
```
00407646  9b.4b        ldr   r3, [pc, #620]
00407648  d3.f8.90.30  ldr.w r3, [r3, #144]
0040764C  1b.05        lsls  r3, r3, #20
0040764E  03.f0.7c.72  and   r2, r3, #66060288
00407652  98.4b        ldr   r3, [pc, #608]
00407654  1b.6d        ldr   r3, [r3, #80]
00407656  99.02        lsls  r1, r3, #10
00407658  97.4b        ldr   r3, [pc, #604]
0040765A  0b.40        ands  r3, r1
0040765C  1a.43        orrs  r2, r3
0040765E  95.4b        ldr   r3, [pc, #596]
00407660  1b.69        ldr   r3, [r3, #16]
00407662  c3.f3.09.03  ubfx  r3, r3, #0, #10
00407666  13.43        orrs  r3, r2
00407668  4f.f0.00.04  mov.w r4, #0
0040766C  93.4a        ldr   r2, [pc, #588]
0040766E  c2.e9.00.34  strd  r3, r4, [r2]
```
The SAME70 is running at 300MHz, so I would expect the above instructions to be executed in far less than 1us, including the GPIO interrupt latency. However, as my test results below illustrate, the actual timestamp values recorded contain variation between timestamps of over 100us.
The below data shows the measured frequency of the GPIO input (the frequency was calculated on the external processor based on the timestamp values of consecutive edges) using a 10Hz input signal.
| Calculated Frequency (Hz) |
|---|
| 10.005 |
| 9.99261 |
| 9.99251 |
| 10.0048 |
| 10.0048 |
| 10.0048 |
| 9.99271 |
| 9.99271 |
| 10.0048 |
| 10.0049 |
| 10.0048 |
| 9.99251 |
| 9.99271 |
| 10.0048 |
| 10.0048 |
| 10.0049 |
| 9.99261 |
| 9.99261 |
I have validated the input signal with an oscilloscope; the 10Hz input is stable and doesn't exhibit any jitter. I have also added a simple GPIO output in the above PIOC_Interrupt_Handler() function, which asserts or deasserts a spare GPIO pin based on the input signal state. I have verified on the oscilloscope that this output signal is exactly 10Hz as well, with no jitter. As such, I'm somewhat lost as to why the timestamp values contain jitter, given that the hardware appears to be processing the interrupt without any latency/jitter issues whatsoever (as evidenced by the matched GPIO output signal). Note that I've also increased the input signal to 50Hz, 100Hz, 500Hz, 1000Hz, etc. and the result is the same: the GPIO output signal matches the stability and accuracy of the input signal exactly, while the timestamp value contains considerable jitter.
Does anyone have any idea as to why I'm unable to get exact timestamp values with the above approach?