After a couple of days trying to debug an issue, I'm resorting to asking for help!
So, the summary is:
- We have an application which transmits data over UART, and is fully interrupt driven. This data is streamed to a PC and displayed in a graphing system or terminal.
- When executing the program at full speed, the third data byte of each frame loaded into the transmit FIFO goes missing - consistently, every time
- Data buffering is handled by a lower layer FIFO system, such that head/tail management is fully abstracted
- We are using all three UART interrupts: RXc, TXc and UDRe
- RXc just puts the new RX'd data into the RX FIFO
- UDRe gets the next byte for transmission and puts it into UDR
- TXc simply disables the relevant interrupts and disables the transmitter
So, the TX FIFO is loaded up with the data (usually lots of numerical data, but for debugging purposes, an ASCII string: "hello").
But here's the problem:
- If the application is just run at full speed (12MHz, ATmega1284P), we receive "helo", i.e. the third character goes missing every time. This happens regardless of UART speed, buffer contents, or frame length. Note also that polled operation transmits the data without losing the third byte on the same hardware, and the hardware has been in use for over a year without issue.
- With a breakpoint on the TXc interrupt, executing the interrupt by clicking 'Continue' in AS7, "hello" is transmitted over the UART and received just fine on the PC.
- Note, whether run at full speed or single-stepped, I can confirm that the buffer is loaded correctly and does contain the data we want to send.
For some reason, code tags aren't working today - please ask if you need to see the interrupts. It's more of a 'conceptual' question, I suppose: how can I have a race condition of sorts with such a simple task?
All the best