I've noticed, while debugging my code with a lot of printf() statements, that adding more of them adversely affects the performance of my code. I realize that printf() takes time, and that if you overrun your frame time problems can arise. What I don't understand, though, is what the mechanism of this interference actually is.

In my case, all I have done is enable a UART for serial communication with a digital compass. The UART communication works fine until I make the string in my printf() statement too long. At first I thought it was simply a display issue: I assumed my code actually had the right value from the digital compass and it was just coming out wrong on my terminal every now and then because I was issuing the printf() statements too frequently for the terminal to keep up. As it turns out, that was not the case. When the value comes out wrong on my terminal, it really was read in wrong during the UART communication. The problem goes away completely when I disable my printf() statements.

Why would printing to one UART have any bearing on communication over another UART? I'm using ISR-driven receive on the UART connected to the digital compass, so I would have thought that no matter how much time is spent in printf(), or how often it is called, the processor would jump away from it to grab the incoming byte from the compass. Apparently it doesn't work that way.

Anyway, I can live with what's happening, since the problem goes away when I eliminate my printf() statements, but it worries me that I don't understand why it happens. It also makes me wonder whether communicating with multiple devices over two different UARTs would cause similar problems. Does anyone know why this type of behaviour occurs?
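For reference, my receive path is structured roughly like the sketch below. The function names (uart1_init, uart2_read_data_register, parse_compass_frame, and so on) are placeholders for my actual HAL calls, just to show the shape of the code: an ISR stuffs compass bytes into a buffer, and the main loop calls printf(), which goes out over a second UART.

```c
/* Rough sketch of my setup -- the hardware-access function names below are
 * placeholders, not my real vendor HAL calls; the structure is what matters.
 */
#include <stdint.h>
#include <stdio.h>

#define RX_BUF_SIZE 64

/* Placeholder hardware routines (hypothetical names). */
extern void    uart1_init(uint32_t baud);       /* debug UART, printf() target  */
extern void    uart2_init(uint32_t baud);       /* compass UART, ISR receive    */
extern uint8_t uart2_read_data_register(void);  /* read RX byte, clears RX flag */
extern int     parse_compass_frame(void);       /* consumes bytes from rx_buf   */

static volatile uint8_t  rx_buf[RX_BUF_SIZE];
static volatile uint16_t rx_head;               /* written only inside the ISR  */

/* ISR on the compass UART: grab each incoming byte and stash it. */
void UART2_RX_ISR(void)
{
    rx_buf[rx_head] = uart2_read_data_register();
    rx_head = (uint16_t)((rx_head + 1u) % RX_BUF_SIZE);
}

int main(void)
{
    uart1_init(115200u);   /* terminal UART -- printf() output goes out here */
    uart2_init(9600u);     /* digital compass UART                           */

    for (;;)
    {
        int heading = parse_compass_frame();

        /* The longer this format string gets, the more often the compass
         * value arrives corrupted.
         */
        printf("heading = %d\r\n", heading);
    }
}
```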