Hi - so I'm working on a program that is controlled through the UART. The computer attached to the AVR sends commands over serial that set various parameters on the AVR (which is essentially acting as a signal generator). The AVR is programmed so that it is constantly checking the UART for new commands, but the signal-generating part of the code is entirely timer-interrupt driven (so it interrupts the UART-checking code).
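To give a better idea of the structure, here's a stripped-down sketch of what I mean - register names are the ATmega328P-style ones (UCSR0A/UDR0 etc.) and the timer setup is omitted, so treat it as pseudocode for my actual program rather than the real thing:

```c
#include <avr/io.h>
#include <avr/interrupt.h>

/* Signal generation is driven entirely by a timer interrupt (very short for now). */
ISR(TIMER1_COMPA_vect)
{
    PORTB ^= (1 << PB0);              /* placeholder for the real signal output */
}

int main(void)
{
    /* 9600 baud assuming a 16 MHz clock: UBRR = 16e6/16/9600 - 1 = 103 */
    UBRR0H = 0;
    UBRR0L = 103;
    UCSR0B = (1 << RXEN0) | (1 << TXEN0);   /* enable receiver and transmitter */

    /* timer setup (TCCR1B/OCR1A/TIMSK1) omitted */
    sei();

    for (;;) {
        if (UCSR0A & (1 << RXC0)) {   /* poll: has a byte arrived? */
            uint8_t cmd = UDR0;       /* read it out of the UART */
            /* ...parse cmd and update the signal parameters... */
            (void)cmd;
        }
    }
}
```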
I had first been thinking that I would need to send one byte to the AVR, have the AVR send a byte back saying it's ready for the next one, and so on. But just for kicks I tried skipping the acknowledgment byte and just assumed the AVR wouldn't miss any commands. This worked without any problems at all - but I wonder if that's only because the ISRs are very short (I think less than 10 clock cycles) and I was only running the UART at 9600 baud.
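The handshake I originally had in mind was something along these lines (same ATmega328P-style register names as above; the 0x06 / ASCII ACK value is just a byte I picked for illustration):

```c
/* Blocking receive that replies with a "ready for the next byte" acknowledgment. */
static uint8_t get_command_byte(void)
{
    while (!(UCSR0A & (1 << RXC0)))   /* wait for the PC to send a byte */
        ;
    uint8_t b = UDR0;

    while (!(UCSR0A & (1 << UDRE0)))  /* wait until the transmit register is free */
        ;
    UDR0 = 0x06;                      /* tell the PC it's OK to send the next byte */

    return b;
}
```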
So my question is this: Is there some sort of built-in protection in the serial protocol/AVRs/UARTs/computer/etc. that forces the computer to send the next byte only when the AVR is ready for it? My worry is that when the ISRs start to get more complicated (and they will get very complicated and lengthy), I could miss entire commands - say two bytes arrive while the AVR is still in an ISR; would the first be lost when the second one comes in?
So is this a problem? Hopefully I'm explaining myself OK... Thanks!