This may sound vague, but I think it's better to ask a general question before posting all the related code.
I use an ATmega48 to take a number of single ADC conversions from 2 channels, without changing ADMUX while a conversion is in progress. Specifically, I first take 250 results from ADC0, then I disable the ADC, change the channel in ADMUX, re-enable the ADC, and take another 250 single-conversion results from ADC2. ADEN remains set throughout each set of 250 conversions.
An if statement in each while loop (one loop per channel) updates the maximum value found on that channel. The maximum values are stored in a volatile uint16_t array (i.e. Results...). The first conversion result on each channel is discarded, as it should be.
At the end, a series of if statements compare the maximum values of the two channels and show the respective messages on an LCD.
When I debug the code with a JTAGICE mkII and set a breakpoint just before changing ADMUX to switch to ADC2, everything works fine and I get the correct output on the LCD.
However, when the code runs without breakpoints, the output is wrong and it seems(*) that the maximum value on ADC2 equals the one on ADC0.
(*) for the time being I have only tried to tie the inputs to either AVCC or GND.
I suspect a timing-related issue, but since the ADC conversion is paused when a breakpoint is hit, I would expect the same behaviour with or without breakpoints. Interrupts are disabled. I also tried inserting a long delay in place of the breakpoint, but I got the same erroneous result.
How can this depend on the presence or absence of breakpoints? Any ideas?