I'm using an ATxmega32A4U and I'm getting some sort of weird noise/jitter/aliasing when sampling the ADC at certain frequencies. The ADC clock is set to 1.5 MHz, and I have a timer configured to take samples at various frequencies below that. The timer fires an event which tells the ADC to take a sample; when the conversion completes, that triggers a DMA transfer of the result to SRAM.
It "works" but at certain timer frequencies I get crazy noise in the results.
Take a look at the images here. These are all sampling the exact same signal. The samples are clean at 100, 250, 300, 750, 1500 kHz, etc., while the samples at 400, 700 kHz, and other frequencies are filled with horrible noise that varies with the frequency. Note that the vertical scaling is the same in all images: at the "bad" sampling frequencies the amplitude is much higher and shows a regular noise pattern.
It seems to get a clean signal only at rates that divide the ADC clock (1.5 MHz) evenly. The actual ADC clock setting doesn't matter; it can be slower and the behavior is the same, with some sampling rates clean and others full of noise.
Anyone have any idea what is going on? To me it looks like some part of the back end (the DMA, maybe) is transferring the data mid-sample, before the conversion is complete. :(
I made better images to show the issue. I put them all on one page.