DAC jitter on UC3 (with video) [updated/solved]


Update:

Atmel helped debug some of the issues. There are still some weird things going on, but, for the most part, the DAC outputs are much improved once the ADC is disabled. I believe this may be related to how I'm using it, so I take back what I said about this being Atmel's fault. The DAC outputs are being sampled externally by the ADC, which could add significant noise and accuracy problems. That said, even when I remove those inputs from the sequencer, the DAC outputs are still affected by the ADC. The effects are much diminished, though, and the accuracy problem vanished completely. All in all, it works better than I thought, and I'm going to continue using it for now.

=============================================

UC3C0512C

I'm using the PWM peripheral to generate events that trigger the DACs. DACIFB0 and DACIFB1 are both listening for the same event, which causes each DAC to sample on channel A only. The PDCA is feeding data to both DACs, so this should be completely deterministic timing unless I'm misunderstanding something.

First of all, the output waveforms have several hundred nanoseconds of jitter. I can't see why that would happen unless something in my DAC timing configuration is wrong. I've been over it many times and can't find anything.

Secondly, the DACs don't always sample the right values! There seems to be overshoot and undershoot on some of the samples, but it's weirder than that because they often come as a pair - an overshoot and then exactly two samples later an undershoot.

Video: http://www.youtube.com/watch?v=iS61Uvnt_Mg&hd=1

Here's the code in easy to read segments so hopefully someone can find something wrong...

First, the PWM peripheral. The PDCA moves bytes into channel 0 (the only synchronous channel) to generate a sine wave using an external filter. I truncated a lot of this to make it smaller, but the concepts are all here.

volatile U8 nozzle_sine_table[MAGNITUDES][POINTS] = {
    {15,17,15,11,7,5,7,11},
    {25,29,25,17,9,5,9,17},
    ...
};

void Init_PWM( void ) {
    volatile avr32_pwm_t *pwm = &AVR32_PWM;
    volatile avr32_pwm_channel_t *sin_channel = &AVR32_PWM.channel[0];

    // Configure the PWM peripheral
    pwm->CLK.clksel = 1; // Use GCLK_PWM (132 MHz)
    pwm->SCM.sync0  = 1; // Only channel 0 is sync for less PDCA latency
    pwm->SCM.updm   = 2; // Automatic update of duty cycle

    // Configure synchronous channels
    pwm->SCUP.upr          = 0; // Output the same value for UPR+1 periods
    pwm->ELXMR[0].csel0    = 1; // Pulse PWM EVT0 when COMP0 matches
    pwm->comp[0].cmp0v     = 1; // Initial value (can't be zero! dunno why)
    pwm->comp[0].CMP0M.cpr = 3; // Compare twice per sine period (POINTS = 8)
    pwm->comp[0].CMP0M.ctr = 2; // Compare every 3rd PWM period
    pwm->comp[0].CMP0M.cen = 1; // Enable COMP0

    // Configure the sine wave channel
    sin_channel->CMR.cpre = AVR32_PWM_CPRE_CCK;
    sin_channel->cprd = 250; // 132MHz/250=528kHz carrier
    // Duty cycle is updated automatically by the PDCA
}
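
For what it's worth, here's the timing arithmetic I'm expecting from the config above, as a host-side sanity check (the constant names are mine, not ASF symbols):

```c
/* Sanity check of the PWM timing derived from the registers above.
 * GCLK_PWM = 132 MHz, CPRD = 250, 8 sine points per period,
 * COMP0 matching every CPR+1 = 4 PWM periods. */
#include <assert.h>

enum {
    GCLK_PWM_HZ = 132000000, /* GCLK_PWM from PLL0 */
    CPRD        = 250,       /* sin_channel->cprd */
    POINTS      = 8,         /* sine samples per sine period */
    CPR         = 3          /* CMP0M.cpr: match every CPR+1 periods */
};

static unsigned carrier_hz(void) { return GCLK_PWM_HZ / CPRD; }       /* 528 kHz PWM carrier */
static unsigned sine_hz(void)    { return carrier_hz() / POINTS; }    /* 66 kHz sine wave */
static unsigned dac_evt_hz(void) { return carrier_hz() / (CPR + 1); } /* 132 kHz DAC trigger */
```

That lines up with what I measure: a 66 kHz sine and the DAC sampling twice per sine period.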

Config the PEVC to make each DAC listen for the PWM EVT0 trigger:

void Init_PEVC(void) {
    pevc_evs_opt_t pwm_dac_opts = {
        .igfdr = 0x00,
        .igf   = PEVC_EVS_IGF_OFF,
        .evf   = PEVC_EVS_EVF_OFF,
        .evr   = PEVC_EVS_EVR_ON
    };
    pevc_channel_configure(&AVR32_PEVC,
         AVR32_PEVC_ID_USER_DACIFB0_CHA,
         AVR32_PEVC_ID_GEN_PWM_0, &pwm_dac_opts);
    pevc_channels_enable(&AVR32_PEVC,
         (1 << AVR32_PEVC_ID_USER_DACIFB0_CHA));

    pevc_channel_configure(&AVR32_PEVC,
         AVR32_PEVC_ID_USER_DACIFB1_CHA,
         AVR32_PEVC_ID_GEN_PWM_0, &pwm_dac_opts);
    pevc_channels_enable(&AVR32_PEVC,
         (1 << AVR32_PEVC_ID_USER_DACIFB1_CHA));
}
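
Since both DACs get identical routing, the two configure/enable pairs could be folded into one helper. Just a sketch using the same ASF calls; `pevc_route` is my name, not an ASF function:

```c
/* Hypothetical helper: route one PEVC generator to one user channel.
 * Uses the same ASF PEVC driver calls as above. */
static void pevc_route(unsigned int user_id, unsigned int gen_id)
{
    pevc_evs_opt_t opts = {
        .igfdr = 0x00,
        .igf   = PEVC_EVS_IGF_OFF, /* no input glitch filter */
        .evf   = PEVC_EVS_EVF_OFF, /* no falling-edge events */
        .evr   = PEVC_EVS_EVR_ON   /* rising-edge events on */
    };
    pevc_channel_configure(&AVR32_PEVC, user_id, gen_id, &opts);
    pevc_channels_enable(&AVR32_PEVC, 1 << user_id);
}

/* Usage:
 * pevc_route(AVR32_PEVC_ID_USER_DACIFB0_CHA, AVR32_PEVC_ID_GEN_PWM_0);
 * pevc_route(AVR32_PEVC_ID_USER_DACIFB1_CHA, AVR32_PEVC_ID_GEN_PWM_0);
 */
```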

Configure the PDCA. The PWM values are in a ring buffer, and the DAC buffers are updated via interrupts. There should be no latency issue, though, because the interrupt fires as soon as the reload counter reaches zero, while the freshly reloaded buffer still takes a while to play out.

void Init_PDCA(void) {
    // Configure the PDCA channel for the PWM sine wave channel
    const pdca_channel_options_t PWM_PDCA_Opts = {
        .pid = AVR32_PDCA_PID_PWM_TX,
        .transfer_size = PDCA_TRANSFER_SIZE_BYTE,
        .addr   = (void *)nozzle_sine_table[7],
        .size   = PWM_SINE_POINTS,
        .r_addr = (void *)nozzle_sine_table[7],
        .r_size = PWM_SINE_POINTS
    };
    pdca_init_channel(PDCA_CHANNEL_PWM_SINE, &PWM_PDCA_Opts);
    AVR32_PDCA.channel[PDCA_CHANNEL_PWM_SINE].MR.ring = true;
    pdca_enable(PDCA_CHANNEL_PWM_SINE);


    // Configure the PDCA channel for the DAC Tunnel N
    INTC_register_interrupt(&PDCA_IRQ0_Handler,
        AVR32_PDCA_IRQ_0,AVR32_INTC_INT3);
    const pdca_channel_options_t dac0a_opts = {
        .pid = AVR32_PDCA_PID_DACIFB0_CHA_TX,
        .transfer_size = PDCA_TRANSFER_SIZE_HALF_WORD,
        .addr   = (void *)dac_table_n[0],
        .size   = DAC_TUNNEL_POINTS,
        .r_addr = (void *)dac_table_n[1],
        .r_size = DAC_TUNNEL_POINTS
    };
    pdca_init_channel(PDCA_CHANNEL_DAC_TUNNEL_N, &dac0a_opts);
    pdca_enable_interrupt_reload_counter_zero(
         PDCA_CHANNEL_DAC_TUNNEL_N);
    pdca_enable(PDCA_CHANNEL_DAC_TUNNEL_N);


    // Configure the PDCA channel for the DAC Tunnel P
    INTC_register_interrupt(&PDCA_IRQ1_Handler,
        AVR32_PDCA_IRQ_1,AVR32_INTC_INT3);
    const pdca_channel_options_t dac1a_opts = {
        .pid = AVR32_PDCA_PID_DACIFB1_CHA_TX,
        .transfer_size = PDCA_TRANSFER_SIZE_HALF_WORD,
        .addr   = (void *)dac_table_p[0],
        .size   = DAC_TUNNEL_POINTS,
        .r_addr = (void *)dac_table_p[1],
        .r_size = DAC_TUNNEL_POINTS
    };
    pdca_init_channel(PDCA_CHANNEL_DAC_TUNNEL_P, &dac1a_opts);
    pdca_enable_interrupt_reload_counter_zero(
         PDCA_CHANNEL_DAC_TUNNEL_P);
    pdca_enable(PDCA_CHANNEL_DAC_TUNNEL_P);
}

Config the DACs. Maybe I did something wrong with the timing, but that's all I can think to check at the moment.

void Init_DAC( void ) {
    volatile avr32_dacifb_t *dac0 = &AVR32_DACIFB0;
    volatile avr32_dacifb_t *dac1 = &AVR32_DACIFB1;

    dacifb_opt_t dac0a_cfg = {
        .reference            = DACIFB_REFERENCE_EXT,
        .channel_selection    = DACIFB_CHANNEL_SELECTION_A,
        .low_power            = false,
        .dual                 = false,
        .prescaler_clock_hz   = F_CPU/16
    };
    dacifb_opt_t dac1a_cfg = {
        .reference            = DACIFB_REFERENCE_EXT,
        .channel_selection    = DACIFB_CHANNEL_SELECTION_A,
        .low_power            = false,
        .dual                 = false,
        .prescaler_clock_hz   = F_CPU/16
    };

    // Get calibration data
    dacifb_get_calibration_data(&AVR32_DACIFB0, &dac0a_cfg,
                                 DAC_INSTANCE_TUNNEL_N);
    dacifb_get_calibration_data(&AVR32_DACIFB1, &dac1a_cfg,
                                 DAC_INSTANCE_TUNNEL_P);
    dacifb_configure(&AVR32_DACIFB0, &dac0a_cfg, F_CPU);
    dacifb_configure(&AVR32_DACIFB1, &dac1a_cfg, F_CPU);

    dac0->CR.aoe = 1;       // Channel A Output Enable
    dac0->CFR.aae = 1;      // Channel A Event Trigger
    dac0->CFR.chc = 1;      // Channel A only
    dac0->ECR.esla = 1;     // PEVC Triggers Conversions
    dac0->TCR.chi = 8;      // Channel Interval Control
    dac0->TCR.presc = 4;    // Prescaler Clock Divider
    dac0->CR.en = 1;    // DAC Enable

    dac1->CR.aoe = 1;       // Channel A Output Enable
    dac1->CFR.aae = 1;      // Channel A Event Trigger
    dac1->CFR.chc = 1;      // Channel A only
    dac1->ECR.esla = 1;     // PEVC Triggers Conversions
    dac1->TCR.chi = 8;      // Channel Interval Control
    dac1->TCR.presc = 4;    // Prescaler Clock Divider
    dac1->CR.en = 1;    // DAC Enable
}

Lastly, the PDCA ISRs for updating the DAC reload registers.

__attribute__((__interrupt__))
void PDCA_IRQ0_Handler(void){
    static U8 counter = 1; // TCRR starts at [1]

    if(++counter >= DAC_TUNNEL_SETS) {counter = 0;}

    pdca_reload_channel(PDCA_CHANNEL_DAC_TUNNEL_N,
    (void *)dac_table_n[counter], DAC_TUNNEL_POINTS);
}
__attribute__((__interrupt__))
void PDCA_IRQ1_Handler(void){
    static U8 counter = 1; // TCRR starts at [1]

    if(++counter >= DAC_TUNNEL_SETS) {counter = 0;}

    pdca_reload_channel(PDCA_CHANNEL_DAC_TUNNEL_P,
    (void *)dac_table_p[counter], DAC_TUNNEL_POINTS);
}
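
For clarity, here's the buffer-index sequence those ISRs walk through, as a host-side sketch (DAC_TUNNEL_SETS is 4 here purely for illustration; the real value comes from the project):

```c
/* Host-side model of the ISR's ring-buffer index logic. The counter
 * starts at 1 because the PDCA reload register (TCRR) was preloaded
 * with dac_table[1], so the first reload must point at index 2. */
#include <assert.h>

#define DAC_TUNNEL_SETS 4 /* illustrative value */

typedef unsigned char U8;

static U8 next_set(U8 *counter)
{
    if (++*counter >= DAC_TUNNEL_SETS) { *counter = 0; }
    return *counter; /* index of the next buffer to hand to the PDCA */
}
```

Starting from 1, the sequence is 2, 3, 0, 1, 2, ... so every buffer gets played exactly once per cycle.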

As you can see below, it works quite well actually, but take a look at the close up view to see the jitter on the scope. I have the intensity turned up so you can see that the DACs are sampling too high and too low on many cycles. Also, you can see the edge jitter. The trigger is on the sine wave (yellow) that's being generated by the PWM peripheral. That signal is very clean, so the jitter isn't because of scope triggering. I checked it against a square wave output on another PWM channel.

The blue and green channels are the outputs of amplifiers being driven by the DACs. The gain is 3. The red channel is the output of a differential amplifier being driven by the two gain stages. The gain of this stage is 2 (See the attached schematic).

Attachment(s): 

Last Edited: Fri. Oct 18, 2013 - 03:30 PM

An additional data point:

This could somehow be related to the PWM event generation. When I update the CTR register (the register that defines which PWM period triggers the event), it seems to only move one edge of the DAC waveform. That's definitely weird because it should move both edges. I'll post a video later because it's very obvious.

The clocks should all be synchronous, right? This is my clock setup:

12 MHz external clock -> PLL0 -> 132 MHz
12 MHz external clock -> PLL1 -> 48 MHz (USB)

PLL0 -> GCLK_PWM (No divider, 132 MHz)
PLL0 -> CPU/PBA/PBB/PBC (Divide by 2) 66 MHz
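
Spelling that out (the x11 PLL multiplier is my assumption; only the 132 MHz output is stated above):

```c
/* Sanity check of the clock tree. Everything that matters here is
 * derived from PLL0, so the domains should be frequency-locked. */
#include <assert.h>

enum {
    OSC0_HZ  = 12000000,           /* 12 MHz external clock */
    PLL0_MUL = 11,                 /* assumed multiplier: 12 MHz * 11 */
    PLL0_HZ  = OSC0_HZ * PLL0_MUL, /* 132 MHz */
    GCLK_PWM_HZ = PLL0_HZ,         /* undivided */
    CPU_HZ      = PLL0_HZ / 2      /* CPU/PBA/PBB/PBC at 66 MHz */
};
```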


I tried changing the event source for the DAC to GCLK7 in order to prove that the PWM peripheral wasn't the cause of the jitter issue.

This line enables GCLK7 at 132 kHz, which is the rate I need the DAC to sample at:

genclk_enable_config(AVR32_SCIF_GCLK_GCLK7, GENCLK_SRC_CLK_CPU, 500);

This maps the GCLK7 output to the event channel that the DAC is using as its trigger source:

pevc_channel_configure(&AVR32_PEVC, AVR32_PEVC_ID_USER_DACIFB0_CHA, AVR32_PEVC_ID_GEN_GCLK_0, &pwm_dac_opts);

The result is the same as when the PWM triggers the DAC: a large amount of jitter, as you can see below. The two scope shots show a zoomed-out view where the DAC output is frequency-matched to the sine wave; the sine wave is 66 kHz, so the DAC output is also 66 kHz because they track exactly. That also confirms the GCLK setup is correct: the DAC samples one high and one low value per sine period, hence the 132 kHz GCLK frequency for a 66 kHz DAC output waveform.

The trigger is the rising edge of the red channel (DAC output). The second shot shows the falling edge of the DAC output. Also visible in that picture is the sine wave output that's generated by the PWM peripheral. It's pretty well behaved as you can see. The DAC, however, shows over 200 ns of jitter. It really seems like the PEVC isn't routing triggers cycle deterministically like the datasheet claims.
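
The arithmetic I'm relying on, as a quick host-side check (how the divider argument maps to the SCIF DIV field depends on the driver encoding, so the 500 is taken at face value here):

```c
/* GCLK7 trigger rate and the resulting DAC waveform frequency,
 * assuming the divider argument divides the 66 MHz CPU clock directly. */
#include <assert.h>

enum {
    CPU_HZ      = 66000000,
    GCLK_DIV    = 500,              /* the 500 passed to genclk_enable_config */
    GCLK7_HZ    = CPU_HZ / GCLK_DIV, /* 132 kHz DAC trigger rate */
    DAC_WAVE_HZ = GCLK7_HZ / 2      /* one high + one low sample per period */
};
```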

Attachment(s): 


Here are two more shots of the very poor sample accuracy. One is an overlay of many cycles, and the other is a single capture showing an extreme miss on the far right side of the green wave. Keep in mind that this is after a gain stage of 3, so the DAC output missed by nearly 100 mV; the screenshot shows approximately 300 mV.

Attachment(s): 


I've come to the conclusion that the DAC is just a piece of crap. I know that isn't very technical, but there's really no other explanation. I've tried every possible combination of timing settings, event triggers, internal DAC triggers, and every other configuration bit. The jitter seems to be related to the CLK_DACIFB frequency, which is set by the prescaler; someone sent me a PM that helped me realize this. After multiple tests, the jitter was definitely reduced when the clock frequency was increased, so the very coarse clock prescaler resolution is most likely the root of the problem.
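
The rough numbers behind that: if the event trigger is resynchronized into the CLK_DACIFB domain, it can land anywhere within about one DAC clock period, so a slower DAC clock means proportionally more jitter. A host-side sketch, assuming the 66 MHz CPU clock from the setup above:

```c
/* Worst-case resynchronization uncertainty as a function of the
 * CLK_DACIFB prescaler divider, assuming a 66 MHz source clock. */
#include <assert.h>

enum { CPU_HZ = 66000000 };

static unsigned dac_clk_hz(unsigned div) { return CPU_HZ / div; }
static unsigned period_ns(unsigned hz)   { return 1000000000u / hz; }
/* div = 8  -> 8.25 MHz -> ~121 ns of trigger uncertainty
 * div = 16 -> 4.125 MHz -> ~242 ns */
```

One or two periods of an 8.25 MHz CLK_DACIFB is in the same ballpark as the 200+ ns of jitter on the scope, which fits the observation that raising the clock reduces it.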

I still have no idea why the DACs are failing to sample the values correctly. The DC accuracy is absolutely horrific and makes the DAC useless. I know I've seen other people say the same thing, but I was hoping they were doing something wrong. Maybe I'm doing something wrong, but it shouldn't be this hard to make the DAC work correctly. I'm going to move to another vendor of 32bit micro because I don't have any more time to spend debugging Atmel's silicon and vague datasheets. Very disappointing.


Perhaps tweaking the operation of the HSB Bus Matrix (HMATRIXB) might improve things.


mikech wrote:
Perhaps tweaking the operation of the HSB Bus Matrix (HMATRIXB) might improve things.

I tried your suggestion, but the DC accuracy and jitter issues remain. The jitter is still present even with interrupt-driven DAC sampling, which removes all the timing variables. It just sucks.

The DC accuracy problem persists even on boards that don't have anything else populated. I've used the DAC on this particular microcontroller in the past with some success, but it clearly can't meet the specs listed in the datasheet. I never used it at 'high' frequency. The DAC is sampling at 132 kHz whereas all of the examples are running far slower.

I already converted my design to use a discrete current DAC. I didn't want to have to use an external part, but accuracy is critical in this application.


Atmel support replied to my email and gave me some ideas. They also took the time to create a project to help figure this out, so kudos to Atmel for the effort. In the example they provided, they set the DAC clock frequency to FOSC0/2, which is 8 MHz. I have mine set to F_CPU/8, which is 8.25 MHz. I was also manually setting CHI and PRESC later, so I took those out to see if the ASF DAC driver would help. Somewhat surprisingly, that helped the situation a lot.

In the end, the culprit was actually the ADC. I was in the process of reducing my project down to the absolute minimum in order to send it to Atmel for help. I tried it after doing that and the DAC seemed to be working quite well. The jitter was gone except for what can be expected due to clock accuracy and the DC accuracy problem completely vanished. I started to add one thing at a time until the ADC was the obvious problem.

Enabling the ADC at lower frequency helps some, but turning it on at all is a significant performance penalty. I definitely admit that it could be a routing problem on my board, but it seems somewhat unlikely. It could also be the fact that I'm trying to sample the DAC outputs, so I'm going to buffer both signals and send them right back to the ADC externally on the next revision.

The datasheet is still far too sparse, but at least the silicon appears to work for the most part. Thanks for the support, Atmel - the UC3C0512C might just work for my application now.


Hello,

Is it possible for you to post the entire source here? I'm struggling with a similar problem.

Kind regards
Kurt