XMEGA DAC using DMA

11 posts
#1

I am trying to generate a sine wave using DMA and the DAC. I am not using the event system. I have the following code, and for some reason it seems like my DMA transfer never occurs. Can anyone suggest why this is not working?

	DACA.CTRLB = (DACA.CTRLB & ~DAC_CHSEL_gm) | DAC_CHSEL_SINGLE_gc;
	DACA.CTRLC = (DACA.CTRLC & ~(DAC_REFSEL_gm | DAC_LEFTADJ_bm)) |
				  DAC_REFSEL_INT1V_gc;
	DACA.TIMCTRL = DAC_CONINTVAL_32CLK_gc;
	DACA.CTRLA |= DAC_CH0EN_bm | DAC_ENABLE_bm;

	/* setup DMA input to DAC */
	DMA.CH0.ADDRCTRL = DMA_CH_SRCRELOAD0_bm | DMA_CH_SRCDIR_INC_gc | DMA_CH_DESTRELOAD_NONE_gc | DMA_CH_DESTDIR_FIXED_gc; //reload and increment source
	DMA.CH0.TRIGSRC = DMA_CH_TRIGSRC_DACA_CH0_gc; // use DACA channel 0 as trigger source
	DMA.CH0.TRFCNT = (UINT8)(sizeof(SinTable)/sizeof(INT16));
	DMA.CH0.SRCADDR0 = ((UINT16)(&SinTable[0]) >> 0U) & 0xFFU;
	DMA.CH0.SRCADDR1 = ((UINT16)(&SinTable[0]) >> 8U) & 0xFFU;
	DMA.CH0.SRCADDR2 = 0U; //((UINT16)(&SinTable[0]) >> 16) & 0xFF;
	DMA.CH0.DESTADDR0 = ((UINT16)(&DACA.CH0DATA) >> 0U) & 0xFFU;
	DMA.CH0.DESTADDR1 = ((UINT16)(&DACA.CH0DATA) >> 8U) & 0xFFU;
	DMA.CH0.DESTADDR2 = 0U; //((UINT16)(&DACA.CH0DATA) >> 16) & 0xFF;
	DMA.CH0.CTRLA = DMA_CH_REPEAT_bm | DMA_CH_BURSTLEN_2BYTE_gc;
	DMA.CH0.CTRLA |= DMA_CH_ENABLE_bm | DMA_CH_TRFREQ_bm; // | DMA_CH_SINGLE_bm;
	DMA.CTRL = DMA_ENABLE_bm;	//enable DMA with single buffer, round robin

SinTable is a 200-entry array of 16-bit values. Only the least significant 12 bits carry the sine data; the upper 4 bits are always 0.

When I run my program my DAC data (DACA.CH0DATA) is always 0.

Any help would be appreciated.

#2

They put so many configuration options into the XMEGA's DMA that it's tricky to get them all right. I always use the AVR Studio simulator and/or the JTAG debugger with the JTAGICE to test my DMA code. Since you're triggering off the DAC conversion, I think the simulator should run your code. Is it possible you need to trigger a single DAC conversion by writing to CH0DATA to get things going?

-Paul

#3

wandererwolf wrote:
They put so many configuration options into the XMEGA's DMA that it's tricky to get them all right. I always use the AVR Studio simulator and/or the JTAG debugger with the JTAGICE to test my DMA code. Since you're triggering off the DAC conversion, I think the simulator should run your code. Is it possible you need to trigger a single DAC conversion by writing to CH0DATA to get things going?

-Paul

I added the following line to the end of the code above. Still nothing.

DACA.CH0DATA = SinTable[0];

The DMA controller seems to be advancing properly: the channel source address increments, the channel block transfer count decrements, and when it gets to the end it starts over. But the DAC data never changes.

#4

Looking at the memory map, the DMA transfer is writing to 0x30A while the destination address is 0x318 (DACA.CH0DATA). Any thoughts on why this is happening? I am using the AVR ONE! for debugging, if that makes any difference.

#5

I have developed an arbitrary waveform generator using the XMEGA's DMA and DAC.
http://www.gabotronics.com/development-boards/xmega-xmultikit.htm
You can see the source code and some demonstration videos on my webpage.

#6

I see what you are doing, but I am not using the event system. When I set it up without events and the DMA starts, the channel data goes to 0x30A or 0x32A, depending on whether I am using DAC channel A or channel B. Also, I am trying to use 12-bit mode, but all of my writes are only 8 bits wide.

If I use 8-bit left-adjusted mode and write to CH0DATAH, then everything works as expected and the DMA address is correct.

Also, single word writes to CH0DATA produce correct results, so I know it isn't a problem with the address lines in the part.

#7

For 16-bit transfers I think you need to configure ADDRCTRL like this:

   DMA.CH0.ADDRCTRL = DMA_CH_SRCRELOAD0_bm | DMA_CH_SRCDIR_INC_gc | DMA_CH_DESTRELOAD_BURST_gc | DMA_CH_DESTDIR_INC_gc; // reload source and destination, increment both 

The DMA needs to do two 8-bit transfers to two different addresses, I think, so you want to increment the destination to load the low and high bytes.
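To see why the destination direction matters, here is a small host-side model of the destination pointer during one 2-byte burst (the address comes from the thread; this is an illustration, not XMEGA code):

```c
#include <stdint.h>

#define BURST_LEN    2
#define CH0DATA_ADDR 0x0318u  /* DACA.CH0DATA low byte, per the thread */

/* Record which byte addresses one 2-byte burst touches, with the
 * destination either fixed or incrementing within the burst. */
static void burst_addrs(int dest_inc, uint16_t out[BURST_LEN])
{
    uint16_t dest = CH0DATA_ADDR;
    for (int i = 0; i < BURST_LEN; i++) {
        out[i] = dest;        /* byte i of the burst lands here */
        if (dest_inc)
            dest++;           /* DESTDIR_INC: step from CH0DATAL to CH0DATAH */
    }
    /* with DESTRELOAD_BURST the pointer snaps back to CH0DATA_ADDR here,
     * ready for the next 2-byte burst */
}
```

With a fixed destination, both bytes of the burst hit CH0DATAL and CH0DATAH is never written, which is consistent with the 12-bit value never appearing.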

-Paul

#8

Thanks Paul that fixed my problem.

Now, for some reason I seem to have broken my repeat-forever. I thought it was working, but maybe it never was. The documentation says that if the repeat flag in DMA.CH0.CTRLA is set and REPCNT is 0, then it will repeat forever. I am getting one pass through and then it stops.

The DMA channel is still enabled, and the source and destination addresses are reloaded, but the transfer doesn't start again. Is there something I am missing?

#9

Yeah, the simulation I ran did that too, but I chalked it up to something wrong with my implementation. Interestingly, if I run TCC0 and trigger the DMA transfer off of that, the DMA runs continuously; when I select the DAC as the trigger source, it stops after one block. Very strange.

-Paul

#10

Hi,

TRFCNT must be set in bytes, not in (16-bit) words. So you have to multiply this by two?
DMA.CH0.TRFCNT = (UINT8)(sizeof(SinTable)/sizeof(INT16));
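For a 16-bit table, that works out to sizeof(SinTable) directly; a quick host-side check (table name taken from the original post):

```c
#include <stdint.h>

#define TABLE_SIZE 200
static uint16_t SinTable[TABLE_SIZE];

/* TRFCNT counts bytes, so a 200-entry 16-bit table needs 400,
 * i.e. sizeof(SinTable), not sizeof(SinTable)/sizeof(int16_t). */
static uint16_t trfcnt_for_table(void)
{
    return (uint16_t)sizeof(SinTable);
}
```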

#11

You need repeat, but you also need to do a 2-byte burst each time the DAC is finished with the previous conversion; enabling single-shot mode should accomplish this. TRFCNT then needs to be the size of your sine table (or possibly half that when using 2-byte bursts).

Single-shot mode is almost always used when doing peripheral-synced DMA, btw.
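Putting the thread's fixes together, the corrected channel setup would look roughly like this (an untested sketch; the register and group-configuration names are from avr-libc's XMEGA headers, and the source/destination address registers are set as in the original post):

```c
/* Corrected DMA channel setup (sketch, untested):
 *  - destination increments across CH0DATAL/CH0DATAH, reloads after each burst
 *  - TRFCNT is a byte count, so it is sizeof(SinTable), not the entry count
 *  - REPCNT = 0 with REPEAT set means repeat forever
 *  - single-shot mode: one 2-byte burst per DAC trigger
 */
DMA.CH0.ADDRCTRL = DMA_CH_SRCRELOAD_BLOCK_gc | DMA_CH_SRCDIR_INC_gc |
                   DMA_CH_DESTRELOAD_BURST_gc | DMA_CH_DESTDIR_INC_gc;
DMA.CH0.TRIGSRC  = DMA_CH_TRIGSRC_DACA_CH0_gc;
DMA.CH0.TRFCNT   = sizeof(SinTable);   /* bytes: 200 entries * 2 */
DMA.CH0.REPCNT   = 0;                  /* with REPEAT set: repeat forever */
/* SRCADDR0..2 and DESTADDR0..2 set as in the original post */
DMA.CH0.CTRLA    = DMA_CH_REPEAT_bm | DMA_CH_SINGLE_bm |
                   DMA_CH_BURSTLEN_2BYTE_gc | DMA_CH_ENABLE_bm;
DMA.CTRL         = DMA_ENABLE_bm;
```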