Problems with ATxmega32D4 USART comm with 32 MHz clock

#1

I am trying to communicate at 9600 baud, 8N1, via the Xmega32's USART. However, I can't get it to work when the system/peripheral clock is set to 32 MHz.

- When the system and peripheral clocks are set to 2 MHz, it works. (I made the appropriate changes to BAUDCTRLA and BAUDCTRLB.)

- When I use the 32 MHz clock and connect to another board with the same Atmel chip, communication succeeds. So that seems to imply that either the system clock isn't really running at 32 MHz or I'm using the wrong BAUDCTRLA and BAUDCTRLB values. (If the chip were actually still on its 2 MHz start-up clock while the baud registers assumed 32 MHz, both boards would really be talking at roughly 600 baud, which would explain why two identically configured Xmegas still communicate with each other.)

Here's the system clock initialization:

	unsigned char n;

	// Internal 32 kHz RC oscillator initialization
	// Enable the internal 32 kHz RC oscillator
	OSC.CTRL|=OSC_RC32KEN_bm;
	// Wait for the internal 32 kHz RC oscillator to stabilize
	while ((OSC.STATUS & OSC_RC32KRDY_bm)==0);

	// Internal 32 MHz RC oscillator initialization
	// Enable the internal 32 MHz RC oscillator
	OSC.CTRL|=OSC_RC32MEN_bm;

	// System Clock prescaler A division factor: 1
	// System Clock prescalers B & C division factors: B:1, C:1
	// ClkPer4: 32000.000 kHz
	// ClkPer2: 32000.000 kHz
	// ClkPer:  32000.000 kHz
	// ClkCPU:  32000.000 kHz
	n=(CLK.PSCTRL & (~(CLK_PSADIV_gm | CLK_PSBCDIV1_bm | CLK_PSBCDIV0_bm))) |
	CLK_PSADIV_1_gc | CLK_PSBCDIV_1_1_gc;
	CCP=CCP_IOREG_gc;
	CLK.PSCTRL=n;

	// Internal 32 MHz RC osc. calibration reference clock source: 32.768 kHz Internal Osc.
	OSC.DFLLCTRL&= ~(OSC_RC32MCREF_bm | OSC_RC2MCREF_bm);
	// Enable the autocalibration of the internal 32 MHz RC oscillator
	DFLLRC32M.CTRL|=DFLL_ENABLE_bm;

	// Wait for the internal 32 MHz RC oscillator to stabilize
	while ((OSC.STATUS & OSC_RC32MRDY_bm)==0);

	// Select the system clock source: 32 MHz Internal RC Osc.
	n=(CLK.CTRL & (~CLK_SCLKSEL_gm)) | CLK_SCLKSEL_RC32M_gc;
	CCP=CCP_IOREG_gc;
	CLK.CTRL=n;

	// Disable the unused oscillators: 2 MHz, external clock/crystal oscillator, PLL
	OSC.CTRL&= ~(OSC_RC2MEN_bm | OSC_XOSCEN_bm | OSC_PLLEN_bm);

	// ClkPer output: Disabled bit 7
	PORTCFG.CLKEVOUT=(PORTCFG.CLKEVOUT & (~PORTCFG_CLKOUT_gm)) | PORTCFG_CLKOUT_OFF_gc;

The BAUDCTRLA and BAUDCTRLB values are as given by Atmel's baudrate_calculations Excel spreadsheet. I tried both a BSCALE value of -4 and a BSCALE of 0, using the appropriate corresponding BSEL value for each.
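
For reference, here is how those register values fall out of the datasheet formulas. This is just a sketch of mine (usart_set_baudctrl is a made-up helper name, not from CodeWizardAVR or the spreadsheet), assuming fPER = 32 MHz and CLK2X = 0:

	#include <avr/io.h>
	#include <stdint.h>

	// Sketch only: BAUDCTRLA takes BSEL[7:0]; BAUDCTRLB takes BSEL[11:8] in its
	// low nibble and the signed 4-bit BSCALE in its high nibble.
	static void usart_set_baudctrl(USART_t *usart, int8_t bscale, uint16_t bsel)
	{
		usart->BAUDCTRLA = (uint8_t)(bsel & 0xFF);
		usart->BAUDCTRLB = (uint8_t)((((uint8_t)bscale << USART_BSCALE_gp) & USART_BSCALE_gm)
		                           | ((bsel >> 8) & 0x0F));
	}

	// At 32 MHz, 9600 baud, CLK2X = 0 (normal speed):
	//   BSCALE =  0: BSEL = 32000000/(16*9600) - 1       = 207.3  -> 207  (0x0CF)
	//   BSCALE = -4: BSEL = 16*(32000000/(16*9600) - 1)  = 3317.3 -> 3317 (0xCF5)
	// e.g. usart_set_baudctrl(&USARTC0, -4, 3317);  or  usart_set_baudctrl(&USARTC0, 0, 207);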

Here's the code using BSCALE of -4:

    //Required Baud rate: 9600
	// BAUDCTRLA and B calculated by Atmel excel spreadsheet
	// BSEL = 3317 = 0x0CF5
	// BSCALE = -4 = 1100b = 0xC (BSCALE is the high nibble of BAUDCTRLB, so shift it left by 4)
	// BAUDCTRLA = 0xF5
	// BAUDCTRLB = (BSCALE << 4) | 0x0C
	// X2 = 0  (do not double clock)
	// ERROR% = 0.01%
	USARTC0.BAUDCTRLA = 0xF5;
	USARTC0.BAUDCTRLB = ( 0XC << USART_BSCALE_gp) | 0X0C;

Here's the other attempt using BSCALE of 0 and appropriate BSEL:

    //Required Baud rate: 9600
	// BAUDCTRLA and B calculated by Atmel excel spreadsheet
	// BSEL = 207 = 0xCF
	// BSCALE = 0
	// BAUDCTRLA = 0XCF
	// BAUDCTRLB = 0
	// X2 = 0  (do not double clock)
	// ERROR% = 0.16%
//	USARTC0.BAUDCTRLA = 0xCF;
//	USARTC0.BAUDCTRLB = 0;

As I mentioned, I can get it to work using a clock of 2 MHz or by connecting two Xmegas together.

Thank you.

#2

Welcome to the Forum.

I don't use C, so I'll let others review your program.

That said, it appears that for 32 MHz, 9600, N, 8, 1, baudctrla = 207 and baudctrlb = 0, which is what you have, and which works for me.

As it works at 2 MHz, or from "32 MHz" to "32 MHz", it would appear that perhaps you are not really getting the clock set to 32 MHz.

Can you set your clock to 32 MHz, as you have been doing, and then simply flash an LED, once per second, and count it for a minute?

This will confirm that the clock is actually running at (about) 32 MHz, and not at something significantly different. (It's probably not at 32 MHz.)

If you have an O'scope you could also look at the USART's output, toggle an output bit at 1 kHz, or route the system clock to an output pin (ClockOut) and measure it.

JC

#3

DocJC wrote:
Welcome to the Forum.

I don't use C, so I'll let others review your program.

That said, it appears that for 32 MHz, 9600, N, 8, 1, baudctrla = 207 and baudctrlb = 0, which is what you have, and which works for me.

JC

JC,

Great suggestions. I don't have an O'scope (I normally only do software, but I have been doing a lot of MSP430 programming). I'll do the blinking-LED test; thanks for the suggestion.
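
Something like this is what I have in mind for the test (untested sketch; the LED pin is just an assumption for whatever board is handy):

	// With F_CPU claimed to be 32 MHz, blink an LED at 1 Hz and count the blinks
	// for a minute.  If the chip is really still on its 2 MHz start-up clock,
	// the blinking will be about 16x too slow.
	// Build with optimisation enabled so the <util/delay.h> delays are accurate.
	#define F_CPU 32000000UL
	#include <avr/io.h>
	#include <util/delay.h>

	int main(void)
	{
		PORTE.DIRSET = PIN0_bm;          // assumed LED on PE0
		for (;;) {
			PORTE.OUTTGL = PIN0_bm;      // toggle the LED
			_delay_ms(500);              // 500 ms high + 500 ms low = 1 Hz
		}
	}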

In case no one else replies, and even though you don't use C, could you post your clock initialization code for setting the clock to 32 MHz? Whatever language you are using should still give me a clue, unless it hides all that detail.

I have to admit that TI's examples for the MSP430 series are much better. They are always very short (read: no function calls), to the point, and include individual examples for every possible peripheral configuration.

Jim Y.

#4

OK, you asked for it...

Note that when the Xmega chips first came out, Bascom had very limited support for them: the ability to set a few registers, and that was about it. These clock-setting routines are from my early testing of the Xmega. They all work as expected.

Bascom has undergone many revisions since then, and now one just uses a single command to define the clock source and the clock frequency. The language now has rather good support for the Xmega. These are legacy routines...

Option 5 actually polls the clock to see when it is ready to be used; the other routines just put in a short delay and assume the clock has stabilized (KIS approach).

Option 8 overclocks the Xmega to 50 MHz. Use at your own risk: this is well outside the chip's specs, many of the modules were not tested, and what was tested was only tested at room temperature.
That said, I needed it for a "quickie project" where 32 MHz just wasn't fast enough, and this got the job done...

Clockopt4:
   'Set up the Xmega clock.  Works.
   'Run on Internal 2 MHz Osc, at 32 MHz, via the PLL.
   'This MANUALLY turns on the Xmega PLL.
   'Xmega runs at 2MHz on power up.
   Osc_ctrl = 15       'All Osc ON, PLL Off
   Clk_psctrl = 0      'No PreScaler in use
   Osc_xoscctrl = 195    '12-16MHz, 256 Clks
   Osc_pllctrl = 16           'PLL: Int 2 MHz Osc x 16
   Waitms 1
   Osc_ctrl = 31           'PLL ON, All Osc Sources On
   Waitms 1
   Cpu_ccp = 216      'Config Change Protection
   Clk_ctrl = 4            'Use PLL as Clock Source
   Return

Clockopt5:
   'Set up the Xmega Clock. Works.
   'Run at 32 MHz from the Internal 32 MHz Osc, set by my code.
   'Xmega runs on Int 2MHz Osc on Startup.
   'This turns on the Int 32 MHz Osc, awaits it being ready, and switches to it.
   'Don't forget the Configuration Change Register Protection Trigger before
   'changing the uC's Clock Source.
   'First turn ON the 32 MHz Int Osc:
   'Then wait until the Int 32 MHz Osc is ready to be used.
   Osc_ctrl = 2          'Int 32 MHz Osc ON
   Rvbit = 0               'Clear flag
   While Rvbit = 0
      'Read the Int 32 MHz Osc Status
      Regdata = Osc_status    'Status of all Int Osc Sources
      Rvbit = Regdata.1          'Int 32 MHz Osc Status, 1 = Ready
   Wend
   Cpu_ccp = 216            'Config Change Protection
   Clk_ctrl = 1                 'Use Int 32 MHz Osc
   Return

Clockopt6:
   'Set up the Xmega clock.  Works.
   'Run on External Xtal, (16 MHz), at 32 MHz, via the PLL x2.
   'This MANUALLY turns on the Xmega PLL.
   'Xmega runs at 2MHz on power up.
   Osc_xoscctrl = 203      'Ext Osc: 12-16MHz, 16 K Clks
   Osc_ctrl = 9                     'PLL Off, Ext Xtal On, Int 2M OSC On
   Clk_psctrl = 0                 'No PreScaler in use
   Osc_pllctrl = 194           'PLL: Ext Xtal 16 MHz, Multi x2
   Waitms 1
   Osc_ctrl = 31                  'PLL ON, All Osc Sources On
   Waitms 1
   Cpu_ccp = 216              'Config Change Protection
   Clk_ctrl = 4                    'Use PLL as Clock Source
   Return

Clockopt8:
   'Try Overclocking for Logic Analyzer Sampling.
   'Xmega Max Spec is 32 MHz.
   'Incrementally Incr PLL using 2 MHz Int Osc.
   'PLL = x16  gives 32 MHz, (Baseline, in spec.)
   'Set up the Xmega clock.
   'Run on Internal 2 MHz Osc, at 50 MHz, via the PLL.
   'This MANUALLY turns on the Xmega PLL.
   'Xmega runs at 2MHz on power up.
   Osc_ctrl = 15                  'All Osc ON, PLL Off
   Clk_psctrl = 0                 'No PreScaler in use
   Osc_xoscctrl = 195       '12-16MHz, 256 Clks
   Osc_pllctrl = 25             'PLL: Int 2 MHz Osc x 25  !!!
   Waitms 2
   Osc_ctrl = 31                  'PLL ON, All Osc Sources On
   Waitms 2
   Cpu_ccp = 216               'Config Change Protection
   Clk_ctrl = 4                     'Use PLL as Clock Source
   Return
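
For comparison with the C code earlier in the thread, Clockopt5 boils down to the following register writes. This is a hedged sketch using the avr-libc names, not a verbatim translation; the magic numbers map as Osc_ctrl = 2 -> OSC_RC32MEN_bm, Cpu_ccp = 216 (0xD8) -> CCP_IOREG_gc, and Clk_ctrl = 1 -> CLK_SCLKSEL_RC32M_gc.

	#include <avr/io.h>

	static void clock_32mhz_int_osc(void)
	{
		OSC.CTRL |= OSC_RC32MEN_bm;                  // enable the internal 32 MHz osc
		while ((OSC.STATUS & OSC_RC32MRDY_bm) == 0)  // poll until it reports ready
			;
		CCP = CCP_IOREG_gc;                          // unlock protected I/O regs (4-cycle window)
		CLK.CTRL = (CLK.CTRL & ~CLK_SCLKSEL_gm) | CLK_SCLKSEL_RC32M_gc;
	}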

JC

#5

I did finally get it to work.

What was right in the code I originally posted:
- My USARTC0 initialization code. This code was also partially generated by CodeWizardAVR, but I had to modify the BAUDCTRLA and BAUDCTRLB registers for 9600 baud (it generated values for 600 baud).

What went wrong:
- I had used CodeWizardAVR to generate the code that initializes the 32 MHz clock and uses the 32 kHz oscillator for calibration. It didn't work.

To solve the problem I found an example in Atmel's AVR1003 application note that sets up the various system clocks. When I stripped the clock-dividing prescaler out of the AVR1003 code, the 32 MHz clock initialization worked. I then added the code from CodeWizardAVR for the 32 kHz calibration. That worked.

Here is the code that sets the system clock to 32 MHz and calibrates it against the 32 kHz oscillator:

    //
	// Internal 32 kHz RC oscillator initialization
	// Enable the internal 32 kHz RC oscillator
	CLKSYS_Enable( OSC_RC32KEN_bm );
	// Wait for the internal 32 kHz RC oscillator to stabilize
	while ((OSC.STATUS & OSC_RC32KRDY_bm)==0);

	/* Enable the internal 32 MHz ring oscillator, wait until it is
	 * stable, and then set it as the main clock source.
	 */
	CLKSYS_Enable( OSC_RC32MEN_bm );
//		CLKSYS_Prescalers_Config( CLK_PSADIV_1_gc, CLK_PSBCDIV_1_2_gc ); // Divide clock by two with the prescaler C 
	// Internal 32 MHz RC osc. calibration reference clock source: 32.768 kHz Internal Osc.
	OSC.DFLLCTRL&= ~(OSC_RC32MCREF_bm | OSC_RC2MCREF_bm);
	// Enable the autocalibration of the internal 32 MHz RC oscillator
	DFLLRC32M.CTRL|=DFLL_ENABLE_bm;

	do {} while ( CLKSYS_IsReady( OSC_RC32MRDY_bm ) == 0 );
	CLKSYS_Main_ClockSource_Select( CLK_SCLKSEL_RC32M_gc );

Sorry that the code is a mixture of AVR1003 and CodeWizardAVR code but it works and I'm not messing with it.

Why Atmel chose to use all those goofy macros rather than directly showing the code to set the registers, I'll never know. Maybe for portability? But it certainly makes readability suffer.
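
For what it's worth, my reading of the AVR1003 clksys_driver source is that the three calls used above reduce to something like the following (treat these bodies as an approximation of mine, not a copy of Atmel's driver):

	#include <avr/io.h>
	#include <stdint.h>

	static void my_clksys_enable(uint8_t osc_bm)        // ~ CLKSYS_Enable()
	{
		OSC.CTRL |= osc_bm;
	}

	static uint8_t my_clksys_is_ready(uint8_t osc_bm)   // ~ CLKSYS_IsReady()
	{
		return OSC.STATUS & osc_bm;
	}

	static void my_clksys_sclksel(CLK_SCLKSEL_t src)    // ~ CLKSYS_Main_ClockSource_Select()
	{
		uint8_t n = (CLK.CTRL & ~CLK_SCLKSEL_gm) | src;
		CCP = CCP_IOREG_gc;   // unlock the protected register for the next 4 cycles
		CLK.CTRL = n;
	}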

Again, the code in my original post for setting the baud rate to 9600 worked. Both examples: the one using a BSCALE of 0 and the one using a BSCALE of -4.

Hope that helps someone.