RC oscillator and UART in mega48


For a new design I'm currently using the 3.686 MHz oscillator on the STK500, but on the board there's no space for a low-frequency crystal.

The only timing-critical part of my program is the RS-232 communication for debugging. How stable is the internal 8 MHz RC oscillator of the ATmega48 (divided down to 1 MHz)? Which baud-rate (UBRR) divider settings do you recommend with this clock?

The design is going to go into production, so I have to be confident that the clock will work with chips from different batches and so on.

Regards,
Børge


Børge,

I'm producing a production device based on an AVR with a UART, and was thinking I could get away with just using the internal RC rather than splashing out on a crystal per unit. But the more I read here about people's experiences with batch-to-batch timing discrepancies and clock drift over temperature and voltage, the more I think I'm going to need an external resonator or crystal of some sort even if it's just a 32768 watch crystal to keep re-calibrating OSCCAL so that the baud rate generator will always be accurate.
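
Roughly what I have in mind for the re-calibration is to clock Timer2 from the watch crystal and count CPU clocks against it, something like the sketch below. This is untested: the register names are for the mega48/88/168 family, and the 32-cycle gate time, the tolerance window and the one-step OSCCAL nudges are just my own guesses.

Code:
#include <avr/io.h>
#include <stdint.h>

#ifndef F_CPU
#define F_CPU 8000000UL
#endif

#define GATE_CYCLES 32U                                       /* crystal cycles per measurement */
#define TARGET      ((uint16_t)(F_CPU / 32768.0 * GATE_CYCLES)) /* expected CPU clocks per gate  */
#define TOLERANCE   40U                                       /* about 0.5% of TARGET           */

static uint16_t cpu_clocks_per_gate(void)
{
    uint8_t start = TCNT2;
    while (TCNT2 == start)            /* wait for a crystal edge   */
        ;
    start  = TCNT2;
    TCNT1  = 0;
    TCCR1B = (1 << CS10);             /* Timer1 counts CPU clocks  */
    while ((uint8_t)(TCNT2 - start) < GATE_CYCLES)
        ;
    TCCR1B = 0;                       /* stop Timer1               */
    return TCNT1;
}

void osccal_trim(void)
{
    ASSR   = (1 << AS2);              /* Timer2 clocked from the 32.768 kHz crystal */
    TCCR2A = 0;
    TCCR2B = (1 << CS20);             /* no prescaling                              */
    /* A real version should wait ~1 s here for the crystal to start up and
       poll the ASSR update-busy flags after writing the Timer2 registers. */

    for (uint8_t i = 0; i < 128; i++) {
        uint16_t clocks = cpu_clocks_per_gate();
        if      (clocks > TARGET + TOLERANCE) OSCCAL--;   /* RC running fast */
        else if (clocks + TOLERANCE < TARGET) OSCCAL++;   /* RC running slow */
        else break;                                       /* close enough    */
    }
}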

Cliff

(I'm already making a fixed change to OSCCAL so I can get an accurate 115200 baud on an ~8 MHz clock, but I'm using about £5,000 worth of calibration equipment for the single unit I'm working on: a top-of-the-range Agilent digital storage scope.)


Børge,
from my own experiences with the ATmega8, I would strongly recommend using some sort of resonator/crystal.
If board space is really a problem, there is an idea I have wanted to try for a long time but never found the time to.
If the communication over RS-232 is bidirectional, meaning you are also receiving data bytes (presumably at the same baud rate), why not use these to 'calibrate' or 'adjust' your transmit clock? Connect the RXD signal also to a capture input and measure the width of the incoming data. This should always be a multiple of one bit width. Adjust the clock calibration until you measure the 'correct' values.
Has anyone tried that before?
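
Roughly what I have in mind, as a completely untested sketch: the baud rate, the 2% threshold and the single-step OSCCAL adjustment are only placeholders, and RXD would have to be wired to ICP1 (PB0 on the mega48) as well.

Code:
#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

#ifndef F_CPU
#define F_CPU 8000000UL
#endif

#define BAUD      9600UL
#define BIT_TICKS (F_CPU / BAUD)           /* Timer1 ticks per UART bit, no prescaler */

ISR(TIMER1_CAPT_vect)
{
    static uint16_t last;
    uint16_t now   = ICR1;
    uint16_t delta = now - last;           /* ticks since the previous edge */
    last = now;

    TCCR1B ^= (1 << ICES1);                /* capture the opposite edge next time  */
    TIFR1   = (1 << ICF1);                 /* clear the flag set by changing edges */

    /* How many nominal bit times fit between the two edges?  Long idle gaps
       alias through the 16-bit timer, so a real version should also watch
       the overflow interrupt and discard those measurements. */
    uint8_t bits = (uint8_t)((delta + BIT_TICKS / 2) / BIT_TICKS);
    if (bits == 0 || bits > 9)
        return;                            /* glitch or inter-byte gap: ignore */

    int16_t expect = (int16_t)(bits * BIT_TICKS);
    int16_t error  = (int16_t)delta - expect;
    int16_t limit  = expect / 50;          /* allow about 2% before touching OSCCAL */

    if (error >  limit) OSCCAL--;          /* our clock is too fast */
    if (error < -limit) OSCCAL++;          /* our clock is too slow */
}

void capture_calibration_init(void)
{
    TCCR1B = (1 << ICNC1) | (1 << ICES1) | (1 << CS10); /* noise canceller, rising edge, F_CPU */
    TIMSK1 = (1 << ICIE1);                              /* enable the capture interrupt        */
    sei();
}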

Jörg.


Clawson,

You wrote:

Quote:
...I'm thinking that I'm going to need an external resonator or crystal of some sort even if it's just a 32768 watch crystal to keep re-calibrating OSCCAL so that the baud rate generator will always be accurate.

So why not use a crystal that gives an even clock division ratio for the baud rate you will be using (e.g. a 3.6864 MHz crystal divides exactly into all the standard baud rates)? Just be done with it and save the code space that would otherwise be spent on the continuous OSCCAL recalibration...

JBecker,

Quote:
Connect the RXD signal also to a capture input and measure the width of the incoming data. This should always be a multiple of one bit width. Adjust the clock calibration until you measure the 'correct' values.

I can see where this would work, provided you used a table of all of the times representing each and every possible BAUD rate that might be used by your system.

I think you would have to measure the bit time, then calculate the difference in the bit time, in percent, relative to the value saved in the table for the selected baud rate, and use that error to recalibrate OSCCAL.

Of course, as with all things, this is probably much easier said than done...

The problem with this scheme is that if you are not communicating with the outside world, you would have to turn the algorithm off, because there would always be a large error.

The other issue is that the OSCCAL recalibration would only be as accurate as the clock of the transmitting device. If there were other devices that required precise timing, such as SPI communication or something else, you might have timing issues with those devices. Also, any software delays would be subject to errors proportional to the OSCCAL recalibration.

It's always a trade-off! Rarely is there ever a "win-win" situation.

You can avoid reality, for a while.  But you can't avoid the consequences of reality! - C.W. Livingston


Quote:

... provided you used a table of all of the times representing each and every possible BAUD rate that might be used by your system.

You mostly use only one fixed bit rate, don't you? We are not talking about automatic bit-rate detection here.
So you know the expected capture values for 1 to 9 bit widths. You do not have to adjust the clock calibration on EVERY byte received. It would even be possible (if the protocol allows it) to send special 'sync' bytes every now and then, i.e. 0x55 or 0xAA. When the receiver reads a 0x55, it uses the capture values for correction; no big problem.
Starting communication can be done by sending these sync bytes until an acknowledge is received.
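
The start-up part could look something like this (a minimal sketch; the 0x06 acknowledge byte and the polled transmit routine are arbitrary assumptions):

Code:
#include <avr/io.h>
#include <stdint.h>

#define SYNC_BYTE 0x55                     /* alternating bit pattern, good for capture */
#define ACK_BYTE  0x06                     /* ASCII ACK, arbitrary choice               */

static void uart_send(uint8_t b)
{
    while (!(UCSR0A & (1 << UDRE0)))       /* wait for an empty transmit buffer */
        ;
    UDR0 = b;
}

void sync_until_ack(void)
{
    for (;;) {
        uart_send(SYNC_BYTE);
        if ((UCSR0A & (1 << RXC0)) && UDR0 == ACK_BYTE)
            return;                        /* partner answered: we are in sync */
    }
}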

Quote:

The problem with this scheme is that if you are not communicating with the outside world, you would have to turn the algorithm off, because there would always be a large error.

No communication, no received bytes, no capture, no error, nothing to turn off!

Quote:

Of course, as with all things, this is probably much easier said than done...

That's true!!!
'The controller just has to find out whether the bus is working correctly' (an 'original' from our hardware guy) can result in a few hundred lines of code!

Jörg.


Hi,

I have just the same problem, and because of space requirements a crystal is not an option for me. So I started to do the math.

Assuming 1 start bit, 8 data bits and 1 stop bit, a single bit uses 10% of the total transmission time. A simple receiver samples the signal in the center of each bit, which gives +/-5% of the total time from the center of a bit to its edge. During the transmission of one byte the receiver synchronizes only at the beginning of the start bit, and due to the offset in clock frequency the sampling point slips away from the center of the bit. This phase offset accumulates over the transmission and must not exceed 5% by the last bit.
This means that in the ideal case the clock rates of transmitter and receiver can differ by at most about 5% for the last bit to be received correctly. This tolerance margin is shared between the transmitter and receiver frequency offsets, so I would say that, worst case, each side needs to be accurate to better than 2%.
2% total accuracy over temperature, supply voltage and other parameters: this cannot be achieved by simply calibrating the oscillator once in production.
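
The same budget as a quick back-of-envelope calculation on the PC (assuming the stop bit is the last bit that has to be sampled correctly) lands at roughly the same numbers:

Code:
#include <stdio.h>

int main(void)
{
    double last_sample = 9.5;               /* bit times from the start edge   */
    double total       = 0.5 / last_sample; /* ~5.3% total clock mismatch      */
    double per_side    = total / 2.0;       /* ~2.6% if TX and RX share it     */

    printf("total budget %.1f%%, per side %.1f%%\n",
           100.0 * total, 100.0 * per_side);
    return 0;
}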

In my opinion only some kind of "calibration on the fly" will work reliably.
Still unknown to me are the short-term stability of the oscillator and its sensitivity to supply-voltage changes (low-frequency noise/spikes on the supply pin). That would give an idea of how often the calibration has to be repeated and how stable the supply line must be.

Reimund


JBecker, there is an Atmel application note concerning this subject:

http://www.atmel.com/dyn/resourc...

If you want to use this method you have to ensure a 50% duty cycle, and you have to look closely at the frequency vs. OSCCAL diagram in the data sheet.

Florian


FWIW
I have been producing a product for quite a while now that uses the 4 MHz internal RC oscillator on a mega8, feeding an RS-232 interface chip running 9600 8N1, usually connected to a PC of some sort. It all runs off a 7805 voltage regulator, so regulation is not great. It is an indoor application, so no temperature extremes.
The strings of characters are not very long, maybe 10 characters at most, then nothing for a while. We test everything before shipping with no significant fallout, and no complaints from the field so far.


If it's just for debug, you could simply tack the crystal or resonator onto the AVR pins, or run the serial link at 2400 or some slower speed where an error of several percent is tolerable.

Imagecraft compiler user


Bob,

As Reimund explained above, it doesn't get any better at low baud rates: the required clock accuracy is the same.

kevin


I think the internal RC oscillator is probably not worth all this work.
I mean, you can use a ceramic resonator with built-in caps; it's a single part that you solder directly onto your circuit and you're ready to go.
If the board is going into production, it's even more advisable to use crystals or ceramic resonators. Perhaps in an indoor environment you could get away with the internal RC just to try things out, or to avoid wasting a crystal on a simple test board, but that's certainly not the case here.


Quote:
it doesn't get any better at low baud rates

I don't get it. The avrcalc program shows that 2400 baud with a 1 MHz clock gives a 0.16% error, while 9600 at 1 MHz gives a 6.99% error, so obviously 2400 will work and 9600 won't, assuming the 1 MHz clock is within a couple of percent. Agree?

Imagecraft compiler user


Bob,

You're right, I was neglecting the error due to quantization of the division ratio. That does improve at low rates, although it can also be minimized by not dividing down the clock: it is only 0.16% at 9600 bps if you leave the clock at 8 MHz.
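
For anyone who wants to check the figures, here is the quantization error worked out on the PC; this only covers the divider rounding in normal-speed mode (UBRR = f/(16*baud) - 1), not the RC drift discussed below:

Code:
#include <stdio.h>

static void show(double f_cpu, double baud)
{
    unsigned ubrr   = (unsigned)(f_cpu / (16.0 * baud) - 1.0 + 0.5); /* rounded to nearest */
    double   actual = f_cpu / (16.0 * (ubrr + 1));
    printf("clock %.0f Hz, %5.0f baud -> UBRR=%2u, actual %7.1f, error %+.2f%%\n",
           f_cpu, baud, ubrr, actual, 100.0 * (actual - baud) / baud);
}

int main(void)
{
    show(1e6, 2400.0);   /* about +0.16%           */
    show(1e6, 9600.0);   /* about -7%: too far off */
    show(8e6, 9600.0);   /* about +0.16%           */
    return 0;
}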

The main, very significant, problem with using the internal RC clock is that there is an initial calibration error, and it also drifts with temperature, voltage and aging; the error from these causes is independent of the baud rate. From the data sheet it looks like you can keep the clock within 2% with calibration, which is adequate for the UART, but with the factory calibration value it is only guaranteed to be accurate to +/-10%. That is not good enough, although judging from other people's experience it may rarely be that bad in practice.

kevin


Quote:
The only timing-critical part of my program is the RS-232 communication for debugging

If the RS-232 is just for debugging (e.g. sending printfs over the serial port to track variables) I have found the internal oscillator perfectly suitable. Furthermore, if you need to send some simple packets to your microcontroller you can always introduce a simple scheme that adds an XOR checksum every 6 bytes or so, just to help detect data errors.
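
By the checksum I only mean something as simple as the sketch below; the 6-byte block size and the uart_send helper are just placeholders:

Code:
#include <stdint.h>

#define BLOCK_LEN 6

/* Assumed to exist elsewhere: the usual polled UART transmit routine. */
extern void uart_send(uint8_t b);

/* Transmit one block followed by the XOR of its bytes. */
void send_block(const uint8_t *data)
{
    uint8_t csum = 0;
    for (uint8_t i = 0; i < BLOCK_LEN; i++) {
        uart_send(data[i]);
        csum ^= data[i];
    }
    uart_send(csum);
}

/* Receiver side: returns 1 if the trailing checksum byte matches. */
uint8_t block_ok(const uint8_t *block_with_csum)
{
    uint8_t csum = 0;
    for (uint8_t i = 0; i < BLOCK_LEN; i++)
        csum ^= block_with_csum[i];
    return csum == block_with_csum[BLOCK_LEN];
}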

I'd be wary of sending long packets with the internal oscillator, but for simple transmissions it does the trick.

oddbudman