clock stability

#1

I have built a project with an AVR and an FT232; my program is below.

Everything works perfectly except the clock. When I say "clock" I mean that in the for(;;) loop the LED blinks every 1 second, but after a few minutes (about 20-30 minutes is when I check it) the LED is off by a few ms.

/*
 * _8535_uart.c
 *
 * Created: 8/6/2013 5:57:00 PM
 *  Author: Giorgos
 */
#ifndef F_CPU
#define F_CPU 14745600UL
#endif

#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/delay.h>

// define baud rate
#define USART_BAUDRATE 19200

#define BAUD_PRESCALE (((F_CPU / (USART_BAUDRATE * 16UL))) - 1)
char DATA[10];
int x, y = 10;
int main(void)
{
	
	UCSRB |= (1 << RXEN) | (1 << TXEN);                    // enable the receiver and transmitter
	UCSRC |= (1 << URSEL) | (1 << UCSZ0) | (1 << UCSZ1);   // 8-bit character size - URSEL bit set to select the UCSRC register
	UBRRH = (BAUD_PRESCALE >> 8);                          // load upper 8 bits of the baud rate value into UBRRH
	UBRRL = BAUD_PRESCALE;                                 // load lower 8 bits of the baud rate value into UBRRL
	UCSRB |= (1 << RXCIE);                                 // enable the receive-complete interrupt
	sei();                                                 // enable the Global Interrupt Enable flag so that interrupts can be processed

	DDRA = 255;                                            // LED port as output

	_delay_ms(10);
	for (;;)
	{
		for (x = 0; x < y; x++)                        // y = 10 -> 10 x 100 ms = 1 s
		{
			_delay_ms(100);
		}
		PORTA ^= 255;                                  // toggle the LED
	}
}

The image shows exactly what I mean.

Attachment(s): 

#2

-- Every time you send a new timebase, you are likely to get that offset as you don't wait for a complete cycle before you change the pulse duration.
-- That may not matter, because "y" is not volatile.
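
To make that concrete, here is a minimal sketch of what I mean (the RX vector name USART_RXC_vect and the idea of latching y only once per full cycle are my assumptions, not your posted code - check <avr/io.h> for your part's vector name):

#ifndef F_CPU
#define F_CPU 14745600UL
#endif

#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/delay.h>

volatile uint8_t new_y = 10;            // shared with the ISR, so it must be volatile

ISR(USART_RXC_vect)                     // RX vector name varies by device - check <avr/io.h>
{
	new_y = UDR;                    // new timebase sent from the PC
}

int main(void)
{
	// baud-rate and frame setup as in your original code goes here
	UCSRB |= (1 << RXEN) | (1 << RXCIE);
	sei();
	DDRA = 0xFF;

	for (;;)
	{
		uint8_t y = new_y;      // pick up the new value only at the start of a full cycle
		for (uint8_t x = 0; x < y; x++)
			_delay_ms(100);
		PORTA ^= 0xFF;          // toggle the LED after a complete cycle
	}
}

Declaring the shared variable volatile makes main re-read it, and latching it once per cycle avoids changing the pulse duration mid-cycle.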

You can put lipstick on a pig, but it is still a pig.

I've never met a pig I didn't like, as long as you have some salt and pepper.

#3

This happens without sending anything, only the blinking LED!

I reset the MCU and the LED blinks every 1 second (1 s on, 1 s off). I don't send anything from the PC, and after a few minutes I see it happen (see the MCU LED image).

#4

I don't know whether the _delay_ms(100) will be nearly cycle-accurate or not. That would be a question for the people on the GCC forum, right?

Each loop, there will be some loop overhead. 1us is 10ppm of 100ms, right? Should be hardly noticeable.

The F_CPU value implies that the AVR is running off a crystal? That should be fairly close.

Why don't you do another test program using the AVR's timers to toggle a pin at 1Hz? Then you've taken the loop overhead and delay_ms calculations out of the equation.

PD5 needs to be made an output:

// Timer/Counter 1 initialization
// Clock source: System Clock
// Clock value: 57.600 kHz
// Mode: CTC top=OCR1A
// OC1A output: Toggle on compare match
// OC1B output: Disconnected
// Noise Canceler: Off
// Input Capture on Falling Edge
// Timer Period: 0.5 s
// Output Pulse(s):
// OC1A Period: 1 s Width: 0.5 s
// Timer1 Overflow Interrupt: Off
// Input Capture Interrupt: Off
// Compare A Match Interrupt: Off
// Compare B Match Interrupt: Off
TCCR1A=(0<<COM1A1) | (1<<COM1A0) | (0<<COM1B1) | (0<<COM1B0) | (0<<WGM11) | (0<<WGM10);
TCCR1B=(0<<ICNC1) | (0<<ICES1) | (0<<WGM13) | (1<<WGM12) | (1<<CS12) | (0<<CS11) | (0<<CS10);
TCNT1H=0x8F;
TCNT1L=0x80;
ICR1H=0x00;
ICR1L=0x00;
OCR1AH=0x70;
OCR1AL=0x7F;
OCR1BH=0x00;
OCR1BL=0x00;

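For comparison, the same 1 Hz OC1A toggle in plain avr-gcc register writes might look roughly like this (just a sketch, assuming your 14.7456 MHz crystal: 14745600 / 256 = 57600 ticks per second, so a TOP of 28799 toggles the pin every 0.5 s):

#include <avr/io.h>

int main(void)
{
	DDRD  |= (1 << PD5);                     // OC1A (PD5) as output

	OCR1A  = 28799;                          // 57600 / 2 - 1 -> compare match every 0.5 s
	TCCR1A = (1 << COM1A0);                  // toggle OC1A on compare match
	TCCR1B = (1 << WGM12) | (1 << CS12);     // CTC, TOP = OCR1A, prescaler /256

	for (;;)
		;                                // the 1 Hz square wave runs entirely in hardware
}

Once it's set up, the CPU never touches the pin, so nothing the main loop or the ISRs do can shift the edges; only the crystal matters.
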
Beyond that, you need some kind of 'scope with an envelope feature.

#5

BTW, any interrupt servicing will also offset your output with your method as the delay loops are paused.
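
If you'd rather keep toggling the LED in software, one way around that accumulation (again just a sketch, same crystal assumed) is to pace the loop off Timer1's compare flag instead of _delay_ms(); the timer keeps counting while an ISR runs, so interrupt time no longer adds up:

#include <avr/io.h>

int main(void)
{
	DDRA = 0xFF;                             // LED port as output

	OCR1A  = 57599;                          // 57600 ticks = 1 s at 14745600 / 256
	TCCR1A = 0;
	TCCR1B = (1 << WGM12) | (1 << CS12);     // CTC, TOP = OCR1A, prescaler /256

	for (;;)
	{
		while (!(TIFR & (1 << OCF1A)))   // wait for the 1 s compare match
			;
		TIFR = (1 << OCF1A);             // clear the flag (writing 1 clears it)
		PORTA ^= 0xFF;                   // toggle the LED
	}
}

Even if an interrupt delays the poll by a few microseconds, the next compare match still lands exactly 57600 ticks after the previous one, so the error doesn't build up.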

#6

A bit of math:
A "few ms" over 30 minutes, call it 10 ms / 2000 seconds, is 1 part in 200,000, or 0.5 us for every 100 ms delay. If those numbers aren't what you are seeing, you should be more specific. In any case, how could you possibly expect better from your technique (which will only get worse once your CPU starts actually handling data)? If you want dead-on timing (as good as your crystal can give you), use a hardware timer.

Or hide the stopwatch and stop timing your LED blinking. :)