Dear Friends,
It's been a month since I started learning AVR. I originally wanted to build an LED cube, then realized there were basics I had to understand first, and finally settled on making a propeller clock, overestimating the knowledge I had.
Soon I got into the task of "trying" to understand timers and interrupts, and then moved on to external clock sources. Today I realized I have not really moved anywhere.
I have a very primitive, or rather a stupid, question. I revisited the LED blinking example to clarify some of my understanding.
To start with, below is the program I wrote:
#include <avr/io.h>
#include <util/delay.h>   /* F_CPU comes from the Makefile */

int main(void)
{
    /* Data direction: set PB1 of Port B to output (1).
       By default all the pins are in input mode (0). */
    //DDRB |= 1 << PINB0;
    //PORTB |= 1 << PINB0;
    DDRB = 0b00000010;
    PORTB = 0b00000000;

    while (1)
    {
        /* Toggle PB1 between 0 V and 5 V */
        PORTB ^= 1 << 1;
        _delay_ms(1000);
    }
}
As you can see, I am trying to blink an LED connected to PB1 of my ATmega32 once every second, and my Makefile had the entry "F_CPU = 8000000". When I ran the program, I noticed the LED was blinking every 8 seconds when it should have been blinking every 1 second. With a little common sense I figured it out by looking at the Makefile, where F_CPU was set to 8 MHz, and changed it to 1 MHz. After this the program worked as expected.
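The factor even matches my mistake, if I work it out: the delay was apparently calibrated for an 8 MHz clock while the chip actually runs at its 1 MHz internal oscillator (the factory default, I believe), and 8000000 / 1000000 = 8, which is exactly how much longer each 1000 ms delay took.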
Given that I made a wrong entry for F_CPU, I simply do not understand how the timing varies between the two clock settings. Logically speaking, 1000 milliseconds is always 1000 milliseconds, right? Does the implementation of the _delay_ms() function have anything to do with F_CPU? Please enlighten me.
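My only guess is that _delay_ms() cannot read a real clock and instead burns a number of CPU cycles computed from F_CPU. Below is a rough sketch of my mental model; my_delay_ms() and the cycles-per-iteration figure are made up for illustration, and this is not the actual avr-libc implementation:

#include <avr/io.h>

/* F_CPU would normally come from the Makefile via -DF_CPU=... */
#ifndef F_CPU
#define F_CPU 1000000UL
#endif

/* Hypothetical busy-wait delay: it converts milliseconds into a
   cycle count using F_CPU, then spins. If F_CPU says 8 MHz but the
   real clock is 1 MHz, every iteration takes 8x longer than assumed,
   so the whole delay stretches by the same factor of 8. */
static void my_delay_ms(unsigned int ms)
{
    while (ms--) {
        /* assume roughly 4 cycles per pass through this loop */
        for (unsigned long i = 0; i < (F_CPU / 1000UL) / 4UL; i++) {
            __asm__ __volatile__("nop"); /* keep the loop from being optimized away */
        }
    }
}

Is that roughly what the real _delay_ms() does internally with F_CPU?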
Thanks and Regards,
#0K Srinivasan.