How to set F_CPU?


In Studio 4, if I recall correctly (and I might not), there were separate settings for the CPU clock in simulation and the F_CPU needed for _delay_ms(), et al.

So far, I have been unable to find a place to set it in Studio 6 at compile time. Do you just need to do a #define at the start of the source?

Thanks
Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


Project -> Toolchain -> AVR/GNU C Compiler -> Symbols

then add F_CPU=8000000 (or whatever)

David.


Thanks!

By the way, for anyone else, I found this at

Project > (ProjectName) Properties > Toolchain > AVR/GNU C Compiler > Symbols

Jim



Hi guys,
This F_CPU is driving me a bit crazy. Reading through the thread, I have tried the following defines under Ubuntu:

#define F_CPU ((unsigned long)1200000)
or
#define F_CPU (1200000UL)
or
#define F_CPU (1200000ul)
...
...
and wanted delay as
_delay_ms(1000);
...
but using any of the forms I get the error message:
../4.5.3/../../../avr/include/util/delay.h: In function ‘_delay_ms’:
../4.5.3/../../../avr/include/util/delay.h:140:17: error: expected expression before ‘)’ token.

The same source compiles with (an old) AVR Studio under Windows. I have hunted for invisible characters at the ends of lines and changed spaces and tabs back and forth in the #define - no change.
Any suggestions to break this black magic?


This worked for me:

#define F_CPU 12000000UL
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
	DDRB = 0xFF;
	while(1) {
		PORTB ^= 0xFF;
		_delay_ms(200);
	}
}

It generated:

  9c:	8f ef       	ldi	r24, 0xFF	; 255
  9e:	92 e5       	ldi	r25, 0x52	; 82
  a0:	a7 e0       	ldi	r26, 0x07	; 7
  a2:	81 50       	subi	r24, 0x01	; 1
  a4:	90 40       	sbci	r25, 0x00	; 0
  a6:	a0 40       	sbci	r26, 0x00	; 0
  a8:	e1 f7       	brne	.-8      	; 0xa2 
  aa:	00 c0       	rjmp	.+0      	; 0xac 
  ac:	00 00       	nop

SUBI and SBCI are 1 cycle each, and the BRNE takes 2 cycles while the loop repeats, so each pass costs 5 cycles. 5 cycles repeated 0x752FF (479,999) times is 2,399,995 cycles, which at 12MHz is almost exactly the requested 0.2s.

Anyway, bottom line is that it's pretty close to what might be expected and compiles without error.


Thanks for the note. I later realized the delay is over the acceptable range, but that is unrelated to the reported error.
Well, the question is not fully solved, but I have got nearer. Perhaps somebody will solve the enigma.
I generally use a hardware.h file to define hardware-specific things; it makes life easier if any change is needed.
It had the following lines in it:
#ifndef F_CPU
#define F_CPU 1200000UL
#endif

Then this hardware.h was included in my source:
#include "hardware.h"
With this structure, none of the forms mentioned in my other post were accepted, even though hardware.h was included in my source.
But if I write it directly into my source as
#define F_CPU 1200000UL
then there is no error message.
I DO NOT define F_CPU anywhere else, but some of the original AVR include files have a conditional F_CPU definition somewhere. I suppose that causes the conflict.

Let me modify my question:
If I insist on declaring F_CPU in my hardware.h instead of my source.c, how can I do it?


Is hardware.h included ABOVE or BELOW the include of <util/delay.h> in the files where _delay_ms() is used? If it's the thing providing F_CPU, it obviously needs to be above.

In fact, what most people do with F_CPU, as it's so ubiquitous, is not define it in any file in the project (since there's a possibility not everything that needs to see it will) but instead pass it as -DF_CPU=1200000UL on the compiler invocation, to ensure that (a) all files compiled see it and (b) it's defined before any .h #include's that may require it.

If you really want to declare it in hardware.h, then just ensure that it is the very first file #include'd in any .c file where F_CPU may be required. If hardware.h itself depends on other includes (for uint8_t, for example), then make sure it #include's those near the top.
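A minimal sketch of that ordering (file and macro names taken from the posts above; AVR target, shown for layout only, not a tested build):

```c
/* main.c -- hardware.h must come before <util/delay.h> so that
   delay.h already sees F_CPU when it is preprocessed. */
#include "hardware.h"    /* provides: #define F_CPU 1200000UL */
#include <avr/io.h>
#include <util/delay.h>

int main(void) {
    DDRB = 0xFF;             /* all of port B as outputs */
    for (;;) {
        PORTB ^= 0xFF;       /* toggle the port */
        _delay_ms(100);      /* uses F_CPU from hardware.h */
    }
}
```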


Thanks for the explanation and suggestion!


Quote:
Project > (ProjectName) Properties > Toolchain > AVR/GNU C Compiler > Symbols
Defined or undefined? And why does it have to be so blasted hard to find? What was wrong with the easy way AS4 did it?

Thanks to David and Jim for finding it; it seems that "Defined symbols" works. Just as well I remembered this thread.

John Samperi

Ampertronics Pty. Ltd.

www.ampertronics.com.au

* Electronic Design * Custom Products * Contract Assembly


Quote:

Defined or undefined? And why does it have to be so blasted hard to find? What was wrong with the easy way AS4 did it?

Defined, obviously, unless you want to forcefully keep it undefined. It's not a dedicated box any more since that implied the user was always going to use a specific compiler and C library.

- Dean

Make Atmel Studio better with my free extensions. Open source and feedback welcome!


Quote:

It's not a dedicated box any more since that implied the user was always going to use a specific compiler and C library.

Oh that's rich! Surely the fact that you now bundle a C compiler with the IDE suggests it's even MORE likely a specific compiler will be used? In fact most of the options are for GCC or are you saying the same UI has to handle all of avr-gcc, avr32-gcc and arm-gcc? If so isn't that a bit of an oversight - shouldn't the IDE adapt to which of those 3 the project is using? I did wonder why options such as "Link statically" were offered when using avr-gcc!


Not that this is a major matter, but...

Quote:

It's not a dedicated box any more since that implied the user was always going to use a specific compiler and C library.

That barrel does not hold water - or are you going to remove all other check-boxes for specific avr-gcc options (e.g. -Werror)?



Maybe for F_CPU one should get into the habit of defining it into some common header file.

I usually have a project_definitions.h file, it's just a matter of getting used to it.



Well, this thread has been dead for a while, so this is awkward.

 

But can anyone please explain why I have to define F_CPU at the beginning of my program in the first place to use delay.h? Since Atmel Studio 6.0 asks me at the start of my project what kind of controller I'm using (an ATmega128, F_CPU = 16 MHz), doesn't it already know that my F_CPU is 16 MHz? Isn't the define unnecessary?

 

Please clarify what I'm missing! :)

Sometimes you gotta run, before you can walk.


An ATmega128 MAY run at 16 MHz, but it can also run at 8 MHz or 32 kHz or whatever you choose. It all depends on what clock or oscillator you use in your design. There is no way for the CPU or the compiler to figure that out on its own.


Awesome. Thanks a ton! :D 



Thanks je_ruud!   I have been looking for a list of available overrides so we can compile in settings for specific chips.   Can you tell us if there is a header file with the most used defines?

I smell burning bakelite!