How to implement delay in variables while using util/delay?


I am trying to make a PWM-based LED fade with an ATmega8.

But the problem is,

/*
 * GccApplication151.c
 *
 * Created: 02-06-2015 18:29:53
 *  Author: Bagho
 */

#include <avr/io.h>
#include <util/delay.h>
#define F_CPU 2000000UL

int main(void)
{
    int i;
    DDRB = 0xFF;           // all pins of PORTB declared as output
    PORTB = 0x00;

    while (1)
    {
        for (i = 1; i < 1000; i++)
        {
            PORTB = 0xFF;      // high state
            _delay_us(i);      // on delay
            PORTB = 0x00;      // low state
            _delay_us(100);    // off delay
        }
    }
}

This code fails to build with the error "Error 4: __builtin_avr_delay_cycles expects a compile time integer constant".

 

The util/delay.h is probably not configured to accept variables as a delay value. How can I edit it (or my code) so that a variable can be used to form delays?

 


The util/delay.h is probably not configured to accept variables as a delay value.

You mean EXACTLY like the user manual tells you? ...

 

http://www.nongnu.org/avr-libc/u...

 

Note

In order for these functions to work as intended, compiler optimizations must be enabled, and the delay time must be an expression that is a known constant at compile-time. If these requirements are not met, the resulting delay will be much longer (and basically unpredictable), and applications that otherwise do not use floating-point calculations will experience severe code bloat by the floating-point library routines linked into the application.

The idea, if you want run-time variable delays, is to implement your own function, something like:

void delay_us(int n) {
    while(n--) {
        _delay_us(1);
    }
}

but be warned that in the case of µs delays (not so important for ms) the while(n--) overhead itself might cost a few µs per pass, especially at low (like 1 MHz) CPU speeds.


Run one of your timers to generate a 10ms interrupt. In that ISR decrement a global variable until it reaches zero and then leave it at zero. In main() set the global variable to the delay amount you want in 10ms steps and then wait until it is zero. When it is zero you continue.

#1 Hardware Problem? https://www.avrfreaks.net/forum/...

#2 Hardware Problem? Read AVR042.

#3 All grounds are not created equal

#4 Have you proved your chip is running at xxMHz?

#5 "If you think you need floating point to solve the problem then you don't understand the problem. If you really do need floating point then you have a problem you do not understand."


PS: the user manual just happens to have this too:

 

http://www.nongnu.org/avr-libc/u...

 

(shame a recent build of the manual has upset the code indentation, though)


I wrote a tutorial on multi-tasking. See the Tutorials section.