Using hardware and software PWM at the same time

#1

This program is a cross between two servo driver programs from Patrick Hood-Daniel's tutorials:
https://www.youtube.com/watch?v=...
https://www.youtube.com/watch?v=...
The first one uses hardware PWM to create the signal for the servo.
The second uses a similar timer, but it manipulates the output pin by software.
Both codes worked fine separately.

My future project (an RC plane) needs two "precise" hardware PWM channels for the elevons, plus many software PWM channels for other devices (motor ESCs, landing gear and other servos). Elevon servos react visibly to timing glitches, but a retractable landing gear or a heavy motor doesn't need that precision, so I thought software PWM was fine for those.

Right now I'm just testing several parts of this project, so I have one servo to test the concept (the final build will have different ones and many more), a uC, and a couple of switches and LEDs, to see whether I can learn what an Atmel can do and how.

I've tried to put both example codes into the same program. On the ATtiny2313 I'm using PB0 as a generic pin, while PB3 happens to be the first 16-bit PWM output (OC1A). (For now I'm just learning Atmel uCs; the final project will definitely use a better chip, as the 2313 lacks the ADC I need to check the battery level.)
The hardware and software PWM share almost all of their code: the timer is initialised the same way for the software PWM as well. The only difference is OCR1A.

The idea was to set the 16-bit timer to count from 0 to 19999 (with a 1MHz clock and no prescaling this gives a 20ms frame). When 0 is reached, the timer interrupt sets the software PWM pin to 1 (and the HW PWM output goes high at the same point). When TCNT1 reaches OCR1A, the HW PWM channel automatically goes to 0. For the software side I watch TCNT1 in the 1-2ms time slot and compare it to my sw_servo variable (which holds the pulse width in microseconds), and when the timer passes that value, I set PB0 to 0.

As long as I don't set OCR1A, PB0 works fine (sends the intended pulse). But when I do, PB3 has the proper output while PB0 "goes crazy". I've tried the same with other pins (pins from PORTD), but they didn't work either after setting OCR1A.
When the servo was replaced with two LEDs (one between ground and PB0, the other between ground and PB3), the LED on PB0 got much dimmer once OCR1A was set, while PB3 glowed as brightly as PB0 had before setting OCR1A.
By "going crazy" I mean the servo went off the scale. Servos need an approximately 1-2ms pulse every 20ms, but it got much less than 1ms.
The servo position on PB3 keeps changing, showing that dir changes, so my timer interrupt is still working.

Can you please explain this? I'm new to Atmel; I got my programmer just two weeks ago.
Is there anything I can't do when HW PWM is working? (like using the same timer or using interrupts?)
I'll be using UART (with interrupts) and ADC (again, with interrupts) on my final chip. Can PWM interfere with them as well? (I know I'll need to disable interrupts for a short time when I'm checking my servos, but apart from that.)

Flags/fuses: default. I haven't played with them yet; the 1MHz internal oscillator is used.

Some unnecessary additional info:
compile command/toolchain ($@ is the filename):
avr-gcc -std=c99 -Wall -Os -mmcu=attiny2313 -o test.out "$@"
avr-objcopy -O ihex test.out test.hex
avrdude -c stk500 -p t2313 -P /dev/ttyACM0 -U flash:w:test.hex:i && echo !!OK!! || echo !!!ERROR!!!

I'm using an "unofficial" STK500-compatible programmer (it has a switch to turn 5V on/off, a USB port and a 10-pin SPI/ISP header, nothing fancy).
nyos@hex:~/atmel/tests$ avr-gcc --version
avr-gcc (GCC) 4.7.0

(BTW this is complete but shortened code; the original had more channels, a button, more sophisticated positioning and some interrupt code.)

Thanks for help!

#define F_CPU 1000000UL // unused

#include <avr/io.h>
#include <avr/interrupt.h>

//#define MAKE_THIS_BUGGY // define this to initialise OCR1A and make PB0 go crazy

uint8_t cnt=0; // counting frames, we change the servo position (dir) after a few frames
uint8_t dir=0; // direction/servo position to see if the servo works or not (can be 0 or 1)

int main(void)
{
  DDRB|=1<<PB3|1<<PB0; // PB3 is HW PWM, PB0 is SW PWM, both outputs
  TCCR1A|=1<<WGM11;
  TCCR1B|=1<<WGM12|1<<WGM13|1<<CS10; // PWM, use clock
  TCCR1A|=1<<COM1A1|1<<COM1B1; // non-inverted
  TIMSK|=1<<OCIE1A; // enable timer interrupt
  ICR1=19999; // 20ms frame
  
  sei(); // enable global interrupts
  int lastdir=0; // we change sw_servo and OCR1A only when we need to
  
#ifdef MAKE_THIS_BUGGY
  OCR1A=1400;
#endif
  int sw_servo=1400;
  
  while(1)
  {
    if(TCNT1>=800&&TCNT1<=2200) // these bounds are the same as in the tutorial
    {
      while(TCNT1<=2200)
      {
        if((TCNT1>=sw_servo)&&bit_is_set(PORTB,PB0)) PORTB&=~(1<<PB0);
        // more servos will be added here
      }
    }
    else
    {
      if(dir!=lastdir)
      {
#ifdef MAKE_THIS_BUGGY
        OCR1A=1400+dir*200;
#endif
        sw_servo=1400+dir*200;
        lastdir=dir;
      }
    }
  }
}

ISR(TIMER1_COMPA_vect)
{
  PORTB|=1<<PB0;
  
  if(++cnt>=200) // change position every few seconds
  {
    cnt=0;
    dir^=1;
  }
}


#2

You need to use the overflow interrupt (that one fires at the 20ms frame boundary). When you don't set OCR1A it defaults to 0, so it just happens that TIMER1_COMPA_vect also gives you an interrupt every 20ms.
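
A minimal sketch of the swap against the timer setup in your posted code (only these two places change):

TIMSK|=1<<TOIE1; // instead of 1<<OCIE1A: overflow fires once per 20ms frame (at TOP=ICR1 in this mode)

ISR(TIMER1_OVF_vect) // instead of TIMER1_COMPA_vect
{
  PORTB|=1<<PB0; // raise the software PWM pin at the start of every frame
  // rest of the ISR body unchanged
}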
/Lars

#3

Thanks!

I've replaced OCIE1A with TOIE1 and TIMER1_COMPA_vect with TIMER1_OVF_vect and it works!
Now I only need to understand what the difference was.
Did TIMER1_COMPA_vect make my 20.000ms interval 65.536ms long instead? That would explain the dimmer light.

#4

Glad you got it working.

The difference is that TIMER1_COMPA_vect triggers at the pulse generation point, not at the full period. With OCR1A = 1400 your ISR raised PB0 at TCNT1 = 1400, and since sw_servo was also 1400 the main loop cleared it again almost immediately, so the pulse collapsed to nearly nothing. Hence the much shorter times and the dim LED.

But there is really no need for that style of software PWM. There are lots of examples on these forums of generating R/C servo PWM using a timer in compare mode. The trick is to generate the pulses sequentially, not all at once. I've had eight servos running at a 50Hz update rate off a single 8-bit timer with no jitter whatsoever (or no more than the servo might naturally have). I've also (just for the fun of it) generated five servo signals at 100Hz. Total processor overhead for both versions? Less than 10%, even with sweeps generated for all servos.
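
Roughly, the sequential idea looks like this (an illustrative sketch only, not the exact code from those threads: NUM_SERVOS, pulse[] and servo_pin[] are made-up names, it assumes your 1MHz clock, and it uses CTC mode because OCR1A writes take effect immediately there, unlike in fast PWM):

#include <avr/io.h>
#include <avr/interrupt.h>

#define NUM_SERVOS  4
#define FRAME_TICKS 20000 // 20ms at 1MHz, so 1 tick = 1us

volatile uint16_t pulse[NUM_SERVOS]={1500,1500,1500,1500}; // widths in us
const uint8_t servo_pin[NUM_SERVOS]={PB0,PB1,PB2,PB4};

ISR(TIMER1_COMPA_vect) // fires at the end of every slot
{
  static uint8_t ch=NUM_SERVOS; // begin in the gap slot
  static uint16_t used=0;       // ticks consumed so far this frame

  if(ch<NUM_SERVOS) PORTB&=~(1<<servo_pin[ch]); // end the pulse just produced
  if(++ch>NUM_SERVOS) ch=0;                     // wrap: a new frame begins

  if(ch<NUM_SERVOS) // start the next pulse back to back...
  {
    PORTB|=1<<servo_pin[ch];
    OCR1A=pulse[ch]-1; // CTC period is OCR1A+1 ticks
    used+=pulse[ch];
  }
  else // ...or pad the rest of the frame out to 20ms
  {
    OCR1A=FRAME_TICKS-used-1;
    used=0;
  }
}

int main(void)
{
  DDRB=1<<PB0|1<<PB1|1<<PB2|1<<PB4;
  TCCR1B=1<<WGM12|1<<CS10; // CTC, OCR1A is TOP, no prescaling
  OCR1A=FRAME_TICKS-1;     // run one empty frame before the first pulse
  TIMSK=1<<OCIE1A;
  sei();
  while(1); // pulse[] updates would go here
}

Each compare match ends one pulse and immediately starts the next; a final "gap" slot pads the frame out to 20ms. Only one pulse is ever in flight, and since both edges come from the same interrupt path, the latency largely cancels out of the pulse width.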

Of course, if it's working...

Martin Jay McKee

As with most things in engineering, the answer is an unabashed, "It depends."

#5

One can argue that what mckeemj describes is still software PWM (in my mind it is only HW PWM if the pins are switched by the hardware). Anyway, here is an example of that method of servo signal generation (I write a bit about jitter there as well):
https://www.avrfreaks.net/index.p...
It is for sure precise enough for flying models. I have flown some (small) helicopters using this when I was playing with 2.4GHz radio modules.
/Lars

#6

It is a hybrid method, certainly. It may even qualify, by some definitions, as wholly software based. Either way, your code proves an important point... it is fully possible. It is also much neater with the 16-bit timer than with an 8-bit one -- both are workable solutions.

Interesting you should point out that thread, it was in the back of my mind as I was writing.

Martin Jay McKee

As with most things in engineering, the answer is an unabashed, "It depends."

#7

Thanks for the suggestions!
I'll definitely study the code in the linked topic, but for now I prefer driving them all at once. First, I'm not very familiar with this chip yet, and this approach looks simpler. Second, if I generated the pulses sequentially with a timer+interrupt, other interrupts (like the USART the pulse data will later be coming from, or ADC-conversion-complete) could disrupt my pulse widths. Of course a hierarchical interrupt system could solve this issue, but that's beyond my current knowledge of Atmel.
With my approach, I can set all the servo bits in a timer interrupt at 0, then disable interrupts, check the servo variables and clear the desired bits in the 800-2200 microsecond range from the frame start, then re-enable interrupts and do whatever I want in the time left in the frame (see the sketch below).
Actually, I'd written a simple servo code before I found Patrick's servo programs, so I was already aware of the timer-interrupt + toggle-servo-pin method.
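
For reference, a minimal sketch of that plan, replacing the while(1) loop in my posted code (assuming the fixed version, where TIMER1_OVF_vect raises the servo pins at the start of each frame):

while(1)
{
  if(TCNT1>=800&&TCNT1<=2200) // the only window where pulses can end
  {
    cli(); // other interrupts can't stretch a pulse now
    while(TCNT1<=2200)
    {
      if((TCNT1>=sw_servo)&&bit_is_set(PORTB,PB0)) PORTB&=~(1<<PB0);
      // one test like this per software channel
    }
    sei(); // ~18ms of the frame left for UART/ADC work
  }
}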

I may even turn off unused servos to save battery (trading away a bit of holding power), which seems easier in a software implementation.

#8

As I said, so long as it is solving your problem, there is no need for optimization (or modification). Solved is solved. On the other hand, if you need more servos, it is always going to be easier with a software method of some sort. Keep in mind, however, what you are really looking at as far as jitter is concerned. A well designed interrupt routine is certainly under 100 cycles. Since you are doing R/C control you can run at 5V, and with many modern AVRs that means you could be running at up to 20MHz. In that case, the worst-case jitter caused by an interrupt would be ~5us, or 0.5% of the pulse-width range. Even if such jitter is present, it will be hardly noticeable. Moreover, it is unlikely to show up in the pulses in any consistent fashion, so the actual, effective jitter is even less.

In the end, yes, it is more complicated. It also requires more processor overhead. But, it is a simple and effective method that, once implemented, just works. Or, at least, it works in any system designed with reasonable interrupt loads.

Martin Jay McKee

As with most things in engineering, the answer is an unabashed, "It depends."