Why does running a timer produce erratic input?

#1

I have a DC motor turning a disc at around 5 rpm. I use a Hall-effect gear-tooth sensor to monitor its position. Periodically, based on the sensor input, my ATmega2561 stops the motor (both by killing its power through one transistor and shorting its leads through another). The CPU is supposed to pause for a second, start the motor up again, and wait for another sensor hit.

If I comment out the pause, the motor runs continuously (well, it stops with each sensor hit but immediately starts again). The input from the sensor (active low) appears as it should on a scope. If I enable the pause, the sensor input becomes erratic and the motor turns spastically as a result. I can't figure out why the pause would have any impact on the sensor. Even though the input pin I'm using (PINB6) is an output-compare and PWM pin for Timer 1 (which I use to generate the pause), I'm not using either of those modes. Also, I get the same erratic behavior if I use a couple of nested loops to generate the delay instead of the timer. Any thoughts on what might be causing this behavior? Thanks.

//ICC-AVR application builder : 6/17/2009 9:52:10 PM
// Target : m2561
// Crystal: 16.000 MHz

#include <iom2561v.h>	// ATmega2561 register definitions (usual ICC-AVR header; angle brackets were eaten in posting)
#include <macros.h>	// ICC-AVR macros such as CLI()/SEI()

void port_init(void)
{
 PORTB = 0x00;	// outputs low, pull-ups off
 DDRB  = 0xA0;	// B7 (motor) and B5 (brake) are outputs; B6 (sensor) is an input
}

//TIMER1 initialize - prescale:1024
// WGM: 0) Normal, TOP=0xFFFF
// desired value: 1Hz
// actual value:  1.000Hz (0.0%)
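// preload: 65536 - (16000000/1024) = 65536 - 15625 = 49911 = 0xC2F7,
// so TOV1 sets exactly one second after the timer starts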
void timer1_init(void)
{
 TCCR1B = 0x00; //stop
 TIFR1 |= 1;	//clear TOV1 (writing 1 resets the flag)
 TCNT1H = 0xC2; //load preload value, high byte first
 TCNT1L = 0xF7;
 TCCR1A = 0x00;
 TCCR1C = 0x00;
 TCCR1B = 0x05; //start timer, prescaler clk/1024
}

//call this routine to initialize all peripherals
void init_devices(void)
{
 //stop errant interrupts until set up
 CLI(); //disable all interrupts
 XMCRA = 0x00; //external memory
 XMCRB = 0x00; //external memory
 port_init();

 MCUCR  = 0x00;
 EICRA  = 0x00; //pin change int edge 0:3
 EICRB  = 0x00; //pin change int edge 4:7
 PCICR  = 0x00; //pin change int enable
 PCMSK0 = 0x00; //pin change mask
 EIMSK  = 0x00;
 TIMSK1 = 0x01; //timer1 interrupt sources: TOIE1 (overflow) set
 PRR0   = 0x00;
 PRR1   = 0x00;
 
 //all peripherals are now initialized
}

void main(void)
{
    init_devices();
    PORTB = 0x80;                       // Turn on motor (active high on B7)
    while ((PINB & 0x40) == 0);         // Wait for sensor to clear (active low on B6)
    while (1)
    {
        while ((PINB & 0x40) == 0x40);  // Wait for sensor hit (active low on B6)
        PORTB &= ~0x80;                 // Turn off motor
        PORTB |= 0x20;                  // Turn on brake (active high on B5)
        timer1_init();                  // Wait a second
        while ((TIFR1 & 1) == 0);       // Poll TOV1 until the one-second preload expires
        PORTB &= ~0x20;                 // Turn off brake
        PORTB |= 0x80;                  // Turn on motor
        while ((PINB & 0x40) == 0);     // Wait for sensor to clear
    }
}

#2

I believe you should have:
#include <avr/io.h>
in your code. That file brings in a few others that you may be missing. I don't know if that will fix anything yet, but it's a start.

EDIT: Oooops ignore that. I'm stuck in GCC mode. Sorry.

jon soons

#3

A glance seems to indicate that you have enabled an interrupt and enabled global interrupts, but have not provided an ISR to "catch" the event. In many cases (depending on how your compiler handles it) this will result in a reset every time it hits.
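For reference, a handler that would "catch" the Timer1 overflow looks like the sketch below. It's in avr-gcc syntax (ICC-AVR declares handlers with a #pragma interrupt_handler directive instead), and the flag name is just illustrative:

#include <avr/io.h>
#include <avr/interrupt.h>

volatile unsigned char g_overflowed;   // illustrative flag name

ISR(TIMER1_OVF_vect)                   // Timer1 overflow vector
{
    g_overflowed = 1;                  // note the event and get out; keep ISRs short
}

With no handler installed, an unexpected vector typically jumps back to the reset address, which is exactly the reset-on-every-hit behavior described above.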

Lee

You can put lipstick on a pig, but it is still a pig.

I've never met a pig I didn't like, as long as you have some salt and pepper.

#4

Where do you see interrupts enabled? The timer interrupt mask bit (TOIE1) is set, but CLI() is issued explicitly in init_devices() and there's no SEI(). I thought SEI() was the only way for global interrupts to be enabled. I just confirmed through the JTAGICE that the I bit in SREG is not set while the program is running.

#5

None of these points is likely to be "the problem", but I'll make them anyway before giving up:

    a) It doesn't look like you ever need the timer to run while your program does other useful work. If all you want is a fixed delay, you can generate it far more simply with a software busy-wait loop. All the toolchains have one or more of these in their libraries, I believe (see the sketch after this list).
    b) You have only one clock cycle (at a blistering 16 MHz instruction rate) of commutation between turning OFF your motor's drive and turning ON the transistor that shorts the motor out. Power semiconductors don't generally switch that fast, and I'd worry that the turning-OFF device is still in saturation while the other device is turning on.
    c) Why are you fussing with setting up pin-change interrupts if all you're doing is polling the pins directly?
    d) Don't bother re-doing things that power-up reset initialization handles anyway (clearing the global interrupt flag, setting bunches of I/O control registers to zero, etc.).
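
For (a), a minimal sketch of the library route using avr-libc's _delay_ms() (the OP is on ICC-AVR, which ships its own equivalent; the wrapper name below is illustrative):

#define F_CPU 16000000UL        // must match the 16 MHz crystal; define before the include
#include <util/delay.h>

void pause_one_second(void)     // illustrative wrapper name
{
    unsigned char i;
    for (i = 0; i < 100; i++)   // 100 x 10 ms stays well inside _delay_ms()'s
        _delay_ms(10);          // maximum-delay limit at this clock speed
}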

#6

Thanks for your thoughts, L.

a) I find the timer easy enough to use directly; my IDE has a configuration wizard.

b) This is an excellent point, especially since I use a MOSFET to drive the motor and a 2N2222 to short the leads. Still, after inserting a 10 ms delay between switching the MOSFET off and switching the 2N2222 on (see the sketch below), I get the same behavior.
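
Concretely, the brake sequence in the main loop now looks roughly like this, with a hypothetical delay_ms() standing in for whatever the toolchain provides:

    PORTB &= ~0x80;   // Turn off motor drive (MOSFET)
    delay_ms(10);     // dead time: let the MOSFET actually turn off...
    PORTB |= 0x20;    // ...before the 2N2222 shorts the leads (brake on)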

c) That stuff was generated by the app configuration wizard. I already pulled a lot of what was generated out and left the pin-change stuff in, as I might want to use it later and it's easier to just leave it there.

d) Old habit of just being redundant. By doing that stuff explicitly I don't have to remember what happens automatically and what doesn't.