Processor: mega169 (Butterfly)
Clock: 8MHz (external oscillator)
Timer1 prescaler: clkio/1024
I have an application where I want to measure the time it takes to stop a motor shaft once a brake is applied. The Butterfly has two inputs: one tells it when the brake is applied, and the other tells it whether the shaft is spinning. It seems pretty simple: start Timer1 when the brake is applied and read Timer1 when the shaft stops spinning. The problem, for me, is how to convert the Timer1 value into something meaningful that can be displayed on the Butterfly display.
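For reference, here is a minimal sketch of the measurement loop I have in mind (avr-gcc style; the two pin assignments are placeholders, not my actual wiring):

```c
#include <avr/io.h>
#include <stdint.h>

/* Placeholder pin assignments -- adjust to the actual inputs. */
#define BRAKE_APPLIED()  (PINB & (1 << PB0))   /* brake input */
#define SHAFT_SPINNING() (PINB & (1 << PB1))   /* spin input  */

/* Busy-wait version: start Timer1 when the brake goes active,
 * stop it when the shaft-spinning input goes inactive, and
 * return the raw tick count. */
uint16_t measure_stop_ticks(void)
{
    while (!BRAKE_APPLIED())
        ;                                /* wait for the brake  */

    TCNT1  = 0;                          /* clear Timer1        */
    TCCR1B = (1 << CS12) | (1 << CS10);  /* start at clkio/1024 */

    while (SHAFT_SPINNING())
        ;                                /* wait for shaft stop */

    TCCR1B = 0;                          /* stop the timer      */
    return TCNT1;                        /* 1 count = 128 us    */
}
```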
Each count in Timer1 represents 128 microseconds (8 MHz / 1024 = 7812.5 Hz, which is a period of 128 microseconds). It seems logical to me that you would then multiply the Timer1 value by 128 to get the total number of microseconds... is this a correct assumption?
If I want the display to be in milliseconds, then I just divide the microseconds value by 1000... correct?
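In C, the two steps would look something like this (a sketch assuming avr-gcc; the function names are just mine, and 32-bit intermediates are enough since 65535 * 128 = 8388480 fits easily in a uint32_t):

```c
#include <stdint.h>

/* One Timer1 count = 128 us (8 MHz / 1024). */
uint32_t ticks_to_us(uint16_t ticks)
{
    return (uint32_t)ticks * 128UL;             /* microseconds */
}

uint16_t ticks_to_ms(uint16_t ticks)
{
    uint32_t us = (uint32_t)ticks * 128UL;
    return (uint16_t)((us + 500UL) / 1000UL);   /* rounded ms   */
}
```

(The +500 rounds to the nearest millisecond instead of truncating; drop it if plain integer division is good enough. The worst case, 65535 counts, comes to about 8389 ms, which still fits in a uint16_t.)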
I am sorry, I just seem to be having trouble convincing myself that it could be that simple. I have working multiply and divide routines (thanks to the Atmel app notes :-) ), so that part is not the problem. I am just having a brain fart, I guess.