I inherited some code from a project where the previous designer used a timer overflow to interrupt the code every 1 ms. He then created some software timers to service different functions (temperature sensor, ADC, button presses). The ADC tracks an inductive sensor that measures linear motion. The whole program is a state machine that uses the ADC value to determine which position state the device is in and turns on an output signal and a corresponding LED. There are four possible states, and we're mostly looking for a transition from state 1 to state 3 (skipping over state 2).

The problem I'm having is making sure that state 2 isn't activated while going from 1 to 3, and I'm kind of stuck. At first I tried a time-based scheme, but that obviously doesn't work because the device can move whenever it wants and can easily be early or late relative to any fixed timing. The current approach is to average the ADC values and compare the current average to the previous one; if the difference is greater than a certain threshold, we decide that the state has changed and the device may have stopped. This works much better, but some blips still come through. We tried disabling the timer interrupt while updating states to make sure no new data is injected mid-update, but that didn't help. Any other ideas on how to handle this? We currently take 10 samples, and the software timer for the ADC fires every 11 ms.
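In case it helps, here's roughly what the averaging logic looks like. This is a minimal sketch of the approach described above; the names, buffer handling, and threshold value are placeholders, not the actual project code:

```c
/* Minimal sketch of the current averaging approach. All names,
 * sizes, and DELTA_THRESH are placeholders, not the real code. */
#include <stdint.h>

#define NUM_SAMPLES   10   /* we currently take 10 samples */
#define DELTA_THRESH  25U  /* hypothetical "average moved enough" value */

static volatile uint16_t adc_samples[NUM_SAMPLES];
static volatile uint8_t  sample_idx;

/* Called from the 11 ms software timer: stash one raw ADC reading. */
void adc_timer_tick(uint16_t raw)
{
    adc_samples[sample_idx] = raw;
    sample_idx = (uint8_t)((sample_idx + 1U) % NUM_SAMPLES);
}

/* Called from the main loop once a full set of samples is in. */
void update_state(void)
{
    static uint16_t prev_avg;
    uint32_t sum = 0;

    /* Average the last NUM_SAMPLES readings. */
    for (uint8_t i = 0; i < NUM_SAMPLES; i++)
        sum += adc_samples[i];
    uint16_t avg = (uint16_t)(sum / NUM_SAMPLES);

    /* If the average moved more than the threshold, assume the
     * device has changed position and may have stopped. */
    int32_t diff = (int32_t)avg - (int32_t)prev_avg;
    if (diff < 0)
        diff = -diff;

    if ((uint32_t)diff > DELTA_THRESH) {
        /* ...map avg onto one of the four position states and
         * drive the output pin / LED for that state... */
    }
    prev_avg = avg;
}
```

Note that the timer ISR can still be writing into the sample buffer while `update_state()` averages it, which is why we tried masking the timer interrupt during the state update.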