OT - DSI Fluro Fade Routine


Moderators - please feel free to delete this post from the GCC forum.

Afternoon fellow AVR freaks.

Currently I am writing some firmware (WinAVR) to control DSI fluros. DSI is a simple digital protocol that allows fluros to be dimmed: DSI sends an 8-bit word that tells the fluro what level to settle at. (Packet tx time = 25ms.)

0xff (255) = 100% (all on)
0x00 (0) = 0% (all off)

The system is set up so that when an external interrupt occurs (i.e. from a path-crossing detector) the AVR acts on it and dims the fluros up to a set level (HIGH) for a set time (DWELL), before returning down to the original level (LOW). (Basically it can be used to light a footpath while people are on it, then go back to a low level when the path isn't being used.)

As is, the firmware can correctly control the fluros using the DSI protocol, however I want more :)

I want to be able to specify the fade time that the fluro spends getting from one level to the other and this is where I am coming unstuck.

The idea is to span stepping the DSI level (8bit) over the fade time.

For example, I may want to fade from 10% (0x19) to 95% (0xf2) over 10 seconds. To do this I need an efficient way of stepping through the DSI packets over time (incrementing their values).

One thought I had was to divide the fade time into n sections, where n = END - START, and use a timer routine to count through the n steps, incrementing the DSI value each time - but this method runs into problems when
(n x PACKET_TIME) > fade time
The solution is obviously to skip some of the intermediate tx packets between START and END - I just can't figure out a nice way to do it.

(START = normal DSI fluro level - 8bit val)
(END = Trigger DSI fluro level - 8bit val)

Is there a neat way to do this?

My main code loop is as follows:
(trigger is set to 1 when an external interrupt fires - subsequent triggering resets the HIGH-level dwell time, i.e. resets count)

while(trigger==0) transmit_frame(LOW); //default
count=0;   //reset x10ms counter
trigger=0; //reset trigger flag

//Insert code here - fade from LOW up to HIGH

//DWELL is specified wrt 0.1s, eg DWELL of 100 = 10s
//send HIGH until dwell time is reached
while(count < DWELL) transmit_frame(HIGH);

//Insert code here - fade from HIGH back down to LOW

Currently my AVR is set up with 2 timers:
One interrupting every 833us (for sync of DSI data).
One interrupting every 10ms (for simple human timing uses).

Anyway, the "Insert code here" spots are where I'm getting lost, so if anyone can offer any advice it would be greatly appreciated.

I also want the calculations to be done at run time, not compile time, as I plan on using the ATmega8's ADC channels as value setters (using pots) for HIGH, LOW, DWELL and fade time.

Looking forward to some ideas.

Thanks ppl.



Instead of using END - START to determine the number of packets, you could use the time needed. You always send packets at predefined times, so you just need to find out how much you have to increase/decrease the dimming value between each packet, and that is easy:

delta = (END - START) / number_of_packets, where number_of_packets = fade_time / packet_time

And between each packet, just increase the output value:

output = output + delta

Just keep the delta and output variables as floating-point numbers, and that should be it. Keeping it all as integers (fixed point) and still having it work is possible; it just takes a little more work.
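As a minimal sketch of that integer version, you can keep the output in 8.8 fixed point (upper 8 bits = DSI level, lower 8 bits = fraction), so the per-packet delta can be smaller than one DSI step. Names like fade_t/fade_start/fade_next are just illustrative; the 25ms packet time is from your post.

```c
#include <stdint.h>

#define PACKET_MS 25u  /* DSI packet transmit time, from the original post */

/* Fade state in 8.8 fixed point. */
typedef struct {
    uint16_t level_fp;   /* current level: upper byte = DSI level, lower byte = fraction */
    int16_t  delta_fp;   /* signed per-packet step, 8.8 fixed point */
    uint16_t steps_left; /* packets remaining in the fade */
} fade_t;

/* fade_ms = desired fade time in milliseconds (e.g. scaled from a pot/ADC). */
static void fade_start(fade_t *f, uint8_t start, uint8_t end, uint16_t fade_ms)
{
    uint16_t steps = fade_ms / PACKET_MS;
    if (steps == 0) steps = 1;                 /* shorter than one packet: jump to END */
    f->level_fp   = (uint16_t)start << 8;
    /* 32-bit intermediate: (end - start) * 256 can overflow a 16-bit int on AVR */
    f->delta_fp   = (int16_t)(((int32_t)end - start) * 256 / steps);
    f->steps_left = steps;
}

/* Call once per transmitted packet; returns the 8-bit level to send. */
static uint8_t fade_next(fade_t *f, uint8_t end)
{
    if (f->steps_left == 0)
        return end;                            /* fade finished: hold END */
    f->steps_left--;
    if (f->steps_left == 0)
        f->level_fp = (uint16_t)end << 8;      /* last packet lands exactly on END */
    else
        f->level_fp = (uint16_t)(f->level_fp + f->delta_fp);
    return (uint8_t)(f->level_fp >> 8);
}
```

For the 10% to 95% over 10s example, fade_start(&f, 0x19, 0xf2, 10000) gives 400 packets with a per-packet delta of about 0.54 DSI steps; the packet-transmit routine then sends fade_next(&f, HIGH) instead of a fixed level. Forcing the final packet to END avoids any drift from the truncated division.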
I have used the same principle in my LED fader (It's in the academy, project 177).