Ok I can't figure this out. I basically want to take two bytes and reduce them to one byte, so that 0xffff becomes 0xff and 0 stays 0.
I tried this:
unsigned char time = ( ( v1 | (v2<<8) ) / 0xff );  /* v1 = low byte, v2 = high byte */
I display my result with 8 lights, and I seem to get 0-256. So I figured I'd just take 1 off. Well, that's -1 to 255. What did I miss? Is there a better way to do this?
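Here's a quick test I put together to see what my expression actually computes before it gets stuffed into the char (v1 and v2 both set to 0xff as the worst case):

#include <stdio.h>

int main(void)
{
    unsigned char v1 = 0xff, v2 = 0xff;       /* worst case: both bytes maxed */
    unsigned int combined = v1 | (v2 << 8);   /* 0xffff */
    unsigned int scaled = combined / 0xff;    /* the division my expression does */
    printf("combined = 0x%04x, scaled = 0x%x (%u)\n", combined, scaled, scaled);
    /* prints: combined = 0xffff, scaled = 0x101 (257) -- wider than one byte */
    return 0;
}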
Also, 0xffff / 0xff is 0x101 (257)?
and 0xff00 / 0xff is 0x100 (256), which is more than 0xff, so I must have something wrong.
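Would dividing by 0x100 instead be what I want? That's just a right shift by 8, which (if I'm following this right) means the low byte v1 gets thrown away entirely and I just keep v2:

#include <stdio.h>

int main(void)
{
    unsigned char v1 = 0x34, v2 = 0x12;              /* made-up test bytes */
    unsigned char scaled = ( v1 | (v2 << 8) ) >> 8;  /* divide by 0x100 via shift */
    printf("scaled = 0x%02x\n", scaled);             /* prints 0x12, i.e. just v2 */
    return 0;
}

With both bytes at 0xff that gives 0xff, and with both at 0 it gives 0, which is the range I was after, but I'm not sure if dropping v1 like that is the "right" way.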
Sorry for such a dumb question, but I know I need to learn a lesson here. More than likely it's something simple I should already know :roll: Is it because hex and binary count 0 as a value, whereas in decimal 0 doesn't count for anything?