Suppose r17 and r18 each hold a 3-bit value between 0 and 7.

I want to subtract r18 from r17, multiply the difference by 16, then divide the result by 7.

I'm doing something wrong... but I'm not quite sure what.

r17 = 0, r18 = 7

r17 - r18 = 0 - 7 = 11111001 (-7 in two's complement)

test bit 7; if it's clear (the value is non-negative), just left-shift the value 4 times... otherwise, negate first:

subtract 1: 11111000, then invert: 00000111 ("7")

left-shift 4 times (x16): 07 -> 0E -> 1C -> 38 -> 70

negate again — invert: ~70 = 8F, then add one: 8F + 1 = 90 (binary: 10010000, i.e. -112)

Now, I KNOW -112 / 7 = -16, and -16 in binary is 11110000. But when I try repeatedly subtracting 7 from 0x90 until I would overshoot, the counter comes out as 0x14 (00010100).

SO... what SHOULD I be doing to 10010000 with 00000111 so I end up with 11110000 somewhere?

With positive values, I'd repeatedly subtract 7 from the first value, incrementing a counter each time carry stays clear and zero isn't set, and that works perfectly. But when one of the values is negative (like the example above), something goes astray.