The "need one resistor to ground, on the output of the DAC" part. Do you mean the 2R grounded on the last bit of the DAC? I don't think the output end of the DAC should be grounded.
Why not? Think of it as two black boxes: one is the DAC with a normal 5V output range and some 10k source impedance, the other is a divider to bring that down to a 1V output range. You could use two resistors for the divider, but since the DAC already has a known output impedance, you can just put a single resistor to ground and voila, you have a DAC and divider combined.
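To make the arithmetic concrete, here is a minimal sketch using the numbers from the post (5V full scale, ~10k source impedance, 1V target); the exact values on a real board are an assumption:

```python
# Sketch (values assumed from the post): a DAC with a 5 V full-scale
# output and ~10 kohm source impedance, divided to a 1 V full scale
# by a single resistor from the output node to ground.

R_SOURCE = 10_000      # DAC output (source) impedance, ohms
V_FULL_SCALE = 5.0     # DAC full-scale output, volts
V_TARGET = 1.0         # desired full-scale after division, volts

# Voltage divider: V_target = V_fs * R_gnd / (R_gnd + R_source)
# Solved for the single resistor to ground:
R_gnd = R_SOURCE * V_TARGET / (V_FULL_SCALE - V_TARGET)
print(f"Resistor to ground: {R_gnd:.0f} ohms")   # 2500 ohms

# Check the resulting full-scale output
v_out = V_FULL_SCALE * R_gnd / (R_gnd + R_SOURCE)
print(f"Full-scale output: {v_out:.2f} V")       # 1.00 V
```

The same single resistor also lowers the combined source impedance (R_gnd in parallel with R_SOURCE), which is why the trick works at all.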
I'll put in a buffer and see how that goes. With a 2-bit DAC, I don't need a buffer at all and it works fine. Any reason why? (The link in my original post explains why a buffer is needed for the 8-bit DAC. I'm just wondering whether it is also a good idea to put a buffer on the 2-bit DAC.)
I don't know a reason why you would need a buffer with an 8-bit DAC but not with a 2-bit DAC. There is no fundamental difference, except when you have a large source impedance and you really need a small one. With a 2-bit DAC, you can select your resistors to provide the necessary voltages and source impedance for 75 ohm video and that is fine. With an 8-bit DAC you most likely can't make R=75 and 2R=150, as that would draw a large current, and the output impedance of the pins would then affect the output accuracy a lot.
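As a sketch of the 2-bit case: two IO pins and two binary-weighted resistors can be sized so the Thevenin impedance of the network is exactly 75 ohms. The 3.3 V pin level is an assumption; the resulting levels are illustrative, not a calibrated video spec:

```python
# Sketch (assumed 3.3 V IO pins): a 2-bit binary-weighted DAC made from
# two pins and two resistors, sized for a 75 ohm source impedance.

V_PIN = 3.3            # IO pin high level, volts (assumption)
R_LOAD = 75.0          # video load impedance, ohms

# MSB drives through Ra, LSB through Rb = 2*Ra.
# The Thevenin impedance at the output node is Ra || Rb = (2/3)*Ra,
# so set (2/3)*Ra = 75 ohms:
Ra = 1.5 * R_LOAD      # 112.5 ohms (MSB resistor)
Rb = 2 * Ra            # 225 ohms (LSB resistor)

def output_voltage(code):
    """Voltage across the matched 75 ohm load for DAC code 0..3."""
    b1, b0 = (code >> 1) & 1, code & 1
    # Thevenin source voltage of the two-resistor network
    v_th = V_PIN * (b1 / Ra + b0 / Rb) / (1 / Ra + 1 / Rb)
    # Matched load halves the Thevenin voltage
    return v_th * R_LOAD / (R_LOAD + Ra * Rb / (Ra + Rb))

for code in range(4):
    print(f"code {code}: {output_voltage(code):.3f} V")
```

Note that the currents here are milliamps per pin, which a 2-bit DAC can afford; scaling the same low resistor values to 8 bits is what becomes impractical.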
And since an 8-bit DAC can produce 256 different values, you should use resistors better than 1%, as 1% parts are only accurate to one part in 100. The IO pins have some output resistance too, so you should keep the IO pin current as low as possible, meaning the DAC resistor R values should be large enough that the IO pin output impedance is also less than 1% of R.
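The sizing rule above can be sketched in a few lines; the ~40 ohm pin impedance is a typical-microcontroller assumption, not a figure from the thread:

```python
# Sketch: sizing the R-2R ladder's R so the IO pin's own output
# impedance contributes less than 1% error (pin impedance assumed).

R_PIN = 40.0           # typical IO pin output impedance, ohms (assumption)
MAX_ERROR = 0.01       # allow the pin impedance to be at most 1% of R

R_min = R_PIN / MAX_ERROR
print(f"Minimum ladder R: {R_min:.0f} ohms")   # 4000 ohms

# Why 1% resistors are still marginal at 8 bits: one LSB is smaller
# than the resistor tolerance itself.
lsb_fraction = 1 / 256
print(f"1 LSB = {lsb_fraction:.4%} of full scale")  # ~0.39%, < 1%
```

With R around 4k and 2R around 8k the pin currents stay well under a milliamp, which is also what keeps the pins near their rail voltages.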