Given that the 2313 doesn't have an ADC, what would be the best way to measure a voltage fairly accurately? I know about using a resistor/cap combo and timing the charge/discharge cycle, but how accurate can you get with that? How would you actually calibrate it, and keep it calibrated?
Another method is using one port as an R2R ladder DAC feeding the on-chip comparator, compared against the voltage being measured. This would require high-accuracy resistors for the ladder - Bourns makes R2R resistor networks (4116R-R2R-503), as do other manufacturers.
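With the ladder driving one comparator input, you'd do a successive approximation: try each DAC bit from the MSB down and keep it only if the comparator says the ladder output hasn't overshot the input. A sketch of that logic with the comparator stubbed out so it can run off-chip (the function names are mine):

```c
#include <stdint.h>

/* Successive-approximation search over an 8-bit R2R DAC.
 * comparator_above(code) returns nonzero when the DAC output for `code`
 * exceeds the unknown input voltage -- on the real chip this would be a
 * read of the analog comparator output bit after driving the ladder. */
uint8_t sar_measure(int (*comparator_above)(uint8_t code))
{
    uint8_t code = 0;
    for (uint8_t bit = 0x80; bit != 0; bit >>= 1) {
        code |= bit;               /* tentatively set this bit */
        if (comparator_above(code))
            code &= ~bit;          /* DAC overshot the input: clear it */
    }
    return code;   /* highest code whose DAC output is <= Vin */
}

/* Example stub: pretends the input sits exactly at DAC code 0xB7. */
static int stub_above(uint8_t code) { return code > 0xB7; }
```

On the real part, writing `code` becomes writing the port that drives the ladder, and you'd want a short settling delay before each comparator read. Eight comparisons resolve all 8 bits, so a reading is fast.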
I have a ton of 2313's on hand, and would like to use them up :)
Actually, I need to build a small calibration tool for another project. This would produce a square wave at a given frequency and display that frequency on 4 LED displays. That part is pretty easy using a good crystal.
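For the square-wave side, a timer in CTC toggle mode does the work, and the only arithmetic is picking the reload value from the crystal frequency, prescaler, and target frequency. A host-side sketch of that calculation (the 10 MHz value is just an example, use whatever crystal you fit):

```c
#include <stdint.h>

/* CTC toggle mode: f_out = F_CPU / (2 * prescale * (reload + 1)),
 * so reload = F_CPU / (2 * prescale * f_out) - 1, rounded to nearest.
 * F_CPU assumes an example 10 MHz crystal. */
#define F_CPU 10000000UL

uint16_t ctc_reload(uint32_t prescale, uint32_t f_out_hz)
{
    uint32_t divisor = 2UL * prescale * f_out_hz;
    return (uint16_t)((F_CPU + divisor / 2) / divisor - 1);
}
```

With a 16-bit timer, frequencies that divide evenly into the crystal come out exact; anything else lands on the nearest reload value, and the display can show the actual frequency rather than the requested one.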
I also need to provide a known voltage, accurate to at least 0.1V, preferably to 0.05V, over a 0-12V range (8 bits). Given that the voltage would probably be a little bit off, I want to be able to display exactly what is being provided.
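A quick sanity check on whether 8 bits actually covers that spec: 12 V across 256 steps is about 46.9 mV per LSB, which just clears the 0.05 V target but leaves essentially no margin for ladder resistor error, so displaying the measured value rather than the nominal one makes sense.

```c
/* One LSB of an N-bit code spanning a given full-scale voltage. */
double lsb_volts(double full_scale, int bits)
{
    return full_scale / (double)(1 << bits);   /* 12 V / 256 = 46.875 mV */
}
```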
This way the calibrator can be sent out to be plugged into the other project. A switch selects between displaying freq or voltage being applied, and another selects between two voltages.