So I'll try to spit out all the details.
I have an I2C signal-conditioning A/D chip producing ~2000 16-bit samples a second. Meaning I get a sample every 1/2000 seconds, or 0.0005 s (X), and each sample takes time to move out.
I have a USB-to-COM-port adapter reading the UART output of an ATmega128 running at 115200 baud (7.3728 MHz system clock), meaning one bit every 1/115200 ≈ 0.00000868 seconds. Now while I'm just TXing the raw 16-bit samples rather than RS-232-style ASCII, I still have to add a start bit and two stop bits to every byte. That's 11 bits per byte and 22 bits per sample, or about 0.000191 s (Y) to transfer one sample through the COM port. This means I can buffer up to 2-ish samples in memory (X/Y ≈ 2.6) before a dump from the MCU to the host PC. This should achieve the desired ~2000 samples/sec output.
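To double-check the arithmetic above, here's the same budget worked out in a few lines (just the numbers from the post, nothing else assumed):

```python
# Timing budget: 2000 samples/s ADC vs. 115200 baud UART,
# 11 wire bits per byte (1 start + 8 data + 2 stop), 2 bytes/sample.

sample_period = 1 / 2000              # X: seconds between ADC samples
bit_time = 1 / 115200                 # seconds per UART bit
bits_per_sample = 2 * 11              # 2 bytes * 11 wire bits each
tx_time = bits_per_sample * bit_time  # Y: seconds to ship one sample

print(f"X = {sample_period * 1e6:.1f} us")   # 500.0 us
print(f"Y = {tx_time * 1e6:.1f} us")         # 191.0 us
print(f"X/Y = {sample_period / tx_time:.2f}")          # 2.62
print(f"link limit = {1 / tx_time:.0f} samples/s")     # 5236
```

So the link tops out around 5236 samples/s, about 2.6x the sample rate, which agrees with the "buffer 2-ish samples" figure.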
Or is my logic failing me?
I was wondering if someone here has experience with quasi-realtime A/D to a PC.
What should I look out for?
Speed is more important to me than accuracy; if I lose a sample or three every 100, I'm OK with that. I don't have access to a PC with a real COM port, so I have to use a USB dongle. The research I've done points to a ~4 ms overhead for handling the USB protocol, though I haven't found anything that says what happens if I open the COM port before initiating the data transfer.
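If that ~4 ms figure applies once per host-side read rather than per byte (an assumption; with FTDI-style adapters it's typically a latency-timer/round-trip cost, not a throughput cap), a quick back-of-envelope run shows why batching matters:

```python
# Rough check of the ~4 ms USB overhead figure, assuming it is paid
# once per host-side read transaction.

usb_overhead = 0.004      # s per USB transaction (figure from the post)
sample_period = 1 / 2000  # s between ADC samples

# Samples that pile up while one transaction's overhead elapses:
samples_per_overhead = usb_overhead / sample_period
print(samples_per_overhead)   # 8.0

# If every sample got its own transaction, throughput would cap near:
print(1 / usb_overhead)       # 250.0 reads/s
```

In other words, one-sample-per-transfer would stall at ~250 samples/s under that assumption, while letting the adapter batch 8+ samples per read keeps the 2000 samples/s budget. Throughput shouldn't suffer as long as the MCU streams continuously; the overhead mostly shows up as latency.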
Would it just be better to TX every sample between readings?
In this while loop I have to check the receive buffer for the stop character.
There is some post-processing to correct the data for the range, gain, and sensitivity of the transducer, but I think I can chalk that up to system lag. I just don't want to be sampling faster than I can transfer, and I don't want to lose every other sample to the TX.
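Since you can tolerate dropping the odd sample, one thing worth planning for is byte-stream sync: if the host misses one byte of raw 16-bit data, every later sample is shifted by a byte. A cheap fix is to prefix each sample with a marker byte. That raises the wire cost to 33 bits/sample (~286 µs, still under your 500 µs budget). A minimal host-side sketch, pure Python with names of my own choosing (the `0xAA` marker and big-endian byte order are assumptions, not anything from your setup):

```python
SYNC = 0xAA  # hypothetical frame marker prefixed to each sample

def parse_samples(stream: bytes) -> list[int]:
    """Extract 16-bit big-endian samples from [SYNC, hi, lo] frames.

    On a bad or missing byte it hunts forward one byte at a time for
    the next SYNC, so a glitch costs a sample or two instead of
    shifting every later sample by one byte."""
    samples = []
    i = 0
    while i + 3 <= len(stream):
        if stream[i] != SYNC:
            i += 1  # out of sync: scan for the next marker
            continue
        samples.append((stream[i + 1] << 8) | stream[i + 2])
        i += 3
    return samples

# Two good frames with a stray byte dropped in between:
data = bytes([0xAA, 0x12, 0x34, 0x99, 0xAA, 0xBE, 0xEF])
print([hex(s) for s in parse_samples(data)])  # ['0x1234', '0xbeef']
```

One caveat: if 0xAA can appear in the sample data itself, a false resync is possible after a glitch; given your "lose a sample or three every 100" tolerance that's probably fine, otherwise you'd escape the marker or add a checksum.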
What do you think?