I have a Tektronix TDS 2012B (100 MHz, 1.0 GS/s) and it's not clear to me how frequency is calculated in the various readouts; the manual isn't much help.
I am measuring what should be an 8 MHz clock signal. On the scope, the trigger frequency displays a rock-solid 8.00012 MHz (measured at 50% on either rising or falling edges). That's great, and it seems to indicate that I have the load caps correct, etc.
But when I hit the "MEASURE" button on my scope and watch the frequency of the channel, it fluctuates from 7.991 MHz to 8.009 MHz and everywhere in between.
My guess is that the trigger display is an average. So I changed the channel acquire mode from "sample" to "average 128". After that, the frequency is more stable, but it still moves occasionally between 7.996 and 8.000 MHz.
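For intuition on why averaging tightens the readout, here is a minimal simulation. It treats each single-shot frequency reading as an independent value with Gaussian jitter (an assumption; the scope actually averages the acquired waveforms, not the frequency numbers, but the statistical effect is similar), and compares the spread of raw readings against readings averaged over 128 acquisitions. The true frequency and jitter values below are made up for illustration:

```python
import random
import statistics

random.seed(0)

F_TRUE = 8.000e6   # assumed true clock frequency, Hz (illustrative)
SIGMA = 5e3        # assumed per-reading jitter std dev, Hz (illustrative)

# Single-shot readings: the displayed spread reflects the jitter directly.
single = [random.gauss(F_TRUE, SIGMA) for _ in range(1000)]

# "Average 128"-style readings: each displayed value is the mean of 128
# acquisitions, so its spread shrinks by roughly sqrt(128), about 11x.
averaged = [statistics.mean(random.gauss(F_TRUE, SIGMA) for _ in range(128))
            for _ in range(1000)]

print(f"single-shot spread: {statistics.stdev(single):.0f} Hz")
print(f"averaged spread:    {statistics.stdev(averaged):.0f} Hz")
```

If the residual wobble in "average 128" mode shrank by about the square root of the averaging factor, that would point to random measurement noise rather than real drift in the clock.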
1. Do you think my guess about the trigger being an average is correct?
2. It seems odd that the trigger average is above 8.0 MHz while the channel measurement average is below 8.0 MHz.
3. Is there any clever way to determine whether the changing values are due to the measurement equipment or whether they are really happening in the signal?