I am doing some XMega tests that require a 32.5MHz clock speed, and found this in the datasheet...
The 32MHz run-time calibrated internal oscillator is a high-frequency oscillator. It is calibrated during production to
provide a default frequency close to its nominal frequency.
This oscillator can also be adjusted and calibrated to any frequency between 30MHz and 55MHz. The
production signature row contains 48MHz calibration values intended to be used when the oscillator is used as a full-speed USB clock source.
Now this really has my interest, but I think I am not understanding the proper meaning of this text. What I get from this is...
1) I can change the 32MHz internal clock to anything between 30MHz and 55MHz by messing with the RCOSC32M bits in Studio6. If so, great!!
2) The 32MHz osc is actually set to run at 48MHz and not 32MHz from the factory??! This can't be correct; I have verified that mine is indeed running at 32MHz.
I'm sure I just misunderstood point #2, but the part about being able to alter the frequency is of great interest to me right now.
So ignoring this 48MHz text, I know that the internal osc is running at 32MHz right now. So if I just bump it up by a known percentage of 32, I should be able to get it to 32.5MHz easily, or any other value up to 55MHz.
I cannot seem to find a detailed doc on using RCOSC32M/A and how it affects the calibration. Is this a 16-bit value of fractional increments of some kind?
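Going by the XMEGA AU manual and AVR1003, my understanding is that RCOSC32M/RCOSC32MA are just the factory starting points for the DFLL's CALB (range) and CALA (fine) bytes, and the actual target frequency is set by the 16-bit COMP1/COMP2 pair (frequency/1024). Here is the kind of thing I'd try on the bench; register and signature-row names are taken from avr-libc's XMega AU headers, and I haven't tested any of it, so treat it as a guess:

```c
#include <avr/io.h>
#include <avr/pgmspace.h>
#include <stddef.h>

/* Read one byte from the production signature (calibration) row,
 * using the NVM command sequence shown in app note AVR1003. */
static uint8_t read_calib_byte(uint8_t index)
{
    uint8_t result;
    NVM.CMD = NVM_CMD_READ_CALIB_ROW_gc;
    result = pgm_read_byte(index);
    NVM.CMD = NVM_CMD_NO_OPERATION_gc;
    return result;
}

/* Retune the 32 MHz RC oscillator to 32.5 MHz via the DFLL.
 * COMP = 32500000 / 1024 = 31738 = 0x7BFA, assuming the 32.768 kHz
 * internal RC reference. Untested sketch. */
static void run_rc32m_at_32m5(void)
{
    /* The DFLL needs the 32.768 kHz RC reference running. */
    OSC.CTRL |= OSC_RC32KEN_bm;
    while (!(OSC.STATUS & OSC_RC32KRDY_bm)) { ; }

    /* Start from the factory 32 MHz calibration in the signature row. */
    DFLLRC32M.CALA = read_calib_byte(offsetof(NVM_PROD_SIGNATURES_t, RCOSC32MA));
    DFLLRC32M.CALB = read_calib_byte(offsetof(NVM_PROD_SIGNATURES_t, RCOSC32M));

    DFLLRC32M.COMP1 = 0xFA;                /* low byte of 31738  */
    DFLLRC32M.COMP2 = 0x7B;                /* high byte of 31738 */

    OSC.DFLLCTRL = OSC_RC32MCREF_RC32K_gc; /* AU-series naming; A-series
                                              uses a single RC32MCREF bit */
    DFLLRC32M.CTRL = DFLL_ENABLE_bm;       /* let the DFLL pull to target */
}
```

If that's right, the "48MHz calibration values" in the signature row would just be an alternate CALA/CALB pair (plus COMP = 46875) for USB parts, which would explain why mine still runs at 32MHz out of the box.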
I am away from my lab right now, or I would just hack around until I figured out the values.