Imagine you need to design a power converter for an application. What rules are there to decide the maximum ripple voltage the converter should present?
I've always seen 50mV given as a usual value, but I can't actually find an explanation for it. And 50mV certainly doesn't represent the same safety margin for all output voltages.
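To illustrate the point about the margin not being the same, here is a quick sketch of how the same absolute 50mV ripple translates into very different relative margins on different rails (the rail voltages are just example values):

```python
# Same absolute ripple, very different relative margin per rail.
ripple_v = 0.050  # 50 mV peak-to-peak

for vout in (1.8, 3.3, 5.0, 12.0):
    pct = 100 * ripple_v / vout
    print(f"{vout:5.1f} V rail: 50 mV ripple = {pct:.2f} % of Vout")
```

So 50mV is roughly 2.8% of a 1.8V rail but only about 0.4% of a 12V rail, which is why a single fixed number seems arbitrary to me.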
If I wanted to power an AVR (think "no external devices" and "no ADC" or other analog stuff), I guess I would use the frequency-versus-Vcc specs or the minimum input-high voltage, whichever is stricter. And maybe take into consideration any EMI that could be present in the system's production environment.
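The check I have in mind would look something like this sketch: verify that the rail minus half the peak-to-peak ripple stays inside the AVR's Vcc-versus-frequency safe operating area. The limit function below is a hypothetical ATmega-style piecewise curve, not taken from any specific datasheet, so the real numbers would come from the part being used:

```python
# Sketch: does the supply, at the bottom of the ripple trough, still meet
# the minimum Vcc for the chosen clock frequency? The limits below are
# illustrative placeholders, not real datasheet values.
def vcc_min_for_freq(f_mhz):
    # Hypothetical piecewise limit: 2.7 V up to 10 MHz, then linear
    # up to 4.5 V at 20 MHz (ATmega-style safe operating area).
    if f_mhz <= 10:
        return 2.7
    return 2.7 + (f_mhz - 10) * (4.5 - 2.7) / (20 - 10)

def ripple_ok(vcc_nominal, ripple_pp, f_mhz):
    # Worst case: Vcc sits at nominal minus half the peak-to-peak ripple.
    return vcc_nominal - ripple_pp / 2 >= vcc_min_for_freq(f_mhz)

print(ripple_ok(5.0, 0.050, 16))  # 5 V rail, 50 mV ripple, 16 MHz
```

With numbers like these the margin is huge, which is exactly why I suspect the 50mV figure comes from somewhere other than the digital supply spec.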
Is there any "Vcc stability" spec in the AVR datasheet that has escaped me?
What do you think about "ripple voltage" and what criteria do you use to set a maximum value for it?