Hi,
I have read many resistor datasheets but can't find a curve or anything that relates to surge conditions, only steady-state DC or AC ratings.
So how can I know whether a 10R, 100mW resistor can be used at five times its rated current, and thus 25 times its rated power, for 160 microseconds, or what power rating would be enough? I would hate for it to start acting like a fuse and blow up.
I have two devices that connect together with a cable, and the master device also supplies +5V power to the slave. There is about 10uF of capacitance in the slave box, but the current consumption is very low, 10mA at most, so I have put a 10R resistor in series to limit the inrush current to 0.5A when the cable is connected on the fly.
Datasheets say that in steady-state DC or AC conditions a 100mW power rating is more than enough, since the maximum continuous current follows from the rating and the resistance: I = sqrt(P/R) = sqrt(0.1W / 10R) = 100mA, well above my 10mA load.
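To double-check the steady-state numbers, here is a quick Python sketch (component values as above):

```python
# Quick sanity check of the steady-state numbers above.
R = 10.0        # ohm, series resistor
P_rated = 0.1   # W, resistor power rating (100 mW)
I_load = 0.010  # A, worst-case continuous load current (10 mA)

I_max = (P_rated / R) ** 0.5   # max continuous current at rated power
P_load = I_load ** 2 * R       # actual steady-state dissipation

print(f"Rated continuous current: {I_max * 1e3:.0f} mA")    # -> 100 mA
print(f"Steady-state dissipation: {P_load * 1e3:.1f} mW")   # -> 1.0 mW
```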
The resistors do seem to be tested at short-term overload conditions of 2 or 2.5 times the nominal rating, but the thing is that in my case the surge current starts at 500mA, i.e. 25 times the rated power.
And since 5V at 0.5A means 2.5W of initial power dissipation, and the transient decays exponentially with time constant tau = RC = 10R x 10uF = 100us (the power falling as e^(-2t/tau)), the dissipation only drops back to the 100mW rating after about 160us.
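Here is the same surge calculation worked out in Python, using my component values; the last line uses the standard result that charging a capacitor from 0 to V through a resistor dissipates C*V^2/2 in the resistor, independent of R:

```python
import math

# Inrush transient when the cable is plugged in (values from above).
V = 5.0        # V, supply voltage
R = 10.0       # ohm, series resistor
C = 10e-6      # F, capacitance in the slave box
P_rated = 0.1  # W, resistor power rating

tau = R * C      # time constant: 100 us
I0 = V / R       # initial inrush current: 0.5 A
P0 = I0**2 * R   # initial dissipation: 2.5 W

# Power in the resistor decays as P(t) = P0 * exp(-2*t/tau);
# solve P(t) = P_rated for t.
t_rated = (tau / 2) * math.log(P0 / P_rated)   # ~161 us

# Total energy dumped into the resistor during one charge-up:
# equals the energy stored in the capacitor, C*V^2/2.
E_pulse = 0.5 * C * V**2                       # 125 uJ

print(f"tau = {tau * 1e6:.0f} us, P0 = {P0:.1f} W")
print(f"Dissipation falls back to 100 mW after {t_rated * 1e6:.0f} us")
print(f"Single-surge energy into the resistor: {E_pulse * 1e6:.0f} uJ")
```

So each hot-plug event dumps about 125uJ into the resistor, with a 2.5W peak decaying below the rating within roughly 160us.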