Hello everyone!
I'm going to be dropping a maximum of 8.65 V across a resistor and want to select the smallest resistance possible to minimize its effect on current draw under nominal conditions (much less than 8.65 V). Looking at the largest resistor size I can use for the application (an 0603 thin film resistor, 0.1% tolerance, 25 ppm), the minimum resistance that stays within its power rating at 8.65 V works out to just under 200 ohms (technically, 199.527 ohms).
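Just to show where that 199.527 ohm figure comes from, here's the arithmetic as a quick Python check. The 0.375 W rating is my assumption, since that's what 199.527 ohms back-calculates to at 8.65 V via R_min = V^2 / P_rated:

```python
# Quick check of the numbers above (0.375 W rating is assumed, not from a datasheet).
V_MAX = 8.65      # worst-case voltage across the resistor, volts
P_RATED = 0.375   # assumed 0603 power rating, watts

# Smallest resistance that keeps dissipation at or below the rating: R = V^2 / P
r_min = V_MAX**2 / P_RATED
print(f"minimum resistance: {r_min:.3f} ohm")   # ~199.527 ohm

# Dissipation for the two candidate values
for r in (200, 301):
    p = V_MAX**2 / r
    print(f"{r} ohm -> {p:.3f} W ({100 * p / P_RATED:.1f}% of rating)")
```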
Because of how specific the resistor specifications are, my options appear to be 200 or 301 ohms. My question is: is there a general rule for selecting the power rating of a resistor (e.g., pick a rating 1.5 times the expected dissipation, or you can run right up to the max power rating as long as you don't exceed it, etc.)? For this application it would be a one-time PULSE at 8.65 V, so I'm assuming I can safely choose the 200 ohm resistor without any problems, but I figured I'd ask if anyone knows of a general guideline for selecting resistor values based on power ratings. Let me know when you get the chance. Thanks!
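For reference, here's roughly how I'd apply a derating rule like the 1.5x one, just as a sketch. The 0.375 W rating and the 1.5x factor are my assumptions, and I realize a one-time pulse may be judged against the part's pulse/overload spec rather than steady-state derating:

```python
# Rough derating check (sketch only): keep dissipation at or below P_rated / 1.5.
V_MAX = 8.65
P_RATED = 0.375     # assumed rating, as above
DERATE = 1 / 1.5    # ~67% of the rated power

for r in (200, 301):
    p = V_MAX**2 / r
    verdict = "OK" if p <= DERATE * P_RATED else "exceeds"
    print(f"{r} ohm: {p:.3f} W -> {verdict} the {DERATE:.0%} derating limit")
```

With these assumptions, 301 ohms just squeaks under the 1.5x margin while 200 ohms runs essentially at the full rating, which is exactly the trade-off I'm asking about.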