Hi, I want to go from 400V to 7.2V (I only have 400VAC available to run my board off).
But I could only get 230V to 7.2V transformers. So I figured no problem: as long as I draw less than the specified power from the secondary it should be fine (i.e. higher voltage, lower current).
So I calculated I need about 200mA at 5V (1VA).
I got a 230V to 7.2V, 5.5VA transformer to leave margin for losses and power-supply drop.
So now I figured I effectively have a 400V to 12.5V transformer. And since the output is 12.5V instead of 7.2V, I can only draw 5.5VA / 12.5V ≈ 440mA.
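The scaling arithmetic above can be sketched as a quick check (the variable names are just for illustration; note the turns ratio is fixed, so the secondary voltage scales with whatever you feed the primary):

```python
# Quick check of the voltage/current scaling described above.
v_pri_rated = 230.0   # rated primary voltage (V)
v_sec_rated = 7.2     # rated secondary voltage (V)
v_pri_actual = 400.0  # actual supply voltage (V)
s_rated = 5.5         # transformer VA rating

# Turns ratio doesn't change, so the secondary voltage scales proportionally.
v_sec_actual = v_sec_rated * v_pri_actual / v_pri_rated

# Staying within the VA rating at the higher voltage means less current.
i_sec_max = s_rated / v_sec_actual

print(f"secondary voltage: {v_sec_actual:.1f} V")        # ~12.5 V
print(f"max secondary current: {i_sec_max * 1000:.0f} mA")  # ~440 mA
```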
Now the problem is that the transformer gets VERY hot even though I am only drawing 34mA.
The primary draws 88mA at 400VAC, while the secondary delivers 34mA at 14VAC. That's roughly 35VA in for 0.5W out!! All the energy is going into the transformer itself???
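A quick sanity check on those measurements (one caveat worth flagging: multiplying RMS volts by RMS amps gives apparent power in VA, and without a phase measurement the real power dissipated in the core is unknown, though the heating suggests much of it is real):

```python
# Sanity check on the measured figures.
# V * I of RMS readings is apparent power (VA); real power depends on the
# phase angle between voltage and current, which was not measured here.
v_in, i_in = 400.0, 0.088    # primary: 400 VAC, 88 mA
v_out, i_out = 14.0, 0.034   # secondary: 14 VAC, 34 mA

s_in = v_in * i_in           # apparent input power (VA)
p_out = v_out * i_out        # power delivered to the load (W)

print(f"input:  {s_in:.1f} VA")   # 35.2 VA
print(f"output: {p_out:.2f} W")   # 0.48 W
```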
If the primary is connected to 400VAC, and the secondary is not connected to anything, it still gets hot.
I have been reading a bit on transformer design and can't find anything about a transformer being designed to work at a certain voltage. It's always frequency, power out, and ratio of primary to secondary.
Are transformers voltage dependent?
The only fix I can think of is a big resistor in series with the primary, which seems pretty stupid.
The circuit board is already made and everything is mounted. I just couldn't test this power module part because I don't have 400VAC at home.
Thanks for any wisdom! :oops: