Voltage and Amps

5 posts / 0 new
Total votes: 0

Hi all,

Just a quick question, but if I have a 30V 50Ah power supply and a motor that takes 60V 25Ah, will stepping up the 30V to 60V reduce the amps the power supply gives off to 25 (implying there is no resistance)? What's the relation between amps and volts when it comes to stepping the volts up or down, if there is one?

 

Thank you.

Total votes: 1

Greetings and welcome to AVR Freaks!

 

Stepping up the voltage from 30V to 60V doubles the current drawn from the supply. That is, a motor that takes 60V at 25A WOULD take (at the supply) 50A if you were to get that power by "stepping up" from 30V. If this is DC, that would be a significant undertaking, since that is 1500W. If it is AC, then it would "only" need a transformer, but that transformer would have to be fairly large.
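As a quick sketch of that power balance (assuming an ideal, lossless step-up stage, which real converters are not):

```python
# With an ideal converter, output power equals input power, so doubling
# the output voltage doubles the current drawn from the lower-voltage supply.

def input_current(v_in, v_out, i_out):
    """Current drawn from the supply by an ideal (lossless) converter."""
    p_out = v_out * i_out   # power delivered to the motor, in watts
    return p_out / v_in     # the same power at the lower input voltage

# Motor side: 60 V at 25 A is 1500 W.
# Supply side: delivering that 1500 W at 30 V means 50 A from the supply.
print(input_current(30.0, 60.0, 25.0))  # -> 50.0
```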

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Fri. Jun 28, 2019 - 05:14 PM
Total votes: 0

What's the relation between amps and volts when it comes to stepping the volts up or down, if there is one?

There are many types of load, so there is no single answer.

For a resistor it is linear...double the voltage ===>>> double the current, quadruple the watts.

For a constant-power load, double the voltage ===>>> the current is cut in half, same watts (for example, a 100% efficient power supply driving a fixed load).

For an LED, current climbs rapidly with applied voltage....light output is proportional to current.

For motors, it is usually nonlinear, due to the motor & the dynamics of whatever it is driving (fan, pump, drill...).

   For a DC motor, speed is proportional to volts and torque is proportional to amps.
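The first two cases above can be sketched in a few lines (the resistance and power values here are made up for illustration):

```python
# How current responds to a doubled voltage for two of the load types
# above: a resistor (Ohm's law) vs. an ideal constant-power load.

def resistor_current(v, r):
    return v / r    # linear in voltage: double V -> double I

def constant_power_current(v, p):
    return p / v    # fixed watts: double V -> half I

r, p = 10.0, 100.0  # illustrative values: 10 ohms, 100 W
print(resistor_current(10, r), resistor_current(20, r))              # 1.0 2.0
print(constant_power_current(10, p), constant_power_current(20, p))  # 10.0 5.0
```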

 

All wires & connectors have resistive losses that waste power as heat, and that loss is proportional to the current squared.

That is why your power line runs at maybe 13,000 volts...that reduces the current flow until it gets right to your neighborhood transformer (which steps it down to 120VAC).

If they just sent out 120VAC, the currents would be so high that the losses would be astronomical & the city would be like a giant heater.
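A rough sketch of why that is (the line resistance and load power here are invented for illustration, not real utility figures):

```python
# Same delivered power at 13 kV vs. 120 V, through the same line
# resistance. The waste heat in the wire is I^2 * R.

def line_loss(power_delivered, line_voltage, line_resistance):
    i = power_delivered / line_voltage  # current the line must carry
    return i**2 * line_resistance       # watts wasted heating the wire

P = 100_000.0   # say, 100 kW of neighborhood load (assumption)
R = 1.0         # say, 1 ohm of wire resistance (assumption)

print(line_loss(P, 13_000, R))  # ~59 W lost
print(line_loss(P, 120, R))     # ~694 kW lost -- more than the load itself
```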

 

By the way, they DO deliberately overload utility lines in remote winter areas to prevent ice buildup. There may be a power line 25 miles from nowhere in a forest blizzard; they just heat the wires to prevent ice & pass along the cost. I have the equation for that around here somewhere.

When in the dark, remember: the future looks brighter than ever. I look forward to being able to predict the future!

Total votes: 0

SwordMan526 wrote:

Hi all,

Just a quick question, but if I have a 30V 50Ah power supply and a motor that takes 60V 25Ah, will stepping up the 30V to 60V reduce the amps the power supply gives off to 25 (implying there is no resistance)? What's the relation between amps and volts when it comes to stepping the volts up or down, if there is one?

 

Thank you.

 

If your motor is designed for 60V and 25A then that is what it will take from a 60V supply.

Assuming an AC motor, you would just use a (perfect) transformer.

 

If it is a DC motor, you use a DC-to-DC converter.

 

Nothing is ever perfect. There will be heat losses in either the AC transformer or the DC-DC converter.

Motors are difficult to "drive". They take massive startup currents, and they have bad power-factor numbers when running.

 

David.
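To put a number on "nothing is ever perfect": with a converter efficiency below 100%, the supply has to deliver more than the ideal 1500 W. The 90% efficiency figure here is an assumption for illustration, not a spec from the thread:

```python
# Supply current for a boost conversion when the converter itself
# wastes some power. Efficiency is assumed, purely for illustration.

def supply_current(v_in, v_out, i_out, efficiency):
    p_out = v_out * i_out
    p_in = p_out / efficiency  # input power must also cover the losses
    return p_in / v_in

print(supply_current(30.0, 60.0, 25.0, 1.00))  # ideal: 50.0 A
print(supply_current(30.0, 60.0, 25.0, 0.90))  # ~55.6 A from the supply
```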

Total votes: 0

Ah (amp-hours, amps times hours) is the unit of battery capacity, not of current. You want amps (A).

 

30V times 50A = 1500W. Watts is the unit of power.

 

If the motor is 60V, 25A, then its power is likewise 1500W.

 

In any form of conversion, there is loss. Depending on the technique, this can be significant.
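Following up the Ah-vs-A distinction above: amp-hours describe stored capacity, not current. A sketch of what the OP's figures would mean if the 50Ah were actually a battery capacity (assuming a perfectly efficient conversion, which the losses just mentioned rule out in practice):

```python
# Amp-hours are capacity; multiply by voltage to get stored energy in
# watt-hours, then divide by the load power to get a runtime estimate.

capacity_ah = 50.0                   # hypothetical 50 Ah battery
v_batt = 30.0
energy_wh = capacity_ah * v_batt     # 1500 Wh of stored energy

motor_power_w = 60.0 * 25.0          # the 1500 W motor from the thread
runtime_h = energy_wh / motor_power_w

print(energy_wh)   # 1500.0 (Wh)
print(runtime_h)   # 1.0 hour, ignoring all conversion losses
```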