
Re: amperage draw


"AC power consumption in WATTS divided by a factor of 43 = amps required. This factor of 43 is based on engineering studies of Philips power supplies."

This is just the power formula (I = P/V) applied with a worst-case voltage. The "engineering studies" really just say, "we've found the worst case will be around 43 volts, so use that all the time." Since the voltage an amp actually sees can be as high as 90V, there's a huge margin of error built in.
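To put a number on that margin, here's a quick Python check (the 150 W figure is a made-up example, not any real amplifier's spec):

# Quick check of the margin built into the divide-by-43 rule.
# The 150 W draw is a hypothetical example, not a vendor spec.
watts = 150.0

rule_of_thumb = watts / 43.0        # worst-case rule: assumes 43 V at the amp
for volts in (60.0, 90.0):          # common cable plant powering voltages
    actual = watts / volts          # I = P/V at the voltage actually seen
    print(f"At {volts:.0f} V: actual {actual:.2f} A vs rule {rule_of_thumb:.2f} A "
          f"({rule_of_thumb / actual:.1f}x margin)")

At 90V the rule predicts roughly double the current the amp actually draws, which is the margin being described.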

The reason an amplifier is spec'd in watts now is that its switching power supply draws only as much current as it needs. The average current draw (amperage) changes with the voltage it sees (I = P/V). That makes the amp much more efficient, but it also makes pencil-and-paper calculations less accurate and more error-prone.
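As a sketch of that behavior, here's the same I = P/V arithmetic across the powering range for a constant-power load (the 100 W draw and 85% supply efficiency are assumptions for illustration, not measured numbers):

# Current drawn by a constant-power (switching) supply at different input
# voltages. Output wattage and efficiency are assumed for illustration only.
output_watts = 100.0
efficiency = 0.85                        # assumed supply efficiency
input_watts = output_watts / efficiency  # supply draws what the load needs, plus losses

for volts in (43.0, 60.0, 75.0, 90.0):
    amps = input_watts / volts           # I = P/V: lower voltage -> higher current
    print(f"{volts:5.1f} V in -> {amps:.2f} A drawn")

Notice the current more than doubles between 90V and 43V even though the wattage never changes.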

In the old days of linear power supplies in amplifiers (we're talking 330 MHz or earlier equipment), the current draw was constant no matter the voltage. People got used to making pencil-and-paper calculations that were pretty accurate. It's not that way anymore.

Fairly close calculations can still be done with the power formula and some experienced guesstimating, but having some type of software that does the math is best.
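To show why the software earns its keep, here's a minimal sketch of the iteration it has to do when the loads are constant-power: current depends on voltage (I = P/V) and voltage depends on current (drop = I x R), so you loop until the numbers settle. Every figure here (footages, loop resistance, wattages) is a made-up example, and it treats the AC powering as simple DC-style arithmetic.

# Minimal sketch of a power-leg calculation for constant-power amplifiers.
# All values are hypothetical examples, not real plant or vendor numbers.

SUPPLY_VOLTAGE = 90.0          # volts at the power supply output
LOOP_RESISTANCE = 1.1          # ohms per 1000 ft of cable loop (assumed)
SPANS_FT = [1200, 900, 1500]   # footage between devices (hypothetical)
AMP_WATTS = [55, 55, 55]       # AC wattage of each amp (hypothetical)

def solve_leg(v_supply, spans_ft, watts, r_per_kft, iters=50):
    """Iterate until voltages and currents agree: each amp's current is
    P/V at its own voltage, and each span's drop is the current through
    it times the span's resistance."""
    volts = [v_supply] * len(watts)          # initial guess: no drop
    for _ in range(iters):
        currents = [p / v for p, v in zip(watts, volts)]
        new_volts = []
        v = v_supply
        for i, ft in enumerate(spans_ft):
            i_span = sum(currents[i:])       # everything fed beyond this span
            v -= i_span * (ft / 1000.0) * r_per_kft
            new_volts.append(v)
        volts = new_volts
    return volts, sum(p / v for p, v in zip(watts, volts))

volts, total_amps = solve_leg(SUPPLY_VOLTAGE, SPANS_FT, AMP_WATTS, LOOP_RESISTANCE)
for n, v in enumerate(volts, 1):
    print(f"Amp {n}: {v:.1f} V, {AMP_WATTS[n - 1] / v:.2f} A")
print(f"Total draw at the supply: {total_amps:.2f} A")

That feedback loop between voltage and current is exactly what makes pencil-and-paper estimates drift on longer legs.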

