# PC requiring too much power?



## Rainmaker22 (Dec 21, 2020)

We're configuring a new server with dual power supplies (Dell T340)... in our home. It has 2 power supplies of 450W each. Documentation shows each power supply is 100-240V, 6.5 Amps - 3 Amps. Plugged the server into a home receptacle last week and ran it for hours with no problem. Powered it down for a few days, then today it wouldn't power up using that receptacle. There are no circuit breakers that have tripped.

Plugged it into another receptacle (that may be on a different circuit breaker, not sure), and it powered up just fine, but other devices (i.e. VOIP phone, another computer) started going through their own power cycles.

Then, plugged it back into the original receptacle and it powered up just fine.

We're drawing too much power, yes? But my limited electrical knowledge requires me to ask these newbie questions: 1) Wouldn't it seem logical that one of the circuit breakers would have tripped? 2) How can I test another receptacle - specifically, in another location... a small office building - to confirm that there's enough "juice" there to power this server? 3) Can my Craftsman 82139 multimeter test the suitability of other receptacles? 4) Are we facing risk of fire by running this on a, how-you-say, "normal" office receptacle?

Thanks for your input...


----------



## jmon (Nov 5, 2012)

Rainmaker22 said:


> We're configuring a new server with dual power supplies (Dell T340)... in our home. It has 2 power supplies of 450W each. Documentation shows each power supply is 100-240V, 6.5 Amps - 3 Amps.


Maybe it got overheated, a GFCI tripped somewhere, a loose neutral/wire somewhere, a wirenut, etc. idk. However, imo, that's really pushing the typical 15 amp breaker that most outlets are on. The breaker was probably ready to pop.

If you are going to be running server(s) with dual power supplies 24/7, and other things on that circuit as well, etc., if possible or doable for you, consider upgrading to a 20 amp breaker for that outlet circuit that you are running the server on just to be on the safe side. Just a suggestion.

Check first and make sure you have 12 gauge wire on that circuit. 12 gauge wire is required for 20 amps. Most newer houses, and houses that have been rewired in the past 10 years, are usually wired with 12 gauge wire, so all you would need to do is upgrade the breaker to 20 amp at the panel and change the outlet to the new 20 amp outlet type. Plenty of videos out there show how to do this safely if you've never done it before. Just a suggestion.


----------



## joed (Mar 13, 2005)

Sounds like a loose connection somewhere.
If it was drawing too much power the breaker would have tripped.


----------



## AllanJ (Nov 24, 2007)

Breakers don't trip at exactly their rated amperes, although they should not trip at less than that. Still, it is not a good idea to draw close to rated amperes for long* periods of time.

You could have plugged the computer into circuits with more or less breaker tolerance.

Unless the distance to the panel was very great, a slight overload in amperes will not result in a significant voltage drop.

( * The standard is that heavy circuit load should be no more than 80% of breaker rating for periods of usage more than 3 hours.)
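The 80% continuous-load rule above is simple arithmetic; here is a quick sketch of it in Python (the 15 A and 20 A breaker sizes are just the common residential examples from this thread):

```python
# Illustrative check of the 80% continuous-load rule: loads running
# for more than 3 hours should stay at or under 80% of the breaker rating.
def max_continuous_amps(breaker_amps: float) -> float:
    """Maximum recommended continuous draw for a given breaker rating."""
    return breaker_amps * 0.8

for breaker in (15, 20):
    print(f"{breaker} A breaker -> {max_continuous_amps(breaker):.1f} A continuous")
# A 15 A breaker allows 12 A continuous; a 20 A breaker allows 16 A.
```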


----------



## Rainmaker22 (Dec 21, 2020)

Thank you all for those EXCELLENT replies. We'll check for a 20 amp outlet and breaker, and 12 gauge wire. Last question: regarding the amperage, can someone decipher this spec for us? "Documentation shows each power supply is 100-240V, 6.5 Amps - 3 Amps." Basically, in the US, how many amps are we using? 6.5, 3, or something in between? Thanks!


----------



## jmon (Nov 5, 2012)

Rainmaker22 said:


> Thank you all for those EXCELLENT replies. We'll check for a 20 amp outlet and breaker, and 12 gauge wire. Last question: regarding the amperage, can someone decipher this spec for us? "Documentation shows each power supply is 100-240V, 6.5 Amps - 3 Amps." Basically, in the US, how many amps are we using? 6.5, 3, or something in between? Thanks!


You are welcome.

Imo, it means each power supply has an auto-ranging voltage input from 100-240V, so it will work with any voltage in that range. In the US, people usually use 120V. In European countries they use 240V.

Same with the amps. It's a range from 3-6.5 amps, *depending on the voltage you are using.* You are running two of them in the US at 120V, so I'm guessing around 7 or 8 amps.

Wattage is 450W each × 2 = 900 watts.

For now, here is what I would do: everything is working fine, so just keep an eye on it. If it continues to happen, do what joed recommended and check for a loose wire/connection somewhere. If that turns up nothing, consider upgrading to a higher-amperage breaker/outlet.


----------



## Joeywhat (Apr 18, 2020)

Rainmaker22 said:


> Thank you all for those EXCELLENT replies. We'll check for a 20 amp outlet and breaker, and 12 gauge wire. Last question: regarding the amperage, can someone decipher this spec for us? "Documentation shows each power supply is 100-240V, 6.5 Amps - 3 Amps." Basically, in the US, how many amps are we using? 6.5, 3, or something in between? Thanks!


It's simple. When you supply it with 120V, it draws 6.5 amps; when you supply it with 240V, it draws 3 amps. Double the volts, half the amps. The power supply draws the same amount of power (watts) regardless of the input voltage.
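The constant-power relation described above (I = P / V) is easy to check with a couple of lines of Python. The 450 W figure is the nameplate rating of one supply, used here purely as an illustrative power level, not the server's actual draw:

```python
# At constant power, current scales inversely with supply voltage: I = P / V.
def input_current(power_watts: float, volts: float) -> float:
    """Line current drawn at a given supply voltage for a fixed power."""
    return power_watts / volts

# Hypothetical 450 W load at US vs. European mains voltage:
print(round(input_current(450, 120), 2))  # 3.75 amps at 120 V
print(round(input_current(450, 240), 2))  # 1.88 amps at 240 V
```

Double the volts, half the amps, exactly as stated.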


----------



## Rainmaker22 (Dec 21, 2020)

jmon and joeywhat, thanks for the updated info. I've learned something. Volts x Amps = Watts. So 450 Watts/120 volts = 3.75 amps. So two of these power supplies would mean 7.5 amps. And a 15 amp circuit breaker SHOULD be ok, give or take. It always amazes me that people take time out of their busy days to answer questions such as these. Thanks again.


----------



## Joeywhat (Apr 18, 2020)

Keep in mind that the power rating of a power supply is typically the OUTPUT rating. The input required will be higher than that, due to losses in the process. An 80% efficiency rating means your output power is 20% less than your input. The unit should have both listed, so ensure you're reading everything correctly.

If it lists the input amps as 6, then assume it draws 6 amps. Do not base that off the output power as it won't be accurate.
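To make the efficiency point above concrete: input power is output power divided by efficiency, so estimating line current from the output rating alone undershoots. The 80% efficiency figure below is an assumed example, not a spec from the T340's documentation:

```python
# Input power exceeds the output rating because of conversion losses:
# input_watts = output_watts / efficiency.
def input_amps(output_watts: float, volts: float, efficiency: float) -> float:
    """Estimated line current given PSU output power and efficiency."""
    input_watts = output_watts / efficiency
    return input_watts / volts

# A 450 W output at 120 V with an assumed 80% efficiency:
print(round(input_amps(450, 120, 0.80), 2))  # 4.69 amps, vs 3.75 from output alone
```

This is why the nameplate input amps, not the output wattage, are the numbers to size a circuit against.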


----------



## HotRodx10 (Aug 24, 2017)

Just a guess, but I would be checking the power supplies independently to see if one of them is defective. Check to see if one is getting much hotter than the other; that would be a good indication of a defective power supply. The draw for a server should not be anywhere close to 900 watts. My understanding is that the only component type that should be drawing any significant wattage is some graphics cards.


----------



## Rainmaker22 (Dec 21, 2020)

HotRodx10, thanks for your reply. We ordered dual power supplies for the redundancy. 2 x 450 watts is what we got. I agree - 900 watts is not needed for this server.


----------



## adamz (May 13, 2018)

Perhaps re-seat the power supply bricks; they might have loosened in shipping. I've seen dual power supplies behave that way, where the unit doesn't power up because it thinks one brick is bad.

If it happens again, like Rod said above, remove one of the power bricks to see if that fixes it. Then contact Dell, I suppose, if it does.


----------



## user_12345a (Nov 23, 2014)

The power supply rating does not indicate what it typically outputs, just the maximum allowed.

Most computers draw far less power than the maximum from the power supply, and hence the input is lower.

900w is an insane amount for a server.
I don't know why it would need two power supplies other than redundancy - you can buy a single psu with a 1000w rating.


----------



## HotRodx10 (Aug 24, 2017)

If it's drawing enough power to cause issues on a 15 or 20 amp circuit, something is seriously wrong with one of the power supplies. It may also be creating out-of-phase power feedback, causing the issues with the other devices.


----------

