Hi! I've just learned that UPS specifications usually use "VA" instead of Watts. I knew Watts = Volts x Amps, but they seem to use a conversion factor: "Watts = 60% of VA", where VA is the maximum apparent power the UPS can deliver for a while, and Watts is the usual real power transmitted. Every specification I've checked follows the rule "Watts = 0.6 x VA", except the one I just bought: a TRUST 1000VA rated at 480W ( amazon.co.uk/Trust-15600-1000VA-UPS-PW-4100T/dp/B0012RS976/ref=sr_1_1?ie=UTF8&s=electronics&qid=1263238155&sr=1-1 ) My PSU is 530 W (for a Core i7, an NVIDIA GTS250 and 2 drives)... can I use that UPS for my PC? Or do I need a UPS rated for 530 real Watts? The price goes up a lot for those UPSes. Thanks.
You don't need to size a UPS based on the PSU rating. 530 W is just the maximum the PSU is capable of outputting; it doesn't mean your system is drawing that much power all the time. I have a 480 W PSU, but at idle my system only consumes about 120 W, and the UPS lasts about 45 minutes before the battery dies. More than sufficient.
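For a rough sense of where a figure like "45 minutes at 120 W" comes from, here's a minimal sketch. The battery capacity and inverter efficiency below are hypothetical assumptions, not specs from any particular UPS, and real runtime at high loads is shorter than this simple formula suggests:

```python
def runtime_minutes(battery_wh, load_watts, inverter_efficiency=0.85):
    """Rough UPS runtime estimate in minutes.

    Real runtime is shorter at heavy loads, because lead-acid battery
    capacity drops as the discharge rate rises (Peukert effect).
    """
    return battery_wh * inverter_efficiency / load_watts * 60

# Hypothetical 12 V, 9 Ah battery (~108 Wh) feeding a 120 W idle load:
print(round(runtime_minutes(108, 120)))  # ~46 minutes
```

Treat the result as an upper bound; manufacturers publish measured runtime curves precisely because the linear estimate is optimistic.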
Ah, OK. But while mains power is still present and my PC needs full power, will the UPS supply enough? In other words, when there's no blackout, does the UPS pass through whatever power the PC needs, regardless of its own limits?
Why would you need full power when there's a power cut? If you're worried about data loss: a typical desktop like yours shouldn't even exceed 400 W. AnandTech: AMD's Radeon HD 5770 & 5750: DirectX 11 for the Mainstream Crowd. Take a look at the power usage under load for a system spec similar to yours (Core i7, GTS250): the load usage hovers at only about 300 W, and that assumes you're gaming during the power cut. Adding another 50 W for your LCD monitor, I think 350 W is a safe value for your system. I'm personally using a 1000VA unit; I used to power two PCs with it. Now I use it for my own PC alone, and it handles 300 W with ease.
The UPS will have no problem supplying enough power. The issue is how long it will keep your PC (and monitor) powered.
That's a really odd question. When there's no power cut, mains power essentially bypasses the UPS (while charging the battery if it isn't at full charge).
Erm, wrong on most counts. VA is the apparent power that the source sees the load as drawing. It's called "apparent" because not all loads (the PC and other devices connected to the UPS) draw power in a purely linear (resistive) fashion. Inductive, capacitive, and distorting loads by their nature draw a higher apparent power even though their real power, measured in watts, is lower. This is because such loads shift the voltage and current relative to each other without actually consuming real power, so the source must supply extra apparent power to compensate.

The ratio of real power to apparent power is called the power factor. So if you have a 1000 Watt PSU with a power factor of unity, meaning 1, you can mate it with a 1000 VA UPS without issues. The reason for 0.6 is that it's the lowest power factor mandated by the EU standard for computer PSUs (requiring passive PFC): a PSU drawing 600 Watts of real power at a power factor of 0.6 has an apparent power of 1000 VA (600 / 0.6).

BUT, this applies to only ONE power-consuming device. If you have several devices with different power factors connected to the UPS, the calculation is considerably more complex. If you want the method for calculating PF and combined PF, just post here and I'll give it, but in the meantime 0.6 is a good, safe figure to work with. The End.
Oh! Thanks Empire. It's OK now. I thought the PSU power rating meant "the PSU draws that much power whether there's a blackout or not". :)