So I discovered that my dad has one of those clamp meters that can measure AC current draw. Multiply that by the AC voltage (~240V here) and voila, you get the number of watts drawn by your power supply (strictly speaking this gives volt-amps, but with a power-factor-corrected PSU the two are close).

First off, my system specs:

AMD Opteron 146 @ 290x10, 1.4V x 104%
DFI nF4 Ultra-D, LDT 1.3V and chipset 1.6V
2x512MB OCZ Value VX @ 207MHz, 2.9V
HIS Radeon X1800GTO @ 16 pipes, 612/612, 1.2/2.004/2.004V
Chaintech AV-710
200GB Seagate 7200.9 SATAII
BenQ DW1640
Toshiba DVD-ROM
Various fans

The PSU is an Enermax 535W Noisetaker II, which runs at close to 80% efficiency as tested by SPCR.

Current draw when idle: 0.46-0.50A
D2OL (CPU 100% load): ~0.63A
Gaming: 0.82-0.83A

So the PSU draws around 200W from the wall when gaming, give or take. At 80% efficiency, that works out to about 160W actually going to the components.

I also wanted to test something else: whether CPU clock speed affects the amount of power it draws when the Vcore remains unchanged. So I downclocked my CPU to 207x10MHz (to maintain the same RAM speed) at the same voltage and got the following reading when running D2OL: 0.54-0.57A. Quite a big difference; I didn't really expect that.

Hope you guys found this interesting
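For reference, the conversion behind those numbers can be sketched in a few lines of Python (the 240V mains figure and 80% efficiency come from the post; the current values are mid-points of the measured ranges):

```python
MAINS_VOLTAGE = 240.0   # AC mains voltage (V), as stated in the post
PSU_EFFICIENCY = 0.80   # Noisetaker II efficiency per the SPCR test

def wall_power(amps):
    """Apparent power drawn at the wall (VA, ~watts for a PFC supply)."""
    return amps * MAINS_VOLTAGE

def dc_power(amps):
    """Estimated power actually delivered to the components."""
    return wall_power(amps) * PSU_EFFICIENCY

for label, amps in [("idle", 0.48), ("D2OL", 0.63), ("gaming", 0.825)]:
    print(f"{label}: {wall_power(amps):.0f} W at the wall, "
          f"~{dc_power(amps):.0f} W to components")
# gaming: 198 W at the wall, ~158 W to components
```

Incidentally, the drop from ~0.63A to ~0.55A on the downclock is roughly consistent with dynamic CMOS power scaling linearly with clock frequency at a fixed voltage, given that only part of the total system draw comes from the CPU core.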
That's quite interesting, particularly as my PC is a similar config: 146 Opty, DFI Expert, 7800GT, 320GB HDD, 120GB HDD, Antec 550W Neo PSU
200W only when gaming?! That's so low, considering that you have a 535W PSU. I guess we really don't need a big PSU, huh?
Well, personally I got a good deal on this PSU 2 years ago; I knew it was more power than I would use, but it was worth it. You still have to put a bit of thought into the PSU selection process, though; a PSU rated for 500W that cannot provide enough amps on the 12V rail(s) is not very suitable for modern components. Still, you make an interesting point
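To make the 12V rail point concrete, here's a back-of-the-envelope check (the "500W" label and 18A combined rating here are made-up numbers for illustration, not any particular unit):

```python
# Hypothetical PSU: labelled "500 W" overall, but with a combined
# 12 V rail rating of only 18 A. Modern CPUs and GPUs draw almost
# all of their power from the 12 V rail(s), so that rating is the
# real ceiling regardless of the sticker on the box.
RAIL_VOLTS = 12.0
RAIL_AMPS = 18.0   # made-up combined 12 V rating

usable_12v_watts = RAIL_VOLTS * RAIL_AMPS
print(f"{usable_12v_watts:.0f} W available on the 12 V rail(s)")
# 216 W available on the 12 V rail(s)
```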
Then what PSU is recommended for a system running an E6400, X1950 Pro, 1GB DDR2 and all the other usual stuff...?
I don't find 200W average power consumption while gaming to be anything but expected. It would be a lot nicer to see a graphical representation of the power consumption over time, with resolution down to at least 10ms... or at least get the peak value as well. I suspect the power consumption peaks at power-on (to spin up the HDDs) and then occasionally during gaming when both the CPU and GPU are at their peak load. /Olle
Was that directed at me, specifically? Anyway, the reading when powering up was around 0.6x... I would imagine hard drive power consumption is not that much compared to modern CPUs and GPUs. Perhaps it would be a different situation if we were talking about a server with many hard drives, but that's not the case here...
Yeah, it would be nice to have a little 'booger' like that to use on my system to measure current. It might scare me to know how much real *juice* it's actually using
Sort of, but it was also meant as a general comment. It's always good to know the peak consumption if/when tight fusing on the supply line comes into play, like at that LAN party where every eight users had to share 10A (~275W each). I suppose you're right, though. I made that remark based on my experience with my first computer, which had an IBM 486SL2/66 CPU and some Trident graphics card. When I added a second hard drive, the 50W PSU wasn't enough to spin up both drives at once. /Olle
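As a sanity check on the ~275W figure above (this assumes roughly 220V mains, which the post doesn't state; at 230V it would come out closer to 290W):

```python
FUSE_AMPS = 10.0       # shared fuse rating from the LAN party anecdote
USERS_PER_FUSE = 8     # users sharing each fuse
MAINS_VOLTS = 220.0    # assumed mains voltage; not given in the post

watts_per_user = FUSE_AMPS * MAINS_VOLTS / USERS_PER_FUSE
print(f"~{watts_per_user:.0f} W per user")
# ~275 W per user
```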
Well, in regards to the first point, I'm working with an A$65 (~$50 USD, for reference) piece of equipment that has ±2% accuracy, so this was never meant to be a comprehensive "scientific" test; it's just to satisfy a bit of curiosity
TBH, that's really good enough. There's no real need for you to know the actual power consumption to the nearest 0.001W.