From what I've read, even the 4870 X2 has been shown to draw upwards of 370 watts in synthetic benchmarks. I think your method of trying to determine max power consumption is oversimplified.
The only figures I can find that really hammer a 4870X2 (with Furmark) are on Overclock3D: 495w for a system about as close to my Core i7 rig as I've seen - 3.8GHz 920, 6GB Corsair, Gigabyte UD5, XFX 4870X2... I've got more HDDs, though. That's up from 336w idle in Windows, so it's roughly a 160w idle-to-load jump. Now, I know the 4870X2 (and the 4000 series in general) isn't all that efficient at idle, but it's not horrible...
http://www.overclock3d.net/reviews.php?/gpu_displays/xfx_ati_4000_series_round_up/6
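The idle-to-load jump is just the difference between those two whole-system readings; a quick sketch of the arithmetic (using only the Overclock3D numbers quoted above):

```python
# Back-of-envelope check of the Overclock3D whole-system numbers (watts).
idle_w = 336  # system idle in Windows
load_w = 495  # system under Furmark

delta_w = load_w - idle_w
print(f"idle-to-load jump: {delta_w} W")  # idle-to-load jump: 159 W
```

So the "160w" I quoted is just that delta rounded up - and remember it's measured at the wall, so PSU efficiency losses are baked in.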
Just found some TechPowerUp data showing per-card power figures. 4870X2: 73w idle, 236w average, 273w peak, 381w maximum - which, yes, seems a little scary. I bet that was Furmark hitting all 1600 shaders for everything it could.
Compare that to the 5870: 19w idle, 122w average, 144w peak, 212w maximum...
On paper, adding up the rated power from each possible source (the PCIe slot plus the auxiliary connectors) should give you a ceiling. However, it's obvious that in specific scenarios designed to hit the cards as hard as possible, there's no real limit on what they draw, short of blowing the PSU.
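For what "on paper" actually means here: the PCIe spec rates the slot at 75w, a 6-pin connector at 75w, and an 8-pin at 150w. The 4870X2 has one 6-pin plus one 8-pin (as I recall - check your card), so a rough sketch of the math:

```python
# Rated power per source under the PCIe spec (watts).
PCIE_SLOT = 75
SIX_PIN = 75
EIGHT_PIN = 150

# 4870X2 connector layout assumed: one 6-pin + one 8-pin.
paper_max = PCIE_SLOT + SIX_PIN + EIGHT_PIN
print(paper_max)         # 300
print(381 > paper_max)   # True - the measured 381w peak blows right past it
```

Which is exactly the point: the 381w maximum TechPowerUp measured exceeds what the connectors are rated for, so the ratings clearly aren't a hard limit.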
Does make me wonder what the maximum power draw of GT300 will be, though... :shock:
And as far as the talk of the GT300 being "all still FUD or fanboy supposition", not necessarily.
Actually, I meant about both the 5870X2 and GT300. Pics and figures are all very well, but I'll only be happy when we've got solid figures from multiple sources, rather than fifteen places all referencing one Fudzilla, Inq or BSN article... because all three of those places are good at 'imagining' stories for page hits. ;)
Check this out:
http://vr-zone.com/articles/-rumour-nvidia-gt300-architecture-details-revealed/7763.html?doc=7763
This would appear to verify three key things, two of which falsify earlier rumors: a 40nm process, contrary to ATI's so-called report that was likely intentionally mistranslated (I think Nvidia was being kind with those words); a non-GT200 architecture; and 3 friggin' billion transistors!
I say again, 3...BILLION... TRANSISTORS!
So I ask you at this point: does the GT300 using 300 watts really sound inefficient, or does it sound like a stand-alone, no-SLI-required freak of nature?
Yeah, looks sweet. You did notice the [Rumour] tag, though, yeah? ;)
The only solid performance info we've got to go on for any of these cards is the fact that the 5870 is actually released. All the other cards are guesswork at best or outright lies at worst...