Widescreen Gaming Forum

Web community dedicated to ensuring PC games run properly on your tablet, netbook, personal computer, HDTV and multi-monitor gaming rig.
 Post subject: GT300 News
PostPosted: 29 Sep 2009, 21:23 

Joined: 02 Jan 2006, 18:49
Posts: 913
Still, guys, if what's being said here is true and the GT300 competes with the 5870 X2 (and with 3 billion transistors, why wouldn't it?), 300 watts is very reasonable power consumption, given that it's much less than what two 5870s draw.

My point on the oversimplifying was that though power/wattage calculations are in essence cut and dried, the testing methods used to determine them are not. That is why you have all these ways of measuring audio amps. Most reputable brands use A-weighted, but some do not.

The point is we don't really know whether that 300-watt figure was derived from an unrealistic max-wattage scenario ("PSU blowing", as you put it) or from what the card typically draws under normal circumstances.

I also find it hard to believe that the die shot, even though blurred, is Photoshopped or something, at least as far as the legible specs on it go.

Could you guys please respond to my last paragraph, though? I edited it in after you replied:

"My only concern at this point is what speed a 920 will need to be clocked at to avoid a bottleneck. Preferably I'd like to avoid overvolting the CPU, but at this point I'm wondering if it will be necessary just to run this monstrous GPU without the system being unbalanced."

So what do you think: if what's been said about the GT300 is true, could I do a moderate, non-overvolted OC on a 920 without having a bottleneck? I'm assuming I could hit at least 3.6GHz without overvolting.

Some say 3.4GHz was fine for any dual-GPU setup prior to the debut of the 5870, but I'm not sure even 3.6GHz would suffice for a single GT300 with specs like these.
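
For reference, here's a quick back-of-the-envelope sketch (in Python) of what those targets actually ask of the chip, assuming the i7 920's stock 20x multiplier and 133MHz base clock:

Code:
# i7 920 overclock math, assuming the stock 20x multiplier and 133 MHz
# base clock (BCLK); overclocking a 920 means raising the BCLK.
MULTIPLIER = 20
STOCK_BCLK_MHZ = 133
stock_ghz = MULTIPLIER * STOCK_BCLK_MHZ / 1000  # ~2.66 GHz

for target_ghz in (3.4, 3.6, 3.8):
    bclk = target_ghz * 1000 / MULTIPLIER
    boost = (target_ghz / stock_ghz - 1) * 100
    print(f"{target_ghz} GHz needs ~{bclk:.0f} MHz BCLK (+{boost:.0f}% over stock)")

Whether any given target is reachable without a voltage bump varies chip by chip, as the replies below note.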




 Post subject: GT300 News
PostPosted: 29 Sep 2009, 21:34 
Insiders

Joined: 15 Apr 2005, 23:27
Posts: 1172
3.6 is well within reach, I think, yep :D

And the other thing is that at very high res and/or AA it's more GPU-limited anyway.

_________________
P8Z68-V Pro | 2600K | HR02 | HD5850 | 2x4GB Vengeance LP | 128GB M4 + 6TB | X-Fi > HD595 | AX850 | Tai Chi | PB278Q | G110 + Deathadder 2013
P8Z77-V | 3570K | Mugen 2 | HD5850 | 2x4GB Vengeance LP | 500GB | X-750 | Fractal R3 | U2212HM | G110 + G400
P8H77-I | G860 | 4650 | 2x2GB XMS | 320GB | CX500 | Prodigy | T22B350EW | MX518
DC3217IYE | 1x4GB Vengeance | 64GB M4 | TX-42VT20E


 Post subject: GT300 News
PostPosted: 29 Sep 2009, 21:38 

Joined: 02 Jan 2006, 18:49
Posts: 913
3.6 is well within reach, I think, yep :D

And the other thing is that at very high res and/or AA it's more GPU-limited anyway.
What I keep thinking too, Gill, is that with DX11 allowing way more processing on the GPU (multithreading, physics, etc.), there will be less need to OC the CPU to balance the system load.

I seem to recall saying on one forum that the 5870 was well timed considering even more load will be handled by the GPU with DX11, but Nvidia appear to be addressing that even more so.


 
 Post subject: GT300 News
PostPosted: 29 Sep 2009, 21:58 
Editors

Joined: 14 Oct 2003, 13:52
Posts: 5706
In multi-GPU scenarios or low detail settings, CPU frequency is important. Not so much in single-GPU or extreme detail settings. And sorry, I can't see many people buying a GT300 and then playing at 1024x768 on low. :lol: ;)

You're right, Frag, in that if that 300W draw is the 'max' then it's OK. If it's something like the 'average' draw, then we could look forward to it trying to pull 400W+ at its 'maximum'. Which, frankly, would be bloody dangerous through a PCI-E slot plus a 6- and an 8-pin power lead. Twin 8-pin might just cope. Might. Each of those 12V pin pairs is supposed to carry 25W, but I seem to remember somewhere doing a power draw test on a 1000W PSU and finding that each of those PCI-E pairs could provide about 40W... that's 320W right there with twin 8-pins loaded above spec. I'll try to find it. It was a 'take a PSU to a proper test facility' test rather than a 'shove it in a PC and see if it runs' one... might have been jonnyGURU, which would make sense. Can't find it there, though. :( Might have been on a forum. I've spent way too much time surfing hardware forums in my free time in recent weeks. :oops:
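
To put rough numbers on that, here's a minimal Python sketch of the budgets involved. The 75W/75W/150W figures are the official PCI-E limits for the slot, 6-pin and 8-pin connectors; the 40W-per-pair figure and the eight-pairs-for-twin-8-pins counting both come from the post above, not from the spec:

Code:
# Rough PCI-E power budget math. The spec limits (75 W slot, 75 W per
# 6-pin lead, 150 W per 8-pin lead) are official; the 40 W-per-pair
# figure is the above-spec measurement described in the post.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

configs = {
    "slot + 6-pin + 8-pin": SLOT_W + SIX_PIN_W + EIGHT_PIN_W,  # 300 W
    "slot + twin 8-pin":    SLOT_W + 2 * EIGHT_PIN_W,          # 375 W
}
for name, watts in configs.items():
    print(f"{name}: {watts} W within spec")

# The post's above-spec scenario: twin 8-pins treated as eight 12V
# pairs at ~40 W each gives 320 W from the leads alone.
print(f"above-spec twin 8-pin estimate: {8 * 40} W")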

...

While this is a fairly educated guess on my part, for what it's worth I hit 3.8GHz without a voltage bump on my i7 (I've since tweaked that 1.275V back down to 1.25V and it's still 100% stable), but as each chip is different, I'd say 3.6 is the safer target.

As for what would eliminate a CPU bottleneck? Hm. I'm going to presume you'll be running 1920x1200 at either 4x or 8xAA with 16xAF, as there seems little other point in getting a GT300/5870 otherwise. ;) From that, I'd say you're unlikely to encounter a CPU bottleneck with a single card. If you went dual-card (for whatever reason), you might begin to encounter bottlenecking there.

However, it's also totally dependent on the games you play; as many people have pointed out, Half-Life 2 continues to show framerate improvements even as you scale a Core i7 into the 4.6GHz region. But the Source engine is probably the most CPU-sensitive engine in common use now.

I dunno how DirectX 11 will divide up the work for the GPU. If it puts each task onto the processor that can deliver it most efficiently, then GPU physics would probably be a good idea. As for overclocking the CPU to balance system load... who knows. The CPU will still need to assemble it all, particularly if everything is threaded off neatly. DX11 might finally deliver games that actually use these multicore CPUs properly! :)


 Post subject: GT300 News
PostPosted: 30 Sep 2009, 00:38 

Joined: 29 May 2006, 02:23
Posts: 873
My take on this is... well, we've seen this before. ATI has just launched its 9800 Pro; the 4870 was the 9600 Pro.

Now NVIDIA is claiming that their 5950 Ultra will kill it.

Any one of us knows that when NVIDIA is behind, they're not on their A-game. The 8000 series was fantastic... why? Because the 7000 series was incredible. Just something I've noticed... they don't like losing, and they tend to cut corners to win.


 
 Post subject: GT300 News
PostPosted: 30 Sep 2009, 10:29 
Editors

Joined: 14 Oct 2003, 13:52
Posts: 5706
Hm. Interestingly, I've not seen a great deal of bragging from nVidia about this launch.

What I have seen is them rubbishing the competition (standard practice now, unfortunately) and downplaying how much use DirectX 11 will be... (I suspect once they have DX11 hardware out their tune will suddenly change...) I've not seen leaked benchmarks or NDAs being 'broken'.

Which says one of two things to me: either GT300 needs more time to mature on the process they're using, and they're scrabbling to get clock speeds high enough to compete (like ATi did with the X1800 cards and the soft-grounding issue that delayed them by a good eight months...), or it is very good, but they've learned that mouthing off about "opening cans of whoop-ass" doesn't work when the competitors' products are significantly better than anticipated. Because you're not telling me that nVidia thought the 4000 series was going to be as good as it was, or they wouldn't have done that quick about-face on GTX260/280 pricing. A $200 price drop is a serious market miscalculation.


 
 Post subject: GT300 News
PostPosted: 30 Sep 2009, 11:04 
Insiders

Joined: 15 Apr 2005, 23:27
Posts: 1172
http://www.tomshardware.co.uk/radeon-hd-5850,review-31692-14.html

Interesting benches, the 2nd one in particular.

Shows that even at 4xAA and 8xAA at 1920x1200, the CPU does play a role if it's got a hefty OC.

But only in some games. And this may change in the near future with the advent of DX11 games.



 
 Post subject: GT300 News
PostPosted: 30 Sep 2009, 11:45 
Editors

Joined: 14 Oct 2003, 13:52
Posts: 5706
Three issues I have with that data:

1) It's Tom's Hardware.
2) Left4Dead is Source engine based and therefore known to favour increasing CPU speeds.
3) Other differences are within statistical margins of error.

I'd be saying the same no matter whose CPU/GPU was on the bench: show me differences in the 5%+ region and I'll say they're significant. But show me a 1fps difference and I'm just not gonna call it significant when the framerates are above 30, sorry. That said, there is a 6% difference with RE5, too, which is just entering significant territory. But Capcom's last game (DMC4) gobbled CPU cycles for increased framerate as well, so again it's not totally unexpected.

But most of those games show a <3% difference, which for a 36.5% clock boost is... a poor indication that they're CPU-bound, sorry.
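
That sanity check is easy to put in code. A minimal sketch in Python, with made-up framerates for illustration; only the 36.5% clock figure comes from the review (it matches roughly a 2.93GHz-to-4.0GHz overclock):

Code:
# Compare the FPS gain from an overclock against the size of the clock
# boost itself. All framerates below are hypothetical, for illustration.
CLOCK_BOOST = 0.365  # 36.5% clock increase, as in the review

results = {  # game: (stock FPS, overclocked FPS) - made-up numbers
    "Left4Dead (Source)": (90, 112),
    "Resident Evil 5": (60, 64),
    "GPU-bound title": (45, 46),
}
for game, (stock, oc) in results.items():
    gain = oc / stock - 1
    verdict = "significant" if gain >= 0.05 else "within noise"
    print(f"{game}: +{gain:.1%} FPS for a +{CLOCK_BOOST:.1%} clock -> {verdict}")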


 
 Post subject: GT300 News
PostPosted: 30 Sep 2009, 15:42 

Joined: 02 Jan 2006, 18:49
Posts: 913
Not to mention that where differences are shown, the frame rates are already way higher than you'd notice in-game anyway. It's not really worth it, IMO, to ramp the CPU higher than you need just for bragging rights or some misconceived bottleneck worry. It's gonna run smoother, cooler and longer at a comfy 3.6GHz.


 
 Post subject: GT300 News
PostPosted: 30 Sep 2009, 16:00 

Joined: 28 Jun 2009, 22:17
Posts: 760
Shows that even at 4xAA and 8xAA at 1920x1200, the CPU does play a role if it's got a hefty OC.

Agreed with Paradigm Shifter & Frag Maniac.
TomsHardware isn't what it used to be...

The CPU can play a role in some conditions while gaming, but this test doesn't show the right conditions (games that require a lot of CPU power: X3 in big fights and/or with AI mods, GTA4 :roll:, Supreme Commander, a good percentage of strategy titles like Civ4 when pushed to their limits, and so on...).


 