
GT300 series, real specs revealed.

Tags:
  • Graphics Cards
  • Graphics
April 27, 2009 6:36:40 AM

http://www.brightsideofnews.com/news/2009/4/22/nvidias-...!.aspx

Wow, seems really powerful; I hope it doesn't cost a fortune though.


April 27, 2009 9:48:15 AM

Looks very promising. At least then we will have something that can crush Crysis.
By the way, have you seen the RV870's specs? It looks like there will be a serious battle in the GPU market. I can't wait :bounce: for the bloodshed/bleeding-edge graphics and the competitive price cuts! I'm going to get myself one of these for very cheap :D
April 27, 2009 10:44:34 AM

I'm extremely excited for these cards; I just wish it didn't seem like nVidia was going for the huge chip again instead of going the way of ATI with efficiency rather than brute computational power. They might still, though. Either way, I'm excited.
April 27, 2009 11:22:50 AM

Two slightly smaller chips seem to scale poorly in most apps; that's the biggest issue. They'd need some kind of third "master" chip to properly and efficiently use all that power.

Wonder when Larrabee will show up, too...
April 27, 2009 11:32:53 AM

AMD have always made price/performance their strategy for winning over the GPU market, as opposed to Nvidia's brute-strength strategy.
The info states that the single-chip GT300 card will yield 3+ TFLOPS, as opposed to AMD/ATI's 2.16-TFLOPS 5870. The GT300 will definitely be riding on a 512-bit memory bus, and the 5870 on a 256-bit one.
Clearly the info points to the GT300 being quite pricey due to production cost.
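
For reference, these headline TFLOPS figures are just shader count x shader clock x ops per clock. A back-of-envelope sketch in Python, using the rumored (unconfirmed) numbers from the thread; note Nvidia counted a MAD+MUL pair as 3 ops per shader per clock in this era:

    # Peak theoretical throughput from rumored specs (illustrative only).
    def peak_tflops(shaders, clock_mhz, ops_per_clock):
        """Peak FLOPS = shaders x clock x ops issued per shader per clock."""
        return shaders * clock_mhz * 1e6 * ops_per_clock / 1e12

    # GT300 rumor: 512 shaders; with Nvidia's 3-ops/clock (MAD+MUL) counting,
    # a ~2 GHz shader clock lands just past 3 TFLOPS.
    print(peak_tflops(512, 2000, 3))   # ~3.07

    # RV870/5870 rumor: 1200 ALUs at 900 MHz, 2 ops/clock (MAD).
    print(peak_tflops(1200, 900, 2))   # 2.16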
April 27, 2009 11:38:27 AM

It seems that the new nVidia flagship will again be HUGE. I don't know how scalable this chip will be; if they want to be in the mainstream segment, it will be too expensive to make such huge chips and sell them cheap. On the other hand, ATI have a great advantage in this area. I even expect the 5870 to be smaller than the 4870 and still be around 50% faster.

April 27, 2009 12:35:32 PM

The new 8800Ultra.

BTW, the 8800Ultra is probably the one card that actually was worth paying >$500 for; it's still competitive, even today.
April 27, 2009 4:38:37 PM

Now I'm hoping these next-gen cards from Nvidia will sort of allow me to do some SQL programming via some native Visual C++ plugin different from CUDA, just in time for my thesis. Imagine how fast a single search query at 3 TFLOPS would be; I'm thinking I'll be getting an A+ for that.
April 27, 2009 4:52:01 PM

Unbelievable how GPUs are bucking the trend of micro-sizing and shrinking. Motherboards, CPUs, and RAM are all getting packed into smaller and smaller PCBs, yet GPU cards keep growing. Why not make an LGA-style GPU chip with its own HSF and use that method?
April 27, 2009 4:53:03 PM

What I wonder is: what are Nvidia going to do in the mainstream?

Even at 40nm, the G200b is eventually going to make for a pretty expensive mainstream card. The reason Nvidia are losing the mainstream market now is that their parts cost too much to manufacture.

Anyway, at the enthusiast end the 5870 is going to pretty much annihilate everything currently available, until Nvidia release the GT300, then ATI double up with the 5870 X2 and... wait a minute, doesn't this sound awfully familiar?
April 27, 2009 5:01:41 PM

Seen both specs and I'm also really excited. Looking to see how the new MIMD fares against SIMD, and where ATI will actually sit with their final ROP and TMU counts.
April 27, 2009 5:09:11 PM

Excited to see the new architectures, for sure. Hope they both end up relatively on par so we see some nice prices. Seems the relation between the 300 and 5000 series will be similar to the 4000 and 200 series; I sure hope so, as I love price wars :)

Either way, the 5870 X2 and the 380 GTX sure will spew out a ton of FLOPS.
April 27, 2009 5:18:36 PM

Quote:
Theo Valich


Stopped reading there
April 27, 2009 7:08:42 PM

GTX 395, you are mine!!!
April 27, 2009 7:09:24 PM

Yea, don't take these specs as absolutes; they're speculated specs, if you will.
As far as slightly smaller vs. behemoth chips goes, Xbit has an excellent review showing the 4890 in CF holding its own vs. the 285 in SLI.
Let's face it: nVidia has turned towards the GPGPU solution, and that's a major part of their strategy, as well as of their die size. They'll be going DP in a major way for this, and that costs real estate on the GPU. They're trying to drive the market in this direction, trying to preempt LRB, and want to compete and lead in that market. What I wonder is, will they continue heading in this direction, having both GPGPU and gaming perf together, or make two units, each aimed specifically at its own market?
April 27, 2009 7:53:20 PM

JAYDEEJOHN said:
Yea, don't take these specs as absolutes; they're speculated specs, if you will.
As far as slightly smaller vs. behemoth chips goes, Xbit has an excellent review showing the 4890 in CF holding its own vs. the 285 in SLI.
Let's face it: nVidia has turned towards the GPGPU solution, and that's a major part of their strategy, as well as of their die size. They'll be going DP in a major way for this, and that costs real estate on the GPU. They're trying to drive the market in this direction, trying to preempt LRB, and want to compete and lead in that market. What I wonder is, will they continue heading in this direction, having both GPGPU and gaming perf together, or make two units, each aimed specifically at its own market?


I just read that review, and it showed that 4890 CrossFire and GTX 285 SLI are "identical" at high resolution, with each having a game that runs decidedly better on that particular platform (both are terrible games, to be honest). All I can say is wow, I might be trading my 4870 X2s in for three 4890s!
May 11, 2009 5:00:55 AM

Wow. These specs are awesome. Hopefully Nvidia and ATI will have this "price war" so I can get myself one of these puppies! :sol:

I think this pushes me to wait to do my upgrade. Gonna get Windows 7 64-bit, overclock my C2D E8400 to 6 GHz, go to 8GB of RAM, and most importantly, dump my 8800GTX and get a GTX 380 or a 5870 X2!

(Yes, I mean 6 GHz!) :sol:  :sol:  :sol:

:lol: 
May 11, 2009 1:21:00 PM

turboflame said:
Quote:
Theo Valich


Stopped reading there


Whoever gave him a thumbs-down needs their head looking at. Theo Valich is one of the worst tech reporters I've ever come across; misleading, biased, badly sourced, and poorly written are just a few of the terms that can be used to describe Mr Valich's reporting style. He even worked for Tom's Hardware for a bit as a reporter. I haven't seen a news item from him for ages, so I can only assume that someone fired him (good on them).

It's going to be interesting to see just how much faster the GTX 380 is than the HD 5870, and how much more it's going to cost to buy. If the ratios are the same as last year's (GTX 280 vs. HD 4870) then I'll be buying the Radeon to replace my ageing 8800GT; if it gets thumped into oblivion (8800GTX vs. HD 2900XT) then I'll be looking at getting a GTX 360 or equivalent.
May 11, 2009 1:53:11 PM

To-Do List:
-need to get USB wireless adapter
-need to get Crysis Maximum Edition
-need to get Razer Lycosa keyboard

Wait a Year:
-Get a 1920x1200 120Hz 3D screen and Nvidia 3D bundle
-Get GTX 395
May 11, 2009 2:02:04 PM

lol at the GTX 395... if they make the GPU any bigger it'll be 2 feet long...
May 11, 2009 2:33:02 PM

I suspect most of what we see in these speculated specs will be aimed at GPGPU usage. Not that the G300 won't be a killer card, but taking these numbers, which are pure speculation at this point, and assuming they'll translate directly into rendering/gaming performance won't happen. Look at the difference in transistor count and die size between the R700 and the G200: the G200 isn't almost 50% better/faster than the R700, more like 15-20% at best, and it loses at worst, depending on the game. So, I say take this all with a grain of salt.
May 11, 2009 3:14:34 PM

I just wish ATI would switch to a full-fledged 512-bit bus.
May 11, 2009 3:21:49 PM

They will when they have to. What's the difference? If they used GDDR5 with a 512-bit bus they would have twice the bandwidth they need, and the card would cost 50% more to produce.

When they need the bandwidth they will go 512-bit; until then, enjoy the price savings. Why would you want to pay extra cash for bandwidth you don't use? I assure you the ATI techs know what they are doing, and won't be putting out a card that is starved for memory bandwidth, or one that has a hell of a lot extra. I'm sure that had nVidia gone with GDDR5 this gen, they would also only use 256-bit. They are both in the business of making money, not adding unbalanced features that increase the cost for no performance gain.
May 11, 2009 4:33:47 PM

512-bit buses are overkill. The reason Nvidia generally has a tiny performance lead is that they use 50% more transistors on their GPUs. The R800 will have a lot more than the R700 had.

Anyway, expect the exact same as last year, with the 5870 X2 holding a huge lead until Nvidia figure out a way to glue two G300s together without causing whole cities to black out.
May 11, 2009 6:49:30 PM

jennyh said:
512-bit buses are overkill. The reason Nvidia generally has a tiny performance lead is that they use 50% more transistors on their GPUs. The R800 will have a lot more than the R700 had.

Anyway, expect the exact same as last year, with the 5870 X2 holding a huge lead until Nvidia figure out a way to glue two G300s together without causing whole cities to black out.



At what resolution is 512-bit memory "worth it"? I think 512-bit would definitely be worth it at 1920x1200, right? A wider memory interface should always make things faster, especially at higher resolutions with more stream processors (512, for instance).

Also, how much more performance can we expect out of the GTX 380 than the GTX 280? Maybe double? I want to know if I should get a 5870 X2 or a GTX 380. I'm kinda leaning towards the 380 because it'll most likely draw less power and Nvidia has better driver support.
May 11, 2009 7:00:57 PM

You are mistaken, Uber. Memory bandwidth is a function of the bus and the speed (in fact, it is the multiplication of the two). Right now Nvidia uses double the bus; ATI uses effectively double the speed (GDDR5, not 3). There is no difference between the two methods from a performance view.

When the bandwidth needs to be increased beyond what a simple clock increase can provide, both companies will use a larger bus width. Both will probably use GDDR5 in the next generation of cards; not sure what the bus width will be.

And Uber... no one can say for sure. You won't be able to pick a card (5870 X2 or 380) until they are released; no one can guess which will be better or why. Driver support is relatively a moot point at this time, as neither is particularly better at the moment (I regret saying this, as I'm sure it will incur the wrath of both sides demanding to be called the best...) and it's a clean slate as far as DX11 goes... they may both F it up :D
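
To make "the multiplication of the two" concrete, here is a quick sketch using the then-current cards' published memory clocks (illustrative numbers; GDDR3 is double-pumped, GDDR5 effectively quad-pumped):

    # Peak memory bandwidth = (bus width in bits / 8) x effective data rate.
    def bandwidth_gbs(bus_bits, effective_mhz):
        return bus_bits / 8 * effective_mhz * 1e6 / 1e9

    # GTX 285: wide 512-bit bus, GDDR3 at ~2484 MHz effective -> ~159 GB/s
    print(bandwidth_gbs(512, 2484))
    # HD 4870: half the bus (256-bit), GDDR5 at 3600 MHz effective -> ~115 GB/s
    print(bandwidth_gbs(256, 3600))
    # Same ballpark: doubling the bus width and doubling the data rate are
    # interchangeable knobs as far as peak bandwidth goes.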
May 12, 2009 2:29:20 AM

gamerk316 said:
The new 8800Ultra.

BTW, the 8800Ultra is probably the one card that actually was worth paying >$500 for; it's still competitive, even today.


Be that as it may, it was just an overclocked 8800GTX, back when 8800GTXs could easily be set to Ultra speeds. So, still competitive? Yes. Ever worth paying >$500 when you could get a cheaper GTX and overclock it to the same speeds? No way. The Ultra was the sucker's card.
May 12, 2009 3:40:48 AM

Understand that if nVidia goes to GDDR5, they'll most likely drop their 512-bit bus too. GDDR5 may be smoking fast by the time these cards are released. Having a huge, wide bus and super-fast memory is redundant, as one or the other is enough. The wider buses cost a lot more money, and I haven't seen any advantages to a wider bus vs. faster memory. Currently there's really almost no way to test one against the other, but looking at the 4870 vs. the 4850, the 4870 is often faster in games than the % difference in core clock alone would suggest, meaning the GDDR5 is boosting the 4870's perf, whereas the 4850 can't get it done with its GDDR3.
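
The stock clocks bear this out; the same 256-bit bus on both cards, and the memory type alone gives the 4870 nearly twice the bandwidth (illustrative arithmetic):

    # HD 4870 vs HD 4850: identical 256-bit bus, different memory type.
    hd4870 = 256 / 8 * 3.6e9 / 1e9    # GDDR5, 900 MHz base -> 3.6 Gbps effective: 115.2 GB/s
    hd4850 = 256 / 8 * 1.986e9 / 1e9  # GDDR3, 993 MHz base -> ~2.0 Gbps effective: ~63.6 GB/s
    print(hd4870 / hd4850)            # ~1.8x the memory bandwidth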
May 14, 2009 2:38:21 AM

What if my dreams are that I want Nvidia to fail miserably and be gobbled up by Via?
May 14, 2009 2:49:25 AM

Dekasav said:
What if my dreams are that I want Nvidia to fail miserably and be gobbled up by Via?


I thought Nvidia was the one doing the gobbling up? Oh well, seems like you are hollow inside.
I don't buy nVidia cards, but I appreciate the competition they put up, driving graphics card prices down. We don't need a monopoly!
May 14, 2009 3:29:14 AM

..... I forgot. Sarcasm is dead on teh internetz.
May 14, 2009 2:28:46 PM

Yeah... that link is largely biased B.S. with a few good points.
May 14, 2009 10:20:37 PM

We'll see, we'll see.
May 15, 2009 5:50:49 AM

Yea, we might see great things yet. Maybe Nvidia has some secret project up their sleeves for the GT300 and will give good, if not great, competition to ATI. I do believe this generation will play out much like the last, except maybe ATI won't pull an X2 as soon, because ATI wants a bigger chip and won't be able to compress that into an X2 unless they dumb down the specs and do a die shrink like the 295.

Hopefully there will be similar cards this time around so I can get the most for my life savings! :lol: (J/K... maybe not... lol)

May 19, 2009 1:14:37 AM

That guy who said those things on the Inquirer has a horrible track record!

He even said that the 8800GTX would be bad! If that's not enough proof, then look at most of the responses on that page!

So, I do think the GT300 will have 512 shader cores, at least 1536MB of RAM, and a 512-bit GDDR5 bus. Hopefully 2GB :D Maybe it'll even have 576 cores, for all we know! :ouch: If the GT300 has all the supposed specs, then I've crunched the numbers over and over, and the GTX 380 will have 3x the fps in Crysis of my 8800GTX! Talk about wow! :D

Even though I'm somewhat of an Nvidia fanboi, I do want ATI to up their specs a bit. I do (not) want a repeat of the 8800GTX! lol... The prices were off the charts for quite some time. Maybe ATI will have 2000 shader cores! lol.......... why do I kid myself...

Any idea when the 5870 will come out? I've heard Q4, but really I think next year Q1 sounds more realistic.
May 19, 2009 2:13:42 AM

God, I hope the GT300 is nothing like the above-mentioned monster.

A 512-bit bus of GDDR5? A waste of money for no performance gain.

1.5GB-2GB of GDDR5? 1GB of GDDR5 is a bit of a waste; anything more is likely going to be useless.

Again, nVidia needs to get away from this monolithic-GPU thing they have started; it hasn't exactly paid off with the GTX 2xx series.
May 19, 2009 3:37:02 AM

The_Blood_Raven said:
God, I hope the GT300 is nothing like the above-mentioned monster.

A 512-bit bus of GDDR5? A waste of money for no performance gain.

1.5GB-2GB of GDDR5? 1GB of GDDR5 is a bit of a waste; anything more is likely going to be useless.

Again, nVidia needs to get away from this monolithic-GPU thing they have started; it hasn't exactly paid off with the GTX 2xx series.



I have to disagree with you. Graphics cards need to progress as much as possible, to create more efficient lower-end cards and higher average performance for users.

Bigger and badder high-end graphics cards help (to a degree) to raise efficiency for the lower-end models. With more efficient low-end models, the average user can get a better graphics card, and companies can start making better video games. And with better and better video games, well, who wouldn't want to play a game that comes on 5 Blu-ray discs and has graphics that make Crysis look like Pong, on max settings? We all will, eventually. It just takes time for graphics cards (and new consoles) to push the video game industry into making better and better games.

There's my two cents :)
May 19, 2009 10:44:07 AM

I was pointing out that a few of the rumored specifications are useless, and that the large die, low yields, and high production cost of the GTX 2xx series hopefully will not carry over to the GT300.

"Bigger and badder" does not equal efficiency; just look at the GTX 2xx series compared to the ATI 4xxx series. The 4xxx series gets about the same, and sometimes better, performance through a much more efficient design.

We don't need to pay more for extra CUDA features or for specifications that do nothing but look good to uninformed consumers.

I just hope that ATI and nVidia truly pull out something next generation, and not just an incremental improvement.
May 19, 2009 5:36:41 PM

Until nVidia creates a two-tier system, with one top card (or cards) incorporating GPGPU/CUDA abilities with high DP etc., and a separate killer GPU for gaming, I'm afraid we'll keep seeing these monsters. That in and of itself is one reason why the G200 isn't as efficient as the R700, having all of that on board; though even so, the R700 holds its own either way.
LRB will be quite good at doing GPGPU, and nVidia wants to meet it head on, so for now this is what we will see.
July 14, 2009 3:26:57 PM

What would the math be for calculating the performance benefit of nVidia going with GDDR5, given their 512-bit-wide memory bus?
July 14, 2009 3:37:05 PM

GDDR5 is twice as fast per clock as GDDR3... so multiply by two.

In the future, please just use Google to look up something as trivial as this. Don't be lazy, and don't needlessly bump a two-month-old thread.
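
In other words (a rough sketch with illustrative clocks; the GDDR5 line is hypothetical, the same memory clock quad-pumped instead of double-pumped):

    # GDDR3 moves 2 bits per pin per clock; GDDR5 moves 4.
    gddr3 = 512 / 8 * (1242e6 * 2) / 1e9  # GTX 285-style 512-bit GDDR3: ~159 GB/s
    gddr5 = 512 / 8 * (1242e6 * 4) / 1e9  # same clock on GDDR5: ~318 GB/s
    print(gddr3, gddr5)                   # double the data rate, double the bandwidth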