GT300 series, real specs revealed.
Tags:
- Graphics Cards
- World Of Warcraft
- Fortune
- Graphics
Last response: in Graphics & Displays
terminus
April 27, 2009 6:36:40 AM
http://www.brightsideofnews.com/news/2009/4/22/nvidias-...!.aspx
Wow, seems really powerful; I hope it doesn't cost a fortune though.
cybot_x1024
April 27, 2009 9:48:15 AM
Looks very promising. At least then we will have something that can crush Crysis.
By the way, have you seen the RV870's specs? It looks like there will be a serious battle in the GPU market. I can't wait for the bloodshed/bleeding-edge graphics and the competitive price cuts! I am going to get myself one of these for very cheap.
The_Blood_Raven
April 27, 2009 10:44:34 AM
cybot_x1024
April 27, 2009 11:32:53 AM
AMD have always made price/performance their strategy for winning over the GPU market, as opposed to Nvidia's brute-strength strategy.
The info states that the GT300 single-chip card will yield 3+ TFLOPs, as opposed to AMD/ATI's 2.16 TFLOP 5870. The GT300 will definitely be riding on a 512-bit memory bus, and the 5870 on a 256-bit-wide memory bus.
Clearly the info points out that the GT300 will be quite pricey due to production cost.
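For reference, those headline TFLOP figures fall out of a simple peak-throughput formula: ALU count x shader clock x FLOPs per ALU per clock. A quick sketch using the thread's rumoured numbers (the GT300 shader count and ~2.0 GHz hot clock here are pure speculation, not confirmed specs):

```python
def theoretical_tflops(alus, clock_ghz, flops_per_alu_per_clock):
    """Peak single-precision throughput in TFLOPs."""
    return alus * clock_ghz * flops_per_alu_per_clock / 1000.0

# RV870 rumour: 1600 stream processors at 675 MHz, 2 FLOPs (MADD) per clock
rv870 = theoretical_tflops(1600, 0.675, 2)   # 2.16 TFLOPs

# GT300 speculation: 512 shaders at a ~2.0 GHz hot clock,
# 3 FLOPs per clock (dual-issue MADD+MUL, as on G80/G200)
gt300 = theoretical_tflops(512, 2.0, 3)      # ~3.07 TFLOPs

print(f"RV870: {rv870:.2f} TFLOPs, GT300: {gt300:.2f} TFLOPs")
```

Note these are theoretical peaks; as posters point out below, real game performance scales nothing like raw FLOPs.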
rawsteel
April 27, 2009 11:38:27 AM
It seems that the new nVidia flagship will again be HUGE. I don't know how scalable this chip will be: if they want to be in the mainstream segment, it will be too expensive to make such huge chips and sell them cheap. On the other hand, ATI have a great advantage in this area. I even expect the 5870 to be smaller than the 4870 and still be around 50% faster.
What I wonder is, what are Nvidia going to do in the mainstream?
Even at 40nm eventually, the G200b is going to make a pretty expensive mainstream card. The reason Nvidia are losing the mainstream market now is because their parts cost too much to manufacture.
Anyway, at the enthusiast end the 5870 is going to pretty much annihilate everything currently available, until Nvidia release the GT300, then ATI double up with the 5870 X2 and... wait a minute, doesn't this sound awfully familiar?
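The cost argument above is easy to put numbers on with the classic gross-dies-per-wafer estimate. The die areas below are rough public ballparks used purely for illustration, not official figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Gross dies per wafer: usable area divided by die area,
    minus an edge-loss correction for partial dies at the rim."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# 300 mm wafer; die areas are approximate
big = dies_per_wafer(300, 470)    # G200b-class monolithic die -> ~119 candidates
small = dies_per_wafer(300, 180)  # RV740/RV770-class die     -> ~343 candidates
print(big, small)
```

Nearly three times as many candidate dies per wafer, before yield even enters the picture, which is exactly why a huge chip is so hard to sell into the mainstream.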
bruce555
April 27, 2009 5:01:41 PM
Excited to see the new architectures for sure. Hope they both end up relatively on par so we see some nice prices. Seems the relation between the 300 and 5000 series will be similar to the 4000 and 200. Sure hope so, as I love price wars.
Either way, the 5870 X2 and the GTX 380 sure will spew out a ton of FLOPs.
turboflame
April 27, 2009 5:18:36 PM
terminus
April 27, 2009 7:08:42 PM
Yea, don't take these specs as absolutes; they are speculated specs, if you will.
As far as slightly smaller vs behemoth chips go, X-bit labs has an excellent review showing the 4890 in CF holding its own vs the 285 in SLI.
Let's face it: nVidia has turned towards the GPGPU solution, and that's a major part of their strategy as well as their die size. They'll be going DP in a major way for this, and that costs real estate on the GPU. They're trying to drive the market in this direction, trying to preempt LRB, and want to compete and lead in that market. What I wonder is, will they continue heading in this direction, having both GPGPU and gaming performance together, or make two chips, each aimed specifically at its own market?
The_Blood_Raven
April 27, 2009 7:53:20 PM
JAYDEEJOHN said:
Yea, don't take these specs as absolutes; they are speculated specs, if you will. As far as slightly smaller vs behemoth chips go, X-bit labs has an excellent review showing the 4890 in CF holding its own vs the 285 in SLI.
Let's face it: nVidia has turned towards the GPGPU solution, and that's a major part of their strategy as well as their die size. They'll be going DP in a major way for this, and that costs real estate on the GPU. They're trying to drive the market in this direction, trying to preempt LRB, and want to compete and lead in that market. What I wonder is, will they continue heading in this direction, having both GPGPU and gaming performance together, or make two chips, each aimed specifically at its own market?
I just read that review, and it showed that 4890 CrossFire and GTX 285 SLI are "identical" at high resolution, with each having a game that runs decidedly better on that particular platform (both are terrible games, to be honest). All I can say is wow; I might be trading my 4870 X2s in for three 4890s!
Uber nobody
May 11, 2009 5:00:55 AM
Wow. These specs are awesome. Hopefully Nvidia and ATI will have this "price war" so I can get myself one of these puppies!
I think this pushes me to wait to do my upgrade. Gonna get Windows 7 64-bit, overclock my C2D E8400 to 6GHz, go to 8GB of RAM, and most importantly, dump my 8800GTX and get a GTX 380 or a 5870 X2!
(Yes, I mean 6GHz!)
The_Blood_Raven
May 11, 2009 10:41:06 AM
michaelmk86
May 11, 2009 12:54:30 PM
JeanLuc
May 11, 2009 1:21:00 PM
turboflame said:
Quote:
Theo Valich
Stopped reading there.
Whoever gave him a thumbs down needs their head looked at. Theo Valich is one of the worst tech reporters I've ever come across; misleading, biased, badly sourced, and poorly written are just a few of the terms that can be used to describe Mr Valich's reporting style. He even worked for Tom's Hardware for a bit as a reporter; I haven't seen a news item from him for ages, so I can only assume that someone fired him (good on them).
It's going to be interesting to see just how much faster the GTX 380 is than the HD 5870, and how much more it's going to cost to buy. If the ratios are the same as last year's (GTX 280 vs HD 4870) then I'll be buying the Radeon to replace my ageing 8800GT; if it gets thumped into oblivion (8800GTX vs HD 2900XT) then I'll be looking at getting a GTX 360 or equivalent.
terminus
May 11, 2009 1:53:11 PM
I suspect most of what we will see in these speculated specs will be aimed at GPGPU usage. Not that the G300 won't be a killer card, but taking these numbers, which are pure speculation at this point, and saying they'll be used directly for rendering/GPU use mainly/only won't happen. Look at the difference in transistor count and die size between the R700 and the G200. The G200 isn't almost 50% better/faster than the R700; more like 15-20% at best, and it loses at worst, depending on the game. So, I say take this all with a grain of salt.
rags_20
May 11, 2009 3:14:34 PM
They will when they have to. What is the difference? If they used GDDR5 with a 512-bit bus, they would have twice the bandwidth they need and the card would cost 50% more to produce.
When they need the bandwidth they will go 512-bit; until then, enjoy the price savings. Why would you want to pay extra cash for bandwidth you don't use? I assure you the ATI techs know what they are doing, and won't be putting out a card that is starved for memory bandwidth, or one that has a hell of a lot extra. I'm sure that had nVidia gone with GDDR5 this gen, they would also only use 256-bit. They are both in the business of making money, not adding unbalanced features that increase the cost for no performance gain.
512-bit buses are overkill. The reason why Nvidia generally has a tiny performance lead is because they use 50% more transistors on their GPUs. The R800 will have a lot more than the R700 had.
Anyway, expect the exact same as last year, with the 5870 X2 holding a huge lead until Nvidia figure out a way to glue two G300s together without causing whole cities to black out.
Uber nobody
May 11, 2009 6:49:30 PM
jennyh said:
512-bit buses are overkill. The reason why Nvidia generally has a tiny performance lead is because they use 50% more transistors on their GPUs. The R800 will have a lot more than the R700 had. Anyway, expect the exact same as last year, with the 5870 X2 holding a huge lead until Nvidia figure out a way to glue two G300s together without causing whole cities to black out.
At what resolution is 512-bit memory "worth it"? I think 512-bit would definitely be worth it at 1920x1200, right? More interface with the memory should always make things faster, especially at higher resolutions with more stream processors (like 512, for instance).
Also, how much more performance can we expect out of the GTX 380 than the GTX 280? Maybe double? I want to know if I should get a 5870 X2 or a GTX 380. I'm kinda leaning towards the 380 because it'll most likely draw less power and Nvidia has better driver support.
You are mistaken, Uber. Memory bandwidth is a function of the bus width and the speed (in fact it is the product of the two). Right now Nvidia uses double the bus width; ATI uses effectively double the speed (GDDR5, not 3). There is no difference between the two methods from a performance view.
When the bandwidth needs to be increased beyond what a simple clock increase can provide, both companies will use a larger bus width. Both will probably use GDDR5 in the next generation of cards; not sure what the bus width will be.
And Uber, no one can say for sure. You won't be able to pick a card (5870 X2 or 380) until they are released; no one can guess as to which will be better or why. Driver support is relatively a moot point at this point in time, as neither is particularly better at the moment (I regret saying this as I'm sure it will incur the wrath of both sides demanding to be called the best...) and it's a clean slate as far as DX11 goes; they may both mess it up.
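The bandwidth-is-a-product point above is easy to check with the shipping cards of the day (memory clocks below are the approximate effective data rates of the retail parts):

```python
def bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Memory bandwidth in GB/s = bus width in bytes x effective data rate."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# GTX 285: wide 512-bit bus, slower GDDR3 at ~2484 MHz effective
nvidia = bandwidth_gbs(512, 2484)   # ~159 GB/s
# HD 4870: narrow 256-bit bus, fast GDDR5 at ~3600 MHz effective
ati = bandwidth_gbs(256, 3600)      # ~115 GB/s
print(f"GTX 285: {nvidia:.1f} GB/s, HD 4870: {ati:.1f} GB/s")
```

Either lever (a wider bus or faster memory) gets you to a given bandwidth; the wider bus just costs more PCB traces and die pins.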
scooterlibby
May 12, 2009 2:29:20 AM
gamerk316 said:
The new 8800Ultra. BTW, the 8800Ultra is probably the one card that actually was worth paying >$500 for; its still competitive, even today.
Be that as it may, it was just an overclocked 8800GTX, when 8800GTXs could easily be set at Ultra speeds. So, still competitive? Yes. Ever worth paying >$500 when you could get a cheaper GTX and overclock it to the same speeds? No way. The Ultra was the sucker's card.
Understand that if nVidia goes to GDDR5, they'll most likely drop their 512-bit bus too. GDDR5 may be smoking fast by the time these cards are released. Having a huge, wide bus and super-fast memory is redundant, as one or the other is enough. The wider buses cost a lot more money, and I haven't seen any advantage to a wider bus vs faster memory. Currently there's really almost no way to test one against the other, but looking at the 4870 vs the 4850, the 4870 is often faster in games than the difference in core clock alone would suggest, meaning the GDDR5 is boosting the 4870's performance, whereas the 4850 can't get it done with its GDDR3.
cybot_x1024
May 14, 2009 1:53:59 AM
to shatter all your dreams go to the following link:
http://www.theinquirer.net/inquirer/news/1137331/a-look...
Dekasav
May 14, 2009 2:38:21 AM
cybot_x1024
May 14, 2009 2:49:25 AM
Dekasav said:
What if my dreams are that I want Nvidia to fail miserably and be gobbled up by Via?
I thought Nvidia was the one doing the gobbling up? Oh well, seems like you are hollow inside.
I don't buy Nvidia cards, but I appreciate the competition they put up, which causes graphics card prices to come down. We don't need a monopoly!
Dekasav
May 14, 2009 3:29:14 AM
The_Blood_Raven
May 14, 2009 2:28:46 PM
smithereen
May 14, 2009 10:20:37 PM
Uber nobody
May 15, 2009 5:50:49 AM
Yea, we might see great things yet. Nvidia may have some secret project up their sleeves for GT300 and will give good, if not great, competition for ATI. I do believe this generation will happen much like the last, except maybe ATI won't pull an X2 as soon, because ATI wants a bigger chip and won't be able to compress that into an X2 unless they dumb down the specs and have a die shrink like the 295.
Hopefully there will be similar cards this time around so I can get the most for my life savings!
(J/K... maybe not... lol)
Uber nobody
May 19, 2009 1:14:37 AM
That guy who said those things on the Inquirer has a horrible track record!
He even said that the 8800GTX would be bad! If that's not enough proof, then look at most of the responses on that page!
So, I do think the GT300 will have 512 shader cores, at least 1536MB of RAM, and a 512-bit GDDR5 bus. Hopefully 2GB.
Maybe it'll even have 576 cores for all we know!
If the GT300 has all the supposed specs, I've crunched the numbers over and over, and the GTX 380 will have 3x the fps in Crysis of my 8800GTX! Talk about wow!
Even though I'm somewhat an Nvidia fanboi, I do want ATI to up their specs a bit. I do (not) want a repeat of the 8800GTX! lol... The prices were off the charts for quite some time. Maybe ATI will have 2000 shader cores! lol.......... why do I kid myself...
Any idea when the 5870 will come out? I've heard Q4, but really I think next year Q1 sounds more realistic.
The_Blood_Raven
May 19, 2009 2:13:42 AM
God, I hope the GT300 is nothing like the above-mentioned monster.
A 512-bit bus of GDDR5? A waste of money for no performance gain.
1.5GB-2GB of GDDR5? 1GB of GDDR5 is a bit of a waste; anything more is likely going to be useless.
Again, nVidia needs to get away from this monolithic GPU thing they have started; it hasn't exactly paid off with the GTX 2xx series.
Uber nobody
May 19, 2009 3:37:02 AM
The_Blood_Raven said:
God, I hope the GT300 is nothing like the above-mentioned monster. A 512-bit bus of GDDR5? A waste of money for no performance gain.
1.5GB-2GB of GDDR5? 1GB of GDDR5 is a bit of a waste; anything more is likely going to be useless.
Again, Nvidia needs to get away from this monolithic GPU thing they have started; it hasn't exactly paid off with the GTX 2xx series.
I have to disagree with you. Graphics cards need to progress as much as possible to create more efficient lower-end cards and higher average performance for users.
Bigger and badder high-end graphics cards help (to a degree) to raise efficiency for lower-end models. With more efficient low-end models, the average user can get a better graphics card, and companies can start making better video games. And with better and better video games, well, who wouldn't want to play a game that comes on 5 Blu-ray discs and has graphics that make Crysis look like Pong, on max settings? We all will, eventually. It just takes time for graphics cards (and new consoles) to push the video game industry into making better and better games.
There's my two cents.
The_Blood_Raven
May 19, 2009 10:44:07 AM
I was pointing out that a few of the rumored specifications are useless, and that the large die, low yields, and high production cost of the GTX 2xx series hopefully will not translate to the GT300.
"Bigger and badder" does not equal efficiency; just look at the GTX 2xx series compared to the ATI 4xxx series. The 4xxx series gets about the same and sometimes better performance through a much more efficient design.
We don't need to pay more for extra CUDA features or specifications that do nothing but look good to uninformed consumers.
I just hope that ATI and nVidia truly pull out something next generation and not just an incremental improvement.
Until nVidia creates a two-tier system, with one top card, or cards, that incorporates GPGPU/CUDA abilities with high DP etc., and a killer GPU for gaming, I'm afraid we'll keep seeing these monsters. That in and of itself is one reason why the G200 isn't as efficient as the R700, having all of that; though even so, the R700 holds its own either way.
LRB will be quite good at doing GPGPU, and nVidia wants to meet it head on, so for now, this is what we will see.
WhiteWelcomer
July 14, 2009 3:26:57 PM