
GT300 wins over 5870

Last response: in Graphics & Displays
Anonymous
September 16, 2009 5:35:14 PM

From Fudzilla,


Several people confirm


Nvidia is still not revealing any details about the chip we still call GT300, but one thing is getting quite certain. A few people who have seen the just-leaked performance data on ATI's soon-to-launch Radeon HD 5870 have told Fudzilla that they are absolutely confident the GT300 will win over the Radeon HD 5870.

Since GT300 has its GX2 brother, they should not fear the Radeon 5870 X2 much whenever that comes out; if Nvidia wins the single-chip battle, they can win the dual one as well.

We can only confirm that GT300 is not a GT200 in 40nm with DirectX 11 support. It's a brand new chip, designed almost entirely from the ground up. Industry sources believe this is the biggest change since G80 launched, and that you can expect that level of innovation and change.

The launch, as far as we know, is still slated for late November but hasn't been confirmed.


September 16, 2009 5:42:46 PM

lol, of course they'll say that, no matter what the truth is

and good luck with the dual GT300 card, it's gonna be mighty difficult to get it under the 300W limit, if the rumors about the size and transistor count of the chip are true...
September 16, 2009 5:45:15 PM

Until there is some solid evidence it's just unicorn poo.
Anonymous
September 16, 2009 5:46:23 PM

people only want faster cards, but remember that the 5870 X2 will have nearly 380W TDP
September 16, 2009 5:52:05 PM

Quote:
people only want faster cards, but remember that the 5870 X2 will have nearly 380W TDP

I highly doubt ATI is going to release a card that doesn't fit the PCI-SIG's PCIe power specs. That 380W is just twice the 5870's TDP, and not 'real' as such. I believe they will bring it down to around 280W or below, just like the 4870 X2.
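The power math in this post can be sketched out. A minimal back-of-envelope model, assuming an HD 5870 board power of about 188W (an assumption, not an official figure) and that dynamic power scales roughly with V² × f:

```python
# Back-of-envelope dual-card TDP sketch. All numbers are illustrative
# assumptions, not official specs: a naive X2 TDP is just 2x the single
# card, but real dual-GPU boards use binned chips at lower voltage, and
# dynamic power scales roughly with V^2 * f.

SINGLE_TDP_W = 188.0   # assumed HD 5870 board power
PCIE_LIMIT_W = 300.0   # spec ceiling: 75 W slot + 75 W 6-pin + 150 W 8-pin

def dual_card_tdp(single_tdp, v_scale=1.0, f_scale=1.0, shared_overhead=0.9):
    """Estimate a dual-GPU board's TDP.

    v_scale/f_scale: relative voltage and clock of the binned chips.
    shared_overhead: fraction kept after sharing board components (<1).
    """
    per_gpu = single_tdp * (v_scale ** 2) * f_scale
    return 2 * per_gpu * shared_overhead

naive = dual_card_tdp(SINGLE_TDP_W)                             # over the limit
binned = dual_card_tdp(SINGLE_TDP_W, v_scale=0.92, f_scale=0.95)

print(f"naive 2x estimate: {naive:.0f} W (limit {PCIE_LIMIT_W:.0f} W)")
print(f"binned estimate:   {binned:.0f} W")
```

Under these assumed numbers a naive 2x estimate (~338W) blows past the 300W ceiling, while modestly binned, slightly downclocked chips land around 270W, in 4870 X2 territory, which is the post's point.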
Anonymous
September 16, 2009 5:55:18 PM

but they will need to downclock it a lot to get there, and then it's not double the 5870's performance
September 16, 2009 5:59:58 PM

Quote:
From Fudzilla,

It’s a brand new chip that was designed almost entirely from the ground up. Industry sources believe that this is the biggest change since G80 was launched and that you can expect such level of innovation and change.

The launch as far as we know is still slated for late November and hasn’t been confirmed.

If this is true then I wouldn't doubt that it might beat the 5800 series, but it's just speculation until we see some real-world benches. "Industry sources" can't be trusted...
September 16, 2009 6:00:28 PM

Quote:
but they will need to downclock it a lot to get there, and then it's not double the 5870's performance

or use better-binned chips with lower voltage...
Anonymous
September 16, 2009 6:00:58 PM

It's good news that Nvidia is not just sitting and waiting to see what ATI has to offer. Competition in this battle will be very interesting and will bring price wars in all segments, which is good for us.
September 16, 2009 6:04:53 PM

^^believe me, they are already hard at work on the next-gen chips, you know, the ones that come after the HD5xxx and GT300. It takes something like 3 years to come up with a new chip...
September 16, 2009 6:14:32 PM

http://hardocp.com/news/2009/09/15/nvidia_gt300_yields_...

If this is true, there's no way Nvidia is hitting a November release. And certainly not getting out a dual-GPU card, at least until their next die shrink.

By that time, ATI will have won over a ton more market share, similar to the 48xx series release.

We've been hearing amazing things about the 58xx series for months now. We've heard next to nothing (aside from an unsourced hint from Fudzilla (really?)) about the GT300 GPUs.

I'd be taking my stock out of Nvidia right now if I had any.
September 16, 2009 6:14:47 PM

Also, don't forget you can push the GPU from 850MHz to 1.2GHz on the new 5800s!!!
Anonymous
September 16, 2009 6:26:40 PM

If the news from the Inquirer is true, then Nvidia's GT300 will have 512 shader processors and a 512-bit wide GDDR5 memory interface.
September 16, 2009 6:35:53 PM

Mousemonkey said:
No matter how many ways people link to the same source, it's not going to become any more factual until other, much more reliable voices speak up.

Ehh, I hold the same pessimistic attitude.

But "IF" it's true, nvidia is scaroowed. If the source here turns out reliable, then there are some MAJOR design flaws in the GT300 that will require a whole lot more than a couple of months for redesigns, revisions, manufacturing, branding, driver creation, and shipping.

Quote:
If the news from the Inquirer is true, then Nvidia's GT300 will have 512 shader processors and a 512-bit wide GDDR5 memory interface.


At least post links. No one has the time to search out the obscure articles you reference.
September 16, 2009 7:14:19 PM

It had better be, or nVidia is screwed. However, I really hope it's faster, as then we will see some nicely priced cards. If ATI keeps its dominance we won't get a sub-$200 5800-series card anytime soon.
September 16, 2009 7:30:48 PM

proof is in the pudding
September 16, 2009 7:31:29 PM

OK, if the rumored perf for the 5870 is right, then the G300 X2 would need to be at least 3 times faster than the 285, which is doable perf-wise, but maybe not physically, much like what we saw last time when nVidia had to wait for the 295 at lowered clocks and a smaller node.
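The "at least 3 times faster than the 285" figure follows from simple ratio arithmetic. A sketch, where both the 5870-vs-285 speedup and the dual-GPU scaling factor are assumptions, not measurements:

```python
# Rough arithmetic behind the "at least 3 times faster than the 285" claim.
# Both ratios below are assumptions, not measured numbers.

GTX285 = 1.0               # baseline
HD5870 = 1.6 * GTX285      # assumed single-card speedup over a GTX 285
X2_SCALING = 0.8           # assumed dual-GPU scaling efficiency (80%)

hd5870_x2 = HD5870 * (1 + X2_SCALING)   # what a G300 X2 would have to match

print(f"HD 5870 X2 ~= {hd5870_x2:.2f}x a GTX 285")
```

With these numbers the target is roughly 2.9x a GTX 285, i.e. about the "3 times faster" in the post.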
September 16, 2009 7:34:27 PM

Has anyone noticed how Fuad spreads lies... er, FUD about ATI and props up nVidia? For example, he lied about the 5870 X2 TDP (376W... yeah right, he hasn't the slightest clue about it), and now endorses nVidia PR? If you don't believe Charlie about the extremely low yields so far, Kyle Bennett from HardOCP says his sources confirmed it. Yet Fuad says GT300 will surely beat the 5870 (possible, but how does he know?), and that nVidia shouldn't fear the 5870 X2 because... "Since GT300 has its GX2 brother they should not fear much against Radeon 5870 X2 whenever that comes out". I find it very funny how it's worded: we don't know when GT300 itself will launch after all the delays and low yields, yet by Fuad's words it may seem the GX2 will be on the shelves soon, and it's just not clear when the 5870 X2 comes out :sarcastic:  How on earth will nVidia be able to launch a GX2 at all before the next refresh?

This leaves only two possibilities IMO:

1. Fuad is paid by nVidia and actually has inside info, while Charlie, Kyle Bennett and others are being lied to about yields, possible release dates, etc. (disinformation is part of the war).

2. Fuad is paid by nVidia... to spread FUD, no more, no less, at least to spoil the 5800 series launch a little. Since he has no credibility, he doesn't care that he'll be caught in a month or two anyway.
September 16, 2009 7:44:23 PM

I think it's a third scenario, and long time no see, Harrisson.
I think FUaD knows Charlie's position, sees him getting hits on his position, and FUaD is just taking the opposite side, for the hits, simple as that.
He's not that gullible, to think an X2 card actually uses twice the power, he just isn't. He's riding the current rumor streak, just playing devil's advocate, and getting hits and attention, whether it's true or not, and most likely not.
I did hear those people leaking their info to him had mysterious names like N and V and P and R heheh
September 16, 2009 8:15:59 PM

Sure Nvidia will probably produce a chip 20% faster than the 5870, but it'll be a lot more expensive and they'll release it months later so it's not really a fair comparison (by that time the 5890 will be waiting around the corner anyway.)
September 16, 2009 8:20:56 PM

Gulli said:
Sure Nvidia will probably produce a chip 20% faster than the 5870, but it'll be a lot more expensive and they'll release it months later so it's not really a fair comparison (by that time the 5890 will be waiting around the corner anyway.)


Even if it is that much faster and everyone wanted one, there won't be very many on the market with a yield rate of only 1.7%. So I guess they either find a way to improve the yields, or go 32nm and go through the hell, not to mention delays, that will bring.
September 16, 2009 8:30:58 PM

The Gt300 has a catch though,

To make it work you need to rub it between your butt cheeks to create enough static electricity to power it up before you plug it in.

Mandatory^, they replaced the main GPU with an Excel mint, so now it's minty fresh:D 

F*ckzilla said that one, not quite as good as fud, but meh!
September 16, 2009 8:35:48 PM

nforce4max said:
Even if it is that much faster and everyone wanted one, there won't be very many on the market with a yield rate of only 1.7%. So I guess they either find a way to improve the yields, or go 32nm and go through the hell, not to mention delays, that will bring.

Not only would there be a market shortage, they wouldn't even have enough yield to test and revise. If the rumour turns out true, they won't be able to get it out the door for a long, long time.

And I just realized something...

Nvidia is just now getting their very first silicon yields. If that's true, even if they came back golden, there's no way they could hit a November release. And besides that... first yields are typically purely for testing the integrity of a design. There's not a blue frozen flying pig's chance in hell that anyone knows what GT300 performance will be like.

Let me repeat that: Fudzilla couldn't vacuum more. Fail.
September 16, 2009 8:41:17 PM

Even if it is faster it won't gain much traction if it costs twice as much. Sure the 285 was faster than the 4870, but the 4870 sure sold more and forced nVidia to lower prices on the 285 and 260. If the 5870 is fast enough, it could force nVidia to sell their G300 cards without much profit, or even at a loss (as was the case with the dual-PCB GTX 295). Worst case for nVidia is that it turns out to be their 2900 card, faster on paper but not so much in practice and too expensive to make.

Anyway since the G300 launch doesn't even seem to be close, I think the 5800 series cards will be at the top of the heap for at least a few months.
September 16, 2009 8:50:08 PM

megamanx00 said:
Worst case for nVidia is that it turns out to be their 2900 card, faster on paper but not so much in practice and too expensive to make.

Now that would suck sweaty donkey balls.
September 16, 2009 8:51:33 PM

curnel_D said:
Not only would there be a market shortage, they wouldn't even have enough yield to test and revise. If the rumour turns out true, they won't be able to get it out the door for a long, long time.

And I just realized something...

Nvidia is just now getting their very first silicon yields. If that's true, even if they came back golden, there's no way they could hit a November release. And besides that... first yields are typically purely for testing the integrity of a design. There's not a blue frozen flying pig's chance in hell that anyone knows what GT300 performance will be like.

Let me repeat that: Fudzilla couldn't vacuum more. Fail.


I could not agree more. I got burned today, like so many lately, but with a much older card and the age-old problem of incompatibility with the first-gen GX2 cards. Not much help thus far with my vintage 7900 GTX Duo; sigh, got one of my dream cards and can't use it. Look at my avatar, it's the card in the middle. :( 
September 16, 2009 9:14:53 PM

JAYDEEJOHN said:
I think it's a third scenario, and long time no see, Harrisson.

Thx, it's an exciting time for enthusiasts ;) 

JAYDEEJOHN said:

I think FUaD knows Charlie's position, sees him getting hits on his position, and FUaD is just taking the opposite side, for the hits, simple as that.
He's not that gullible, to think an X2 card actually uses twice the power, he just isn't. He's riding the current rumor streak, just playing devil's advocate, and getting hits and attention, whether it's true or not, and most likely not.
I did hear those people leaking their info to him had mysterious names like N and V and P and R heheh

Posting rumors is fine, it generates traffic, etc., but posting nonsense isn't very smart for the long term, because nobody will take him seriously. For example, Charlie is biased, but he often has a grain of truth, and if you visit his forums he has quite a lot of readers who respect him. Does ANYONE respect Fuad? Anyone? If he keeps posting nonsense, he will lose not only the last bits of respect anyone had, but also the readers.
September 16, 2009 9:19:59 PM

Nvidia needed a die shrink to make a dual-GPU card with 1.4 billion transistors per GPU.

If the G300 is 3 billion transistors, it will need another die shrink before being made into an X2 card. If the G300 is 2 billion transistors, it's slower than the 5870 anyway, and the G300 X2 will be slower than the 5870 X2.

ATI will hold the performance crown with the 5870 X2 for months, possibly even as long as a year.
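The transistor argument above can be sanity-checked with a die-area sketch; the transistor density used here is an assumed figure for a GT200-class design ported to 40nm, not an official number:

```python
# Die-area sketch behind "3 billion transistors will need another shrink".
# DENSITY_40NM is an assumed figure, not an official spec.

DENSITY_40NM = 6.0e6     # assumed transistors per mm^2 at 40 nm
GTX295_PER_GPU = 1.4e9   # the dual-GPU card Nvidia managed after a shrink

def die_area_mm2(transistors, density=DENSITY_40NM):
    """Very rough die area implied by a transistor count."""
    return transistors / density

print(f"1.4B transistors -> {die_area_mm2(GTX295_PER_GPU):.0f} mm^2 (GTX 295 class)")
print(f"2B transistors   -> {die_area_mm2(2.0e9):.0f} mm^2")
print(f"3B transistors   -> {die_area_mm2(3.0e9):.0f} mm^2 (huge: hot, costly, hard to pair)")
```

Under this assumed density a 3-billion-transistor die comes out at more than twice the area of the GPUs Nvidia paired up for the GTX 295, which is why an X2 version looks implausible without another shrink.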
September 16, 2009 9:26:14 PM

maximiza said:
proof is in the pudding

The CAKE is a LIE!

Anyways, I'd wait for the actual reviews.
Anonymous
September 17, 2009 8:51:38 AM

jennyh said:
Nvidia needed a die shrink to make a dual-GPU card with 1.4 billion transistors per GPU.

If the G300 is 3 billion transistors, it will need another die shrink before being made into an X2 card. If the G300 is 2 billion transistors, it's slower than the 5870 anyway, and the G300 X2 will be slower than the 5870 X2.

ATI will hold the performance crown with the 5870 X2 for months, possibly even as long as a year.



you only want to crush Nvidia, you are scared of them, what stupidity
September 17, 2009 9:12:28 AM

I don't think ATI needs to release a 5870 X2; even a 5890 OC Super Edition will probably eat the GTX 380 easily, going by the specs of the 5870.
September 17, 2009 9:17:25 AM

rescawen said:
I don't think ATI needs to release a 5870 X2; even a 5890 OC Super Edition will probably eat the GTX 380 easily, going by the specs of the 5870.


And you know that because you looked into your crystal ball to see what the GTX 380 will bring?
September 17, 2009 9:22:11 AM

I would still suspend judgment until the real ones are released and reviewed. No point in mumbling over uncertain and unconfirmed "facts". Besides, anyone can say anything about something that currently doesn't exist.
September 17, 2009 9:59:09 AM

Gulli said:
And you know that because you looked into your crystal ball to see what the GTX 380 will bring?



I know that because the HD5870 benchmarks were done using a Phenom II 965 (not OC'd). Think about that for a second before responding.
September 17, 2009 10:02:30 AM

sedaine said:
I know that because the HD5870 benchmarks were done using a Phenom II 965 (not OC'd). Think about that for a second before responding.

sources? links?
September 17, 2009 10:06:11 AM

masterjaw said:
sources? links?



Suddenly the results are now in perspective!
September 17, 2009 10:17:54 AM

HD5870 Crysis Benchmark Score
CPU: AMD Phenom II X4 955BE
OS: Win 7 RTM
VGA: HD5870 1GB

Crysis 1900x1200 4xAA + 16xAF DX10 Very High
min: 30.**
avg: 43.**
max: 54.**
September 17, 2009 10:18:26 AM

To confirm whether you are saying bullsh*t.

And btw, until several sites post their versions of the review, I won't go along with your views. Conclusions based on just a single source are premature.
September 17, 2009 10:23:29 AM

masterjaw said:
To confirm whether you are saying bullsh*t.

And btw, until several sites post their versions of the review, I won't go along with your views. Conclusions based on just a single source are premature.



Yeah it was bullsh*t - it's not the 965, it's actually the 955BE, oops. My bad. Though from what some sites say it was OC'd to 3.5GHz.

September 17, 2009 10:29:56 AM

sedaine said:
Yeah it was bullsh*t - it's not the 965, it's actually the 955BE, oops. My bad. Though from what some sites say it was OC'd to 3.5GHz.


Doesn't matter whether the CPU was overclocked or not: the source is a shady rumor (there are NO official benchmarks out yet) and it doesn't say jack *** about the GTX 380.
September 17, 2009 10:35:22 AM

Gulli said:
Doesn't matter whether the CPU was overclocked or not: the source is a shady rumor (there are NO official benchmarks out yet) and it doesn't say jack *** about the GTX 380.


Why will the GTX 380 be better than the HD5870? What will it do that the HD5870 won't? 'Cause from the looks of things, the HD5870 can play anything thrown at it + support 6 frameless LCDs.

Tell me what you expect the GTX 380 to do!


September 17, 2009 10:39:43 AM

sedaine said:
Why will the GTX 380 be better than the HD5870? What will it do that the HD5870 won't? 'Cause from the looks of things, the HD5870 can play anything thrown at it + support 6 frameless LCDs.

Tell me what you expect the GTX 380 to do!


You could have asked the same question when the GTX 295 was about to be released: "what will it do that the 4870 X2 won't do?" In fact, ATI will release a 5870 X2, so obviously there is demand for cards more powerful than the 5870.
September 17, 2009 10:56:41 AM

I have heard rumours that nvidia is going to file for bankruptcy, like 3DFX did.
September 17, 2009 10:57:56 AM

I for one fully believe the GT300 will be faster than the 5870. I don't believe it will be better. The 5870 doubles the SP count to 1600, and if the GT300 more than doubles its count to 512 SPs, then it stands to reason that the upgrade of the faster GTX 280 will stay ahead of the upgrade of the 4870. If the GTX 280 is faster than the 4870 and you more than double on the faster side while only doubling on the slower side, the slower side will still be slower.

This doesn't mean the GT300 will be the better chip, however. It might consume more power, generate more heat, cost more due to the huge die size, and if it scales down as poorly as the GT200 did, it will leave Nvidia without a midrange or lower DX11 card. Unless something changes that we don't know about, I expect the GT300 to be faster than the 5870, but probably worse in every other measurable way.
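The scaling argument above reduces to a couple of multiplications. A sketch, where the last-generation performance gap is an assumed ratio (the SP counts are the rumored ones):

```python
# Sketch of the scaling argument: if last generation's faster chip scales
# its shader count by a larger factor than the slower chip, it stays ahead.
# The 1.10x last-gen gap is an assumption; the SP counts are rumored.

GTX280 = 1.10   # assumed relative performance vs. the HD 4870 (= 1.0)
HD4870 = 1.00

gt300  = GTX280 * (512 / 240)    # SP count more than doubles: 240 -> 512
hd5870 = HD4870 * (1600 / 800)   # SP count exactly doubles:   800 -> 1600

print(f"GT300 estimate: {gt300:.2f}, HD 5870 estimate: {hd5870:.2f}")
```

Under these assumptions the GT300 stays ahead, but the arithmetic says nothing about power, heat, die cost, or how well the design scales down, which is exactly the post's caveat.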
September 17, 2009 10:59:41 AM

Gulli said:
You could have asked the same question when the GTX 295 was about to be released: "what will it do that the 4870 X2 won't do?" In fact, ATI will release a 5870 X2, so obviously there is demand for cards more powerful than the 5870.



The only reason they released the GTX 295 and 4870 X2 is because GPUs were bottlenecking the poor i7s. Also, nothing on the market could answer "Can you play Crysis on it?" So they released them. The thing now is that you can play Crysis on the HD5870 with a Phenom II 955BE; the only reason AMD may release the other higher-end parts is so that all Phenoms get good fps in games, etc.

But if you own an i7 you'll get about 20% better results anyway. So I'd guess 50-60fps (avg) for Crysis with this card.

In time there will be a more demanding game released. HD5870 pwnd Crysis!!!! ;) 
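The 50-60fps guess is just the leaked 43fps average scaled by the post's own assumed 20% CPU uplift:

```python
# The fps guess is simple scaling. The 20% uplift is the post's own
# assumption about moving to a faster CPU, not a measured number.

LEAKED_AVG_FPS = 43.0   # average from the rumored Phenom II 955BE run
I7_UPLIFT = 0.20        # assumed gain from an i7 system

i7_avg = LEAKED_AVG_FPS * (1 + I7_UPLIFT)
print(f"estimated i7 average: {i7_avg:.1f} fps")
```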
September 17, 2009 11:01:10 AM

4745454b said:
I for one fully believe the GT300 will be faster than the 5870. I don't believe it will be better. The 5870 doubles the SP count to 1600, and if the GT300 more than doubles its count to 512 SPs, then it stands to reason that the upgrade of the faster GTX 280 will stay ahead of the upgrade of the 4870. If the GTX 280 is faster than the 4870 and you more than double on the faster side while only doubling on the slower side, the slower side will still be slower.

This doesn't mean the GT300 will be the better chip, however. It might consume more power, generate more heat, cost more due to the huge die size, and if it scales down as poorly as the GT200 did, it will leave Nvidia without a midrange or lower DX11 card. Unless something changes that we don't know about, I expect the GT300 to be faster than the 5870, but probably worse in every other measurable way.


The HD5870 is actually an upgrade of the 4870X2. Think about it.
September 17, 2009 11:04:28 AM

sedaine said:
The only reason they released the GTX 295 and 4870 X2 is because GPUs were bottlenecking the poor i7s. Also, nothing on the market could answer "Can you play Crysis on it?" So they released them. The thing now is that you can play Crysis on the HD5870 with a Phenom II 955BE; the only reason AMD may release the other higher-end parts is so that all Phenoms get good fps in games, etc.

But if you own an i7 you'll get about 20% better results anyway. So I'd guess 50-60fps (avg) for Crysis with this card.

In time there will be a more demanding game released. HD5870 pwnd Crysis!!!! ;) 


As we all know, a lot of people buy hardware they don't need; they just want the latest and greatest and will pay for it. That's why there will be a 5870 X2 and a GTX 380, and yes, there will be more demanding games in the future.
Then there are people who won't play at anything lower than 2560x1600.
September 17, 2009 11:04:53 AM

4745454b said:
I for one fully believe that the GT300 will be faster then the 5870. I don't believe it will be better. If the 5870 has double the SP at 1600, and if the GT300 more then doubles their count to 512SP, then it stands that the faster GTX280 upgraded card will be faster then the 4870 upgraded card. If the GTX280 is faster then the 4870 and you more then double on the faster side and only double on the slower side, the slower side will still be slower.

This doesn't mean the GT300 will be the better chip however. It might consume more power, generate more heat, cost more due to the huge die size, and if it scales down as well as the GT200, it will leave Nvidia without a midrange and lower DX11 card. Unless something changes that we don't know about, I expect the GT300 to be faster then the 5870, but will probably be worse in every other measurable way.



You bring up good points, too. The problem is not creating a fast card; the problem is creating a fast card under a certain power envelope. The points you brought up are the exact points I was making. AMD managed to improve performance dramatically while keeping power on par with the HD4870. In fact, one could argue it could potentially be more efficient, with the 28W idle.