GT300 Additional Info!

September 30, 2009 9:30:53 AM

http://www.brightsideofnews.com/news/2009/9/30/nvidia-g...

If it is true... damn! I have misjudged Nvidia by miles.
Wonder why they released the news this late. :o 

Edit: changed from question to discussion


September 30, 2009 10:00:52 AM

It's an interesting concept, but like everything else Nvidia tries, it will be strangled by their attempts to monopolise everything.

Nvidia's big problem is that everybody is their enemy, and right now they don't know if it's AMD or Intel who are their 'main' enemy. If they could actually work with AMD they'd have a chance with this, but they can't work with anyone.
September 30, 2009 10:12:34 AM

Then again, how good is the source? Ever since Charlie-gate I've been wary of those two.
They're calling the chip the GT300 again, and it's not, it's a G300. Also, some of the articles on the right side look dubious to me as well; I read a few and they look like tabloid writing as far as I can see.

Thanks for the link MB, but I'll take this with a pinch of salt if it's all the same to you.


Mactronix :) 
September 30, 2009 10:22:55 AM

Theo's been referring to the 'GT300' since April or earlier, and I still can't work out when it started being called that, or whether that's correct or incorrect. I suspect we may not know until an up-to-date picture of the HSF shows up; until then, keep the rumours coming, some of them may even turn out to be true. :) 
September 30, 2009 10:42:33 AM

The same thing was alluded to in THG's 5850 review, i.e. no information = no card anytime soon.
September 30, 2009 11:54:29 AM

Did Nvidia even stop to think small? I don't see how they will scale this down at all. Turn off 2 clusters like they did for the GTX260 and you're down to 448 "SPs". Turn off another 2 and you're down to 384 (could this be a mainstream GTX340 part?). My problem is that that's only 100+ more shaders than the GTX280. With such reductions, will it still outperform the GTX280? As a cut-down chip, will they have the same problem they had with the GTX2xx series, where the previous generation was faster/cheaper to make?

I had a thought while reading that article; I have no clue if this is possible. Would Nvidia even have to build a tessellator into it, or could they simply pass the work along as C++ instructions and let the SPs do it? It would be an interesting solution to the "where to put the tessellator on this massive chip" problem.
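Rough arithmetic behind those numbers, as a sketch (assuming the rumoured 512 "SPs" really are arranged as 16 clusters of 32, which is my guess, not a confirmed spec):

```cpp
// Back-of-envelope sketch of how disabling clusters scales the shader count.
// Assumes the rumoured 512 "SPs" are 16 clusters of 32 each -- not confirmed specs.
#include <cstdio>

int main() {
    const int clusters_total = 16;  // assumed full-die cluster count
    const int sp_per_cluster = 32;  // assumed shaders per cluster (512 / 16)

    for (int disabled = 0; disabled <= 4; disabled += 2) {
        int sps = (clusters_total - disabled) * sp_per_cluster;
        printf("%d clusters disabled -> %d SPs\n", disabled, sps);
    }
    // Prints 512, 448 and 384 -- matching the cut-down parts speculated above.
    return 0;
}
```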
September 30, 2009 12:07:10 PM

So we'll see a G350x2, but a G380x2 cannot exist without reducing the die size?

This is interesting... either Nvidia will act fast, or they won't have the fastest GPU this gen.
September 30, 2009 2:12:46 PM

Depending on how well this arch performs, it might be possible for the GTX360x2 to be as fast as or faster than the 5870x2. More so if AMD has to downclock the 5870 to make the 5870x2 work.
Anonymous
September 30, 2009 2:28:32 PM

Ohohoho, this will be a beast of a card. This is why Nvidia has been holding this chip back so long: they wanted to release something special, and this is the first real revolution in architecture since the G80. 512 cores, 6 GB of GDDR5 and 3 billion transistors take time to fit into one card. I'm just waiting to see real benchmarks and say bye-bye ATI.
September 30, 2009 4:49:34 PM

Don't hold your breath, john, you'll start walking into things. Even if these rumors are true, it looks like Nvidia wants to take on Intel/Larrabee more than AMD. With Eyefinity and a fast card, AMD could very well win this round.
September 30, 2009 5:40:10 PM

The 5870x2 will beat the GTX380 by more than the 4870x2 beat the GTX280. I hope the 5890 beats Fermi.
Anonymous
September 30, 2009 5:48:02 PM

rescawen said:
The 5870x2 will beat the GTX380 by more than the 4870x2 beat the GTX280. I hope the 5890 beats Fermi.


Hahah, very interesting. Nvidia will also have a GX2 card, don't worry....
September 30, 2009 6:25:39 PM

My opinion is that this process will start again, just like what happened when the GT2xx and 4xxx series came out: first the 4850/70 beat all the 9xxx GT/GTX cards, then Nvidia came out with the 260/280, which beat everything. Now we're at the same point again, just one cycle on, don't you think?

If this info is correct, it's very promising that the GT300 will be very competitive with the 5850/70.
September 30, 2009 6:36:00 PM

It will take a fairly big screwup by NVidia not to beat the 5870, but that of course is only half the goal. To have the fastest single card, they'll need an X2. Which again, I'm sure they can do, but at what cost? They had to use 260's last time, but this card will be even more power hungry than before. All I know for sure is that this fight is going to be interesting. Both are trying to go beyond the technology/process they have, so it will be tough. And just one quick clarification, the 260/280 came out first and was very high priced, then the 4800s came out and NVidia had to slash prices.
September 30, 2009 6:45:11 PM

magicbullet said:
Now we're at the same point again, just one cycle on, don't you think?

If this info is correct, it's very promising that the GT300 will be very competitive with the 5850/70.
I don't believe we are; ATI seems to be in a position to take the crown immediately, or to never let Nvidia have it at all, whether with the 5870x2 or a future endeavor (5890).

Then you have to wonder how much you're spending on an Nvidia GPU and how much extra you're spending on what could very well be a useless onboard CPU.

Either way, I find multi-monitor resolution gaming more relevant to my user experience than GPGPU encoding.
September 30, 2009 6:47:19 PM

magicbullet said:
My opinion is that this process will start again, just like what happened when the GT2xx and 4xxx series came out: first the 4850/70 beat all the 9xxx GT/GTX cards, then Nvidia came out with the 260/280, which beat everything. Now we're at the same point again, just one cycle on, don't you think?

If this info is correct, it's very promising that the GT300 will be very competitive with the 5850/70.



Actually, the 260/280 came out before the 4870/4850. When the 4870/4850 were released, Nvidia had some serious rethinking to do on the 260/280 prices.
September 30, 2009 9:30:07 PM

homerdog said:
NVIDIA's Fermi: Architected for Tesla, 3 Billion Transistors in 2010

So it's Q1 2010 at the earliest. This is very disappointing but not entirely unexpected I suppose.

PS if I joined in 1970, future me must have invented a time machine.


It's good we finally have a reliable source linked. We still have no idea how it will perform though, so the wait is still on and the Nvidia fans can stay in their boxes for a while longer :lol: 

Mactronix
September 30, 2009 9:35:25 PM

It looks to me like Nvidia is trying to compete with Intel and AMD; the only problem Nvidia will have is the cost of the GT300. The AMD 5870 and 5850 will drop in price once Nvidia's cards become available.
September 30, 2009 10:08:32 PM

That's it folks...

A completely new chip, a revolutionary/innovative design and a ton of brute-force processing power. In addition, modular cores and a scalable interface make it easy to spin off simpler versions (and improve yields by recovering chips that don't have all cores running).

If it all comes together right, behold the new 8800GTX: the kind of card that will cost an arm and a leg, but will last 3+ years and leave the new 5870 eating dust. If history repeats itself, it will take about two generations for ATI to catch up.

I wonder what the ATI fanboys will say about it now... Well, maybe they'll say it will be expensive, or the power consumption will be high, or that they'll wait to see some benches and a real working card before saying anything... It would be better to just face the inevitable: second place and the midrange market is where you belong, dear newborn 5870... Enjoy your few days at the top, because you're going to the mid-shelf row in a heartbeat...

Now I want Christmas to come even more... That'll make a great gift for my machine!

September 30, 2009 10:10:24 PM

I think Nvidia is going in a totally wrong direction, and that's only because they can't do anything on the 3D gaming front to counter ATI. Their architecture just doesn't give them any room to move.

The thing is that they effectively admit they have no choice. Being pushed into a corner and then saying you wanted to be in that corner makes it plain obvious you're in serious trouble.

Putting all their eggs in the GPGPU basket and going strictly in that direction will hurt or even kill them if they can't succeed. That's a very risky move on Nvidia's side.

Anyway, we will wait and see what the actual performance of this beast is, and of course the price :) .

Just one note: with so many new things, it must be, quote, "fckng hard" to write good drivers that utilize that thing well. Anyone else smell the kind of driver problems ATI had at first? :) 
September 30, 2009 10:11:36 PM

GT300 seems pretty awesome. However, the chip is huge, so lower yields = expensive. I also want to point out that the power consumption will be extremely high. With all the extra stuff for compute...you'd have to be crazy to hold your breath on graphics performance before any real benchmarks are out.
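For anyone wondering why a huge chip hurts yields so much, here's a rough sketch using the simple Poisson yield model, yield ≈ e^(-defect density × die area); the defect density and die areas below are illustrative guesses on my part, not real TSMC 40nm figures:

```cpp
// Crude yield comparison with the Poisson model: yield = exp(-defect_density * die_area).
// The defect density and die areas are illustrative guesses, not real TSMC 40nm numbers.
#include <cstdio>
#include <cmath>

int main() {
    const double defects_per_cm2 = 0.4;  // assumed defect density
    const double area_small_cm2  = 3.3;  // ~330 mm^2 class die (assumed, Cypress-ish)
    const double area_big_cm2    = 5.3;  // ~530 mm^2 class die (assumed, Fermi-ish)

    double yield_small = std::exp(-defects_per_cm2 * area_small_cm2);
    double yield_big   = std::exp(-defects_per_cm2 * area_big_cm2);

    printf("small die yield ~%.0f%%, big die yield ~%.0f%%\n",
           100.0 * yield_small, 100.0 * yield_big);
    // The bigger die also gives fewer candidates per wafer, compounding the cost gap.
    return 0;
}
```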
September 30, 2009 10:15:30 PM

vmardegan said:
That's it folks...

A completely new chip, a revolutionary/innovative design and a ton of brute-force processing power. In addition, modular cores and a scalable interface make it easy to spin off simpler versions (and improve yields by recovering chips that don't have all cores running).

If it all comes together right, behold the new 8800GTX: the kind of card that will cost an arm and a leg, but will last 3+ years and leave the new 5870 eating dust. If history repeats itself, it will take about two generations for ATI to catch up.

I wonder what the ATI fanboys will say about it now... Well, maybe they'll say it will be expensive, or the power consumption will be high, or that they'll wait to see some benches and a real working card before saying anything... It would be better to just face the inevitable: second place and the midrange market is where you belong, dear newborn 5870... Enjoy your few days at the top, because you're going to the mid-shelf row in a heartbeat...

Now I want Christmas to come even more... That'll make a great gift for my machine!



Dude, you make me laugh. Not because of the rambling you just spewed, but because you're such a blind fanboy that all that matters to you is that your Nvidia wins. Such shallow thinking. Try to get yourself out of that hole and you will see a better world.
September 30, 2009 10:31:42 PM

Very interesting articles. It would have been nice if they had released more info, but I suppose this early on they don't know yet (power and clocks, that is). On the compute side some of those improvements look huge. And keeping the memory bus at a more reasonable 384 bits should also help with price.
September 30, 2009 10:35:23 PM

And to those proclaiming NVidia's domination already, that article was 90% about compute abilities, not graphics. While it is very likely the card will be fast, probably faster than the 5800s, the question is how much faster. Today's articles left too much out to even guess, as real world performance often differs from paper and we didn't even get the full paper.
September 30, 2009 10:35:27 PM

I just hope the competition is even fiercer than the one between the GT200 and the 4000 series. That will mean only one thing: faster, better technology, as competition drives innovation, and of course reasonable prices :D 

$380 to play games is not all that reasonable ;)  (but the same applies to $600 for an iPhone :D )
September 30, 2009 10:38:29 PM

How believable is this news?
September 30, 2009 10:40:07 PM

Those links above post info directly from Nvidia, though not much about the graphics side is indicated. The dates they list are also theoretical, but they sound reasonable assuming all goes as planned from here on out.
September 30, 2009 10:40:22 PM

The thin line everyone had to catch is "we are sure it WILL be faster". So they don't KNOW for sure yet :D . They hope, yeah, but they're not sure. I guess they don't even have drivers ready, since they only got the first working chips a few days ago. That's the main reason they didn't say how much faster, not the bullshit "we don't want it to affect our current sales", pfff.

The only thing that will hurt them is failing to give people assurance that it will be faster and worth waiting for. And believe me, if they were SURE, they would have told us.
September 30, 2009 10:58:07 PM

The day when Nvidia finally admitted defeat, and retreated from the gaming market.

This will be lucky to beat the 5870. Both have doubled their transistors, but Nvidia has spent a lot more of them on non-gaming stuff. All I can think of is that they intend to bribe the gaming industry with hundreds of millions of dollars to get this adopted. Otherwise, nobody would bother.
September 30, 2009 11:03:23 PM

The fact that it runs C++ natively is quite interesting and nearly sells me on getting one when it comes out, just to mess with it. I'm sure Nvidia will set up a decent price-vs-performance card to compete with ATI; I mean, it's not like ATI are slouches now, unlike when the 2k series was released and Nvidia could do whatever they wanted with prices.

It is a bit concerning that they don't know yet how it stacks up against ATI's flagship in games; maybe they are just unsure about the drivers and would rather reserve judgment till later. Either way, it is turning out to be a very interesting time for Nvidia.
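To give an idea of what "runs C++ natively" means in practice, here's a minimal toy CUDA C++ kernel of my own (not Nvidia sample code, and the GT300 details are still unknown); this is the kind of thing you compile with nvcc and launch from ordinary host code:

```cpp
// Minimal sketch of GPU code written in (a subset of) C++ with CUDA.
// My own toy example, not an official Nvidia sample: each GPU thread scales one element.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = float(i);

    float *dev = 0;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);  // 4 blocks of 256 threads

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("host[10] = %.1f\n", host[10]);  // expect 20.0
    return 0;
}
```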
September 30, 2009 11:07:27 PM

No chance you'll see one of these before Christmas.
September 30, 2009 11:08:06 PM

Yeah, gaming-wise that didn't look as impressive. Sure, there are a lot of improvements, but mainly on the compute-related stuff, which doesn't do anything for games.
But then again, just doubling pretty much everything gives you a lot of brute force to throw at games anyway...
September 30, 2009 11:11:40 PM

Exactly. Nothing amazing or surprising on the graphics end, but doubling everything except maybe memory will still get you places in games.
September 30, 2009 11:14:59 PM

It will be 'capable' of running games great, of that I have no doubt.

The problem is, it will take programming especially for it. No games dev is going to be arsed with that... unless Nvidia starts throwing an awful lot of cash at them. It would take a lot of cash, though; perhaps this is Nvidia's last throw of the dice and they are desperate.

This cannot fail for Nvidia - think what that means. However, ATI cannot allow it to get widespread adoption at the expense of their stuff either. We really are at make-or-break time, I feel.

Best solution

October 1, 2009 12:04:09 AM

vmardegan said:
That's it folks...

Now I want Christmas to come even more... That'll make a great gift for my machine!


More like a Valentine's Day gift for your machine. Guess you didn't read the Anand article posted above:

"The price is a valid concern. Fermi is a 40nm GPU just like RV870 but it has a 40% higher transistor count. Both are built at TSMC, so you can expect that Fermi will cost NVIDIA more to make than ATI's Radeon HD 5870.

Then timing is just as valid, because while Fermi currently exists on paper, it's not a product yet. Fermi is late. Clock speeds, configurations and price points have yet to be finalized. NVIDIA just recently got working chips back and it's going to be at least two months before I see the first samples. Widespread availability won't be until at least Q1 2010.

I asked two people at NVIDIA why Fermi is late; NVIDIA's VP of Product Marketing, Ujesh Desai and NVIDIA's VP of GPU Engineering, Jonah Alben. Ujesh responded: because designing GPUs this big is "FAQ'ing hard"."



So as we've been saying for a while, it looks like 2010 for anything real, and also, like we said months ago, nV will keep launching paper until then to keep the fanbois waiting, just like the NV30/FX strategy and the R600 strategy.

Can't run games on paper, and can't develop for them on paper either, especially not with a design this complex; emulation would be pointless.

The strange takeaway from these early articles is that nV's tessellation will not be in hardware? That makes things interesting: it won't technically be DX11-capable hardware if it needs to emulate components in software, which blurs what it is to be 'compliant' versus 'capable'.
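To make the 'tessellation emulated on the shader cores' idea concrete, here's a toy sketch of uniform triangle subdivision written as a compute kernel; this is purely my own illustration of the principle, not nV's actual approach and not the DX11 hull/domain shader pipeline:

```cpp
// Toy sketch: uniform triangle tessellation done "in software" on the shader cores.
// My own illustration of the principle only -- not Nvidia's design or the DX11 pipeline.
#include <cstdio>
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };

// One thread per new vertex: interpolate the input triangle at barycentric grid point (i, j).
__global__ void tessellate(Vec3 a, Vec3 b, Vec3 c, int factor, Vec3 *out) {
    int i = blockIdx.x;           // 0 .. factor
    int j = threadIdx.x;          // 0 .. factor
    if (j > factor - i) return;   // outside the triangle, nothing to emit

    float u = float(i) / factor;
    float v = float(j) / factor;
    float w = 1.0f - u - v;

    Vec3 p;
    p.x = u * a.x + v * b.x + w * c.x;
    p.y = u * a.y + v * b.y + w * c.y;
    p.z = u * a.z + v * b.z + w * c.z;

    out[i * (factor + 1) + j] = p;  // simple (sparse) output layout
}

int main() {
    const int factor = 4;                              // tessellation factor
    const int maxVerts = (factor + 1) * (factor + 1);
    Vec3 a = {0, 0, 0}, b = {1, 0, 0}, c = {0, 1, 0};

    Vec3 *dOut = 0;
    cudaMalloc(&dOut, maxVerts * sizeof(Vec3));
    tessellate<<<factor + 1, factor + 1>>>(a, b, c, factor, dOut);

    Vec3 hOut[maxVerts];
    cudaMemcpy(hOut, dOut, maxVerts * sizeof(Vec3), cudaMemcpyDeviceToHost);
    cudaFree(dOut);

    int k = 1 * (factor + 1) + 1;                      // grid point (1, 1)
    printf("vertex (1,1): %.2f %.2f %.2f\n", hOut[k].x, hOut[k].y, hOut[k].z);
    return 0;
}
```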
October 1, 2009 12:08:18 AM

Exactly, GGA!

They made hardware which is very general, and I'm sure it will be capable of running DX11; however, if it's emulated by drivers, we don't know what kind of utilization it will end up with. Also, it will be MUCH easier for developers to just use ATI hardware for development, and now Nvidia is in the position of adjusting to the market.
October 1, 2009 12:23:44 AM

TheGreatGrapeApe said:
Ujesh responded: because designing GPUs this big is "FAQ'ing hard"."

NVIDIA hit a home run with common sense, too bad they didn't actually run with it.
October 1, 2009 12:24:50 AM

I remember a conversation elsewhere about there being no fixed-function tessellator, and it didn't get far enough. The question was thought to have been answered by the compliance requirement, so maybe not; the 5870 has one, but it can also be done in compute shaders.
As for everything else, it's exactly what we've been hearing, and I think it's mostly been Charlie leading the way, despite it all, as he's been as consistent as those with the "insider facts". So maybe no CUDA for him, but certainly at least kudos.
October 1, 2009 12:29:59 AM

Charlie Demerjian has been telling us for the past 10 months that this was going to happen. Just about everything he has said has come true, and when we are in 2010 and still no G300 is available, he'll have been 100% correct.

You have to give some credit to Theo Valich also; strangely enough, he was the first one to really push how much it would be a 'cGPU', and also to give reasonably close specifics on the architecture.

Strange but true, these guys do know more about the truth than most do. At least this time around they did.
October 1, 2009 12:32:29 AM

When you report every rumour as fact you have to get it right eventually.
October 1, 2009 12:41:01 AM

Quote:
The strange takeaway from these early articles is that nV's tessellation will not be in hardware? That makes things interesting: it won't technically be DX11-capable hardware if it needs to emulate components in software, which blurs what it is to be 'compliant' versus 'capable'.


Is this in the Anand article? I came to this conclusion last night in a different thread about the G(T)300, I'm surprised if I got this right.
October 1, 2009 3:23:40 AM

Wow, if it is going to be emulating some of the "required" components of DX11, it will need all the processing power it's got. So much for 'both fully on the DX11 wagon'.
October 1, 2009 3:27:26 AM

EXT64 said:
Wow, if it is going to be emulating some of the "required" components of DX11, it will need all the processing power it's got. So much for 'both fully on the DX11 wagon'.

Nvidia did say that they don't feel DX11 is all that important, and clearly they don't; after seeing the BattleForge results they might have a point.
October 1, 2009 3:32:03 AM

Ouch. Yeah, I will agree that DX11 hasn't proven its value yet, but we really need to move beyond DX9, so I'm hoping it will work in the end. Let's wait for a full DX11 game to come out before we write it off though. I really hope we don't have to wait for a DX11 console to come out before it takes hold.
October 1, 2009 3:39:21 AM

Yeah, I was a bit below the belt on that one but it does sort of show that for the immediate future it's not really worth getting your panties in a twist about DX11 IMHO.
October 1, 2009 3:46:07 AM

Definitely true. As we saw with the DX10 fiasco, until a real game actually shows the benefits (visually or FPS) that DX might as well not exist. The only thing I'd advise against now is getting an expensive DX10 card. Anyone going high end might as well look at ATI's offerings or wait for NVidia's DX11 line.
October 1, 2009 4:02:47 AM

Compare original DX9 games with current ones and see what DX9 can really do. It takes years before an API is exploited completely; no first-gen game can do that.