The GF310 is here!!!

rawsteel

Distinguished
Oct 5, 2006
538
0
18,990
NOBODY can keep up with nVidia's mess any more!

I literally don't have enough brain cells to remember all the schemes in their naming/renaming, etc.

Now there's a 300 series, but it's from an older arch: no DX11, only DX10.1.

However, the GT 240 also has DX10.1 but is still in the 200 series.

It's like they are intentionally trying to make it IMPOSSIBLE to remember it all, so you end up buying the wrong card by mistake :D
 

jennyh

Splendid
Well, I'm just gonna assume Nvidia are up to their usual tricks, which are ofc based around getting as much cash as possible at the expense of the consumer who doesn't know better.

On the other hand, I'll also take this as an admission that there will be no low-end Fermi parts, and no low-end DX11 parts, for the foreseeable future.
 
Well, I believe this to be a G200-based card, as the other DX10.1 cards are.
Scaling down, the G200 doesn't seem to do well, and it's been shown elsewhere that compute density was somewhat higher on the G92 than the G200, which is what we may be seeing here.
If nVidia can't scale down their G300 chips, this isn't a good thing for them at all, meaning the low end will be dominated by ATI, as the 4xxx and 5xxx series scale nicely.
 
The bad part here is, nVidia is only just now delivering DX10.1 at the entry level of PC gaming.
Many of these buyers will get their first taste of PC gaming using these cards in DX10.1.
I would like to see nVidia lead in DX11, not complain about and put down DX10.1, not lack such a card until the next DX iteration comes out, and not be a day late and a dollar short.

We need nVidia to do well here as well as on the high end
 


Well, G80 lasted 4 generations (or 3 if this is G200-based); looks like the low-end Fermi part will be the GF610/710, after 3/4 more gens with a G200 solution. :whistle:

And here they said the Fermi architecture was so much more scalable than the ATi architecture; it sure doesn't look like it so far. But then again, nVidia never was one for perfecting processes on the previous low end and migrating from top to bottom quickly, so this is par for the course, and not indicative of the new direction they said they were taking. :pfff:


 

invisik

Distinguished
Mar 27, 2008
2,476
0
19,810
Not this again. I'm starting to get pissed at this goddamn company. Their low-end 8 series is the same as their 9 series. Now the G200 series is a minor improvement (DX10.1), and the G300 series is a rebranded G200. (Talking about the low end.)
They need to step it up on their low-end lineup.
 
Just looks like a dedicated low-profile HTPC card to me, no biggie. As TGGA says, it's just a rebranded part, as I think was quite probable from the start.
People didn't actually think it was a Fermi part just because it started with a 3, did they? :ouch:

I have been doing a bit of reading about Fermi and I'm actually quite impressed by what's being claimed. It just remains to be seen how "claimed" stacks up against how it performs.
As far as I can see, all the numbers being quoted are for the full-fat Tesla version of the card. As far as I know, we don't yet know what the GeForce version will be spec-wise.
I may be wrong here, as I have really only just started showing an interest in this card.
nVidia reckon that the full-fat Tesla card can ray trace fast enough for real-time gameplay. That's either [:lectrocrew:5] or :pt1cable:, can I have some of what they're on?
Should be interesting.

Mactronix
 


Can you point me at something about this, please? As I said, I've only just now really started to get interested in the possibilities (or not) of this card.

Thanks

Mactronix
 
Speculation here, but some things to think about.
Look at the Intel cheeseburger/glued approach vs the monolithic Phenom.
It took AMD how long before we saw dual cores?
In ATI's lower-end cards, they already had/have the 4770/5770, so it'll be earlier, and it's tested and true.
Just a few thoughts to ponder.

Now, add in the poor performance we're seeing scaling the huge G200 down to the low end; it doesn't show the performance increases we'd hope to see going from G92 to G200 at the same size, etc.
 


Well it also depends on the game they are playing, and resolution.

Remember that the HD3870 & 4870 were doing real-time ray-traced gameplay over a year ago;

http://www.tgdaily.com/hardware-features/38145-watch-out-larrabee-radeon-4800-supports-a-100-ray-traced-pipeline-using-directx-9

So a generalized statement like that is just too vague on the surface. It sounds impressive, but it's like saying my CPU can render a game in real time... but how impressed are you when I tell you it's the original Quake @ 640x480 with 16-bit colour?
 
As far as I can see, JD, those slides and the general discussion are based around the full-fat Tesla chip. Now, I know you seem to be much more connected to the pulse than me, but what I'm hearing is that the GeForce version, the one Joe Public buys for gaming, may well not have the whole 512 cores. Who knows what else might get trimmed?
The BSN article even had an update saying the Tesla guys had got hold of them and indicated as much.

Mactronix
 

randomizer

Champion
Moderator
[image: oldnewnvidia.gif]
 

Ahh yes the OEM line, sorry I missed that.


GeForce Products - only available in prebuilt (OEM) systems
300 Series
GeForce 310

200 Series
GeForce GTS 240
GeForce 205

100 Series
GeForce GTS 150
GeForce GT 130
GeForce GT 120
GeForce G100