
The GF310 is here!!!

November 25, 2009 1:40:37 PM

Sporting DX10.1 and no other new specs, just a new name!!!
http://h10010.www1.hp.com/wwpc/uk/en/sm/WF06c/A1-329290...

My question is, what're they gonna call the low-end Fermi-type card, the GF410?


November 25, 2009 1:45:02 PM

What is that thing? It looks too small to be any good.
November 25, 2009 1:46:40 PM

Wow, that is underwhelming.
November 25, 2009 2:03:39 PM

NOBODY can keep up with nVidia's mess any more!

I literally don't have enough brain cells to remember all their naming/renaming schemes.

Now a 300 series, but from the older arch: no DX11, just DX10.1.

However, the GT 240 has DX10.1 but is still 200 series.

It's like they are intentionally trying to make it IMPOSSIBLE to keep track of it all, so you somehow buy the wrong thing by mistake :D 
November 25, 2009 2:19:51 PM

Well, I'm just gonna assume Nvidia are up to their usual tricks, which are ofc based around getting as much cash as possible at the expense of the consumer who doesn't know better.

On the other hand, I'll also take this as an admission that there will be no low-end Fermi parts, and no low-end DX11 parts, for the foreseeable future.
November 25, 2009 2:28:16 PM

Well, I believe this to be a G200-based card, as the other DX10.1 cards are.
Scaling down, the G200 doesn't seem to do well, and it's been shown elsewhere that compute density was somewhat higher on the G92 than on the G200, which is what we may be seeing here.
If nVidia can't scale down their G300 chips, this isn't a good thing for them at all, meaning the low end will be dominated by ATI, as the 4xxx and 5xxx series scale nicely.
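To put rough numbers on that compute-density point, here's a back-of-envelope sketch using the commonly cited reference-board specs for the two chips (die sizes and the 3-FLOPs-per-clock shader figure are approximate, and clocks vary by board):

Code:
# Back-of-envelope compute density (GFLOPS per mm^2).
# All specs are the commonly cited ones; treat every number as approximate.

def shader_gflops(sps, shader_clock_ghz, flops_per_clock=3):
    # Theoretical shader throughput: SPs x clock x 3 (MAD + MUL dual-issue).
    return sps * shader_clock_ghz * flops_per_clock

cards = {
    # name: (shader processors, shader clock in GHz, die area in mm^2)
    "G92 (9800 GTX)":  (128, 1.688, 324),
    "GT200 (GTX 280)": (240, 1.296, 576),
}

for name, (sps, clk, area) in cards.items():
    g = shader_gflops(sps, clk)
    print(f"{name}: ~{g:.0f} GFLOPS / {area} mm^2 = {g / area:.2f} GFLOPS/mm^2")

On those numbers the G92 comes out around 2.0 GFLOPS/mm² against roughly 1.6 for the GT200, which is the kind of gap the "higher compute density on G92" claim is pointing at.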
November 25, 2009 2:34:20 PM

The bad part here is, nVidia is only now delivering DX10.1 at the entry level for PC gaming.
Many of these buyers will get their first shot at PC gaming using these cards in DX10.1.
I would like to see nVidia lead in DX11, not complain about and put down DX10.1, not lack such a card till the next DX iteration comes out, and not be a day late and a dollar short.

We need nVidia to do well here as well as on the high end.
November 25, 2009 3:56:44 PM

JAYDEEJOHN said:

My question is, what're they gonna call the low-end Fermi-type card, the GF410?


Well, G80 lasted for 4 (or 3, if this is G200-based) generations, so it looks like the low-end Fermi part will be the GF610/710, after 3 or 4 more gens with a G200 solution. :whistle: 

And here they said the Fermi architecture was so much more scalable than the ATi architecture. It sure doesn't look like it so far, but then again nVidia never was one for perfecting processes on the previous low end and migrating from top to bottom quickly, so this is par for the course, and not indicative of the new direction they said they were taking. :pfff: 


November 25, 2009 3:59:21 PM

I can't figure it out, except that the low-end Fermi parts are very, very late. Like a year from now late.

Why else would they release a low-end part starting with a '3' now?
November 25, 2009 3:59:45 PM

Nvidia, come on, you suck. :pfff: 
Ever wonder why only a few of your cards ever sell any more, while the low-end slime takes on dust and warehouse space?
November 25, 2009 4:04:49 PM


A quick look at the product code (VG885AA) shows it's the same thing people were selling as the G210/GT218 a few months ago, so it looks like a simple re-re-naming of a few-month-old part. :heink: 
November 25, 2009 4:09:18 PM

Not this again. I'm starting to get pissed at this goddamn company. Their low-end 8 series is the same as their 9 series. Now the G200 series is a minor improvement (DX10.1), and the G300 series is rebranded G200. (Talking about the low end.)
They need to step it up on their low-end scale.
November 25, 2009 4:44:01 PM

Just looks like a dedicated low-profile HTPC card to me, no biggie. As TGGA says, it's just a rebranded part, which I think was quite probable from the start.
People didn't actually think it was a Fermi part just because it started with a 3, did they :ouch: 

I have been doing a bit of reading about Fermi and I'm actually quite impressed by what's being claimed. It just remains to be seen how "claimed" stacks up against how it performs.
As far as I can see, all the numbers being quoted are for the full-fat Tesla version of the card. As far as I know, we don't yet know what the GeForce version will be spec-wise.
I may be wrong here, as I have really only just started showing an interest in this card.
nVidia reckon that the full-fat Tesla card can raytrace fast enough for realtime gameplay. That's either [:lectrocrew:5] or :pt1cable:  can I have some of what they're on.
Should be interesting.

Mactronix
November 25, 2009 4:57:29 PM

Problem is, it looks like Fermi won't scale well, though, just as we're seeing with the G200 rebrands/DX10.1.
It may turn out nVidia has a decent top card, then a lot of, well, who knows? Doesn't look great at this point.
November 25, 2009 5:09:13 PM

JAYDEEJOHN said:
Problem is, it looks like Fermi won't scale well, though, just as we're seeing with the G200 rebrands/DX10.1.
It may turn out nVidia has a decent top card, then a lot of, well, who knows? Doesn't look great at this point.


Can you point me at something about this please? As I said, I've only just now started to get interested in the possibilities (or not) of this card.

Thanks

Mactronix
November 25, 2009 6:40:06 PM

Speculation here, but some things to think about.
Look at the Intel cheeseburger/glued approach vs the monolithic Phenom.
It took AMD how long before we saw dual cores?
In ATI's lower-end cards, they already had/have the 4770/5770, so it'll be earlier, and it's tried and true.
Just a few thoughts to ponder.

Now, add in the poor perf we're seeing scaling the huge G200 down to the low end: it doesn't show the perf increases we'd hope to see going from G92 to G200 at the same size, etc.
November 25, 2009 8:50:05 PM

mactronix said:

nVidia reckon that the full-fat Tesla card can raytrace fast enough for realtime gameplay. That's either [:lectrocrew:5] or :pt1cable:  can I have some of what they're on.


Well, it also depends on the game they are playing, and the resolution.

Remember that the HD 3870 & 4870 were doing real-time ray-traced gameplay over a year ago:

http://www.tgdaily.com/hardware-features/38145-watch-ou...

So a generalized statement like that is just too vague on the surface. It sounds impressive, but it's like saying my CPU can render a game in real time... but how impressed are you when I tell you it's the original Quake @ 640x480 with 16-bit colour?
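To put rough numbers on why resolution matters so much here, a quick sketch of the minimum ray throughput a given target needs (assuming, purely for illustration, one primary ray per pixel; shadows, reflections and anti-aliasing multiply this several times over):

Code:
def rays_per_second(width, height, fps, rays_per_pixel=1):
    # Minimum ray budget: every pixel traced once per frame.
    return width * height * fps * rays_per_pixel

scenarios = [
    ("Quake-era 640x480 @ 30 fps",   640,  480, 30),
    ("Modern 1920x1080 @ 60 fps",   1920, 1080, 60),
]

for name, w, h, fps in scenarios:
    print(f"{name}: ~{rays_per_second(w, h, fps) / 1e6:.0f} Mrays/s minimum")

That's roughly 9 Mrays/s for the Quake case against about 124 Mrays/s for the modern one, more than a 13x gap before a single secondary ray is cast, which is why "can raytrace in realtime" means nothing without the resolution and settings attached.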
November 26, 2009 6:13:17 AM

I thought after making generations of 8800 cards with new names, we would finally see something new, but I guess we won't.
November 26, 2009 7:43:39 AM

As far as I can see, JD, those slides and the general discussion are based around the full-fat Tesla chip. Now, I know you seem to be much more connected to the pulse than me, but what I'm hearing is that the GeForce version, the one Joe Public buys for gaming, may well not have the whole 512 cores. Who knows what else might get trimmed?
The BSN article even had an update saying the Tesla guys had got hold of them and indicated as much.

Mactronix
November 26, 2009 7:50:27 AM

[GIF of the GeForce OEM product listing, from the NVIDIA site]
November 26, 2009 8:30:08 AM

Has this card been found anywhere else other than the HP site?
November 26, 2009 8:39:56 AM

Er... didn't you notice the GIF above? It's on the NVIDIA site.
November 26, 2009 8:43:31 AM

randomizer said:
Er... didn't you notice the GIF above? It's on the NVIDIA site.

Ahh yes the OEM line, sorry I missed that.


GeForce Products - only available in prebuilt (OEM) systems
300 Series
GeForce 310

200 Series
GeForce GTS 240
GeForce 205

100 Series
GeForce GTS 150
GeForce GT 130
GeForce GT 120
GeForce G100
November 26, 2009 8:44:37 AM

G100? Wasn't that the rumoured 8800GTX killer (AKA 9800GTX) about 2 years ago?
November 26, 2009 11:34:05 AM

We've long since lost track, random.
November 26, 2009 11:39:06 AM

jennyh said:
We've long since lost track, random.


+1
I still don't understand how it can be said it won't scale well when we don't know what it is yet? I'm well confused at this point :) 

Mactronix
November 26, 2009 11:44:22 AM

mactronix said:
+1
I still don't understand how it can be said it won't scale well when we don't know what it is yet? I'm well confused at this point :) 

Mactronix

I can't help with your confusion mate, but maybe some fluffy kittens will help in some way.
November 26, 2009 1:42:09 PM

mactronix said:
As far as I can see, JD, those slides and the general discussion are based around the full-fat Tesla chip. Now, I know you seem to be much more connected to the pulse than me, but what I'm hearing is that the GeForce version, the one Joe Public buys for gaming, may well not have the whole 512 cores. Who knows what else might get trimmed?
The BSN article even had an update saying the Tesla guys had got hold of them and indicated as much.

Mactronix

Keep your eye on that thread, as they do mention a cut-down 360, and other possibilities.
Like I said, you just have to know who's saying what and when, and sift out the other useful, but not necessarily real, things.
November 26, 2009 2:15:06 PM

It just gets worse for Nvidia.

http://tech.tbreak.com/2009/11/nvidia-past-present-futu...

Nv invented PhysX now, same as they invented the GPU.

Quote:
"If we dont revolutionize the GPU again then we’ve already hit the wall. Thats exactly the reason why we invented PhysX and why we’re gonna bring ray-tracing to the market place."


This one is mind-blowing:

Quote:
I’m not cynical about the market but If you give them the same thing over and over again, they’ll get tired.


I guess nobody will understand that point better than JHH. I swear you couldn't make this stuff up.
November 26, 2009 2:23:31 PM

JHH: "We will kick Intels ass when we go to court next year."

:lol: 

Poor Huang: while busy "inventing" PhysX, he forgot that threats and trash-talking Intel already backfired, and they lost a market worth more than a billion. Still he hasn't learned a thing... Shareholders HAVE to cut him loose if they want Nvidia to survive in the long run, because he is ruining the company, fast.
November 26, 2009 2:36:36 PM

Again:

Who needs tessellation, DX10.1 or DX11?
I find his statements ridiculous. Crysis brought some advanced things into gaming, and that's what we all want; things like tessellation and better DX models make those things easier for devs and HW.
Him preaching against all this, and putting his little proprietary thing above it, is sad, all the while we see his company creating nothing new, just more rebranding, slowing dev progress in games, etc.
He and nVidia just have to know by now that anything they say will be scrutinized, and not just by gamers, as he's mentioned Intel many a time in not-so-favorable ways.
It gets attention, and yes, putting a picture of the coming best thing ever on Facebook does keep people talking about his product, but in what ways?
Even if Fermi/G300/GF100 or whatever comes out and does great, that doesn't change the fact it'll be late. And now it'll even have to be better than it otherwise would, since it's carrying all the baggage nVidia has accumulated over this time with their actions and non-actions, renaming, holding up fake cards and so on. So there'll be no shock and awe, no matter what.
November 26, 2009 2:53:21 PM

I know what you are saying, JD, but the simple truth is this.
It doesn't matter one single bit what they said or did, even as recently as yesterday. The memory of the consumer only lasts until the next good new thing turns up.
If Nvidia release Fermi and it's better than the 5 series, then it sells. Simple facts are what reviews (the good ones, LOL) deal with, so it really doesn't matter to the general public what's happened in offices or around boardroom tables.
It won't matter to the people who just want the latest and greatest: if they blow up the moon on Tuesday, people will buy the GPU on Wednesday if it's the best there is.
The baggage and the lateness probably only matter to a small percentage of users.
If it's good, it sells and Nvidia rule again; if it's crap, then they may as well leave town. For that reason I can't see it being bad.

Mactronix
November 26, 2009 3:11:14 PM

But it does matter. Consumers now are tired of this crap. Can you find a 58xx series card?
And there's more to come, and there's lots of time, and there'll be more nVidia crap coming, and a lot of nothing as well.
Now, by the time nVidia does come out, it'll be a day late and a dollar short, as the ATI refresh will be around the corner, drivers will have matured, etc.
That's pressure no great review will relieve and no perf will fix, and I haven't even mentioned CPU bottlenecks, which will play a part this gen, at least until the DX11 games come to challenge these cards; and again, with no product, nVidia has no influence on that side either.
So sure, it could be a great card, but imagine if it were already here. And if there were none of their continual rebranding, no fake holdups, no face(palm)books? How much better would it be received?
They'll have to hit it out of the park in everything for it to be anywhere near what might have been, and everything isn't going to happen. That would once have been excusable, but not any longer.

PS And consider this: if it doesn't completely own, do you actually think these theatrics will be forgotten? Even if it's a great card? No reviewer will mention these things?

Their low end is already being poorly received, and they've lost design wins, as OEMs are dropping their low end for ATI cards, with better ones to come, and there's not a mention or peep of competitive mid and low end coming.
It's why I say keep an eye on that and maybe a few other links about low-end/mid parts coming from nVidia, as this IS their bread and butter; EOLing their old DX10 cards has left the channel bare as well, while losing OEM wins on the low end, simply because of this foolishness, rebranding, etc.
To all those who defended the rebranding and the moral etc. choices nVidia has made lately: it appears it doesn't work, even in a dog-eat-dog business climate.
November 26, 2009 3:33:30 PM

It's all been and gone before, JD. People forget so quickly. Seriously, if people actually gave a crap about how a company operated and what kind of ethics were involved, then Intel would have gone bust by now.
What you're saying is a bit utopian, I'm afraid; it seems to me that's how you think people should respond to Nvidia. Sure, I agree, but I think you need a dose of reality here, JD.
I will repeat myself for you.
"It won't matter to the people who just want the latest and greatest: if they blow up the moon on Tuesday, people will buy the GPU on Wednesday if it's the best there is.
The baggage and the lateness probably only matter to a small percentage of users.
If it's good, it sells and Nvidia rule again; if it's crap, then they may as well leave town. For that reason I can't see it being bad."

I can't comment on the CPU bottleneck side of things, but I can't imagine how that can be; I only read a review the other day that had a 5970 playing games at various resolutions and settings, and the graphs didn't seem to indicate any such thing. However, as always, I'm happy to learn if you have a handy link or two you would like to share.

As far as how the new card is received, I guess we will have to agree to differ and wait and see what the outcome is.

Mactronix :) 
November 26, 2009 3:53:11 PM

That's not my point. There's NO room for failure because of nVidia's tactics here; that's my point.
Rebranding isn't better, it's rebranding, and the reason I brought it in as a moral issue is obvious; but, in the end, it doesn't bring anything new, and as a business model it fails eventually.
So, putting that aside: nVidia have set themselves up for a longer-lasting suicide if Fermi fails, have done nothing for the mid/low end, which is their bread and butter, and the G200 series looks to be a failure for those segments as well, meaning no G92/last gen to save their butts. In every way, the G300 has to win, and win big, from high end to low, and it doesn't appear it's going to, except possibly at the high end.
It's bad enough to take old products and rename them as new, but it's suicide to think they'll sell when your competitor has moved on with better product.

By the time nVidia has a full portfolio of market coverage, ATI can simply refresh all their cards from top to bottom on mature drivers, still having a higher compute density.
November 26, 2009 4:39:09 PM

And then? There's no lineup showing any time soon, and the scale-downs from their past 2 gens have been mediocre at best, so it's not just the top here. Agreed, if Fermi has no competition at the high end, that's true. But I see ATI easily having a 20% faster solution by the time Fermi's drivers have matured, which turns a great Fermi into merely a slightly faster Fermi, and still no mid/low ends.
It'll have to be a killer, G80-type card, and not just thought of that way, but FPS-wise.
November 26, 2009 7:15:10 PM


@ JD
As far as I can tell, you're basing all this on speculation and conjecture, and on what you have gleaned from postings made by people you know and trust on other forums? That's fine, if that's the case; we all do it from time to time, but your posts read as if what you are saying is fact. Please, if you have some hard proof that Fermi won't scale, and something that says for definite that there is to be no lineup, then post it.

As far as Fermi having to be good, well, I said as much, didn't I. People keep quoting history, same as you saying the last two gens didn't shrink well; if history did keep strictly repeating itself, then there wouldn't be the swing from Nvidia being on top to ATI being on top, would there? Unless you count that as a historic trend. But it can't work both ways. You're saying the last arch didn't shrink well so this one won't either; well, by that logic Nvidia will never make an arch that shrinks well ever again.

Mactronix
November 26, 2009 8:08:22 PM

To me it says something much simpler: there will be no low-end Fermi part. No need to wonder about scaling on something that isn't going to exist, at least for another year.

Can you think of any other reason why Nvidia would rebrand this to a 310? Maybe I'm missing something, but to me that says there will be no low-end Fermi part.
November 26, 2009 8:13:17 PM

jennyh said:
To me it says something much simpler: there will be no low-end Fermi part. No need to wonder about scaling on something that isn't going to exist, at least for another year.

Can you think of any other reason why Nvidia would rebrand this to a 310? Maybe I'm missing something, but to me that says there will be no low-end Fermi part.

Or it's just the designation given to the OEM equivalent of the retail part, which is the GT210. :pfff: 
November 26, 2009 8:38:43 PM

On the bright side, it's got 1x DisplayPort :lol: 
November 26, 2009 9:26:15 PM

Mousemonkey said:
Or it's just the designation given to the OEM equivalent of the retail part, which is the GT210. :pfff: 


Kinda strange name to give it, isn't it? Anyway, weren't those GT210s already the GT110? Christ, it's so confusing I can't even remember half of it.
November 26, 2009 11:12:39 PM

I'm basing it on CPU history, GPU history and timing.
Now, these are heavy changes nVidia is trying to make, so I can give them somewhat of a pass on this, even though it goes completely against history.
There are exceptions, but even those exceptions (such as C2D or G80) were on time. Again, let's take a look at the last major change, the G80.
It was lucky in 3 ways: the R600 was late (opposite here), they had a new DX model and time for their drivers (opposite here), and the R600 wasn't a good part (again, opposite here).
It'll be lucky to be a great card, but that's all, and even then it has to fight a card with mature drivers; and if it doesn't arrive till April, as is speculated, in 2-3 months it'll be facing a refresh part.
Now, if the 280 had come out against the 4890, with bad baby drivers, and the 4890 having fully mature ones, the 280 would have lost.
I see nothing in what we do know about its arch to show it'll be anything but slightly better, at best, than a refreshed, mature ATI part.
That's not speculation, that's how it rolls.

So, the likelihood of Fermi coming in and kicking butt isn't great at all, even discounting history, just going by what we already know.
Now, you add in the even-later lower-end parts, which, even if we ignore what's been happening lately, will be extremely late and won't have a lot of OEM wins; and again, let's look at the G200 cut-down cards. They suck. Sorry, that doesn't sound professional or respectful, but a pig's a pig.
Going with that large a die scheme is, I believe, holding nVidia back, period.
What was the first thing ATI did after the 2900 debacle? Shrink, shrink, shrink.
Now we see a smaller bus coming on the G300, which means nVidia is getting wiser, but again, it's still larger. There are pointers all over raising red flags here, and I haven't even mentioned the dev rel that ATI has had since June.
It'll be almost a year's worth of dev rel, and unless nVidia gets lucky and ATI drops the ball, they'll have another mountain to climb, besides the fact that tessellation is further advanced on ATI.
I'm not saying Fermi won't be good, but with them acting like they have, holding up fake cards, having no new cards for ages, renaming them: if Fermi doesn't come in a killer, it'll have effects.
I already see some writers giving nVidia the benefit of the doubt, in differing ways; that in itself is a red flag.
November 27, 2009 12:30:17 AM

JAYDEEJOHN said:
Now, if the 280 had come out against the 4890, with bad baby drivers, and the 4890 having fully mature ones, the 280 would have lost.


Good point actually. Just how hard would the 280 have lost if up against the 4890?

This is what we could be looking at, and every passing month makes it more likely.
November 27, 2009 7:07:59 PM

You're probably right, JD, but it's still all guesswork. Until we see the specs and see how it actually works in the real world, it's all just guesswork: educated guesswork, I grant you, but none of what you are posting is fact.
You posted a link to a forum with a nudge nudge, wink wink: "Like I said, you just have to know who's saying what and when, and sift out the other useful, but not necessarily real, things."
When people post about performance, it's usual to post benchmarks to back up what you are saying. When you post about other stuff, speculation, rumour, you would generally post something intelligible to support your viewpoint.
You make a good argument, and as I said, you may be right, but there is nothing to say Fermi can't/won't be so great it will blow ATI away. I'm not trying to defend the card; in fact, if anything, I'm as close to being an ATI fanboy as possible. Just not fanatic is all.
Fermi is a brand new arch, and as such it could be anything: it could stink, could cost too much and perform like a dog. Even then some people will buy it. Nvidia have a following, same as ATI did when the G80 got so lucky.
On the other hand, the card could be something really special. Nvidia are quoting 6 times as good as the previous gen; they probably mean compute-wise with the Tesla card, I would guess. More conservative/realistic is 2.5 times. (See the quick sketch after this post.)
If it's anything like that good, ATI will be in competition again big time.
I want this card to be good for the competition it will cause, which can only be good for consumers, that and any advances it might bring.


Mactronix
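As a rough sanity check on those multipliers, assuming they refer to double-precision compute (the Tesla pitch), here's a quick sketch. The GT200 baseline is the commonly cited ~78 GFLOPS DP peak of a GTX 280; every figure is approximate:

Code:
# GT200 DP peak: 30 DP units x 1.296 GHz shader clock x 2 FLOPs (FMA).
gt200_dp_gflops = 30 * 1.296 * 2  # ~78 GFLOPS on a GTX 280

for claim in (6.0, 2.5):
    print(f"{claim}x of GT200 DP -> ~{gt200_dp_gflops * claim:.0f} GFLOPS")

The 6x claim works out to roughly 470 GFLOPS DP, which is at least in the same ballpark as the figures nVidia has announced for the Tesla-class Fermi parts, so the number is arithmetically plausible for compute, even though it says nothing about how a cut-down GeForce version would game.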
November 27, 2009 7:18:12 PM

jennyh said:
Good point actually. Just how hard would the 280 have lost if up against the 4890?

This is what we could be looking at, and every passing month makes it more likely.


No, it's not a good point, as it didn't happen. And comparing the G80 and R600 isn't fair, as MS had more of a say in that little fight than either card company did. Had they stuck to the spec, Nvidia wouldn't even have had a card.

Mactronix
November 27, 2009 7:49:05 PM

mactronix said:
No, it's not a good point, as it didn't happen. And comparing the G80 and R600 isn't fair, as MS had more of a say in that little fight than either card company did. Had they stuck to the spec, Nvidia wouldn't even have had a card.

Mactronix


Try to look at it logically.

Fermi and Evergreen were supposed to be out together, or at worst within 1-2 months of each other.

These designs were finalised months to a year ago, possibly even further back than that. You can't just change things all of a sudden.

The late Fermi will only be as good as it was ever going to be. This isn't a revised Fermi coming out to compete with (what will be) a refreshed 5870; it's the same initial Fermi minus bugs.

So yes, it's a good point. Had this happened last year, it could have been a GTX 280 vs an HD 4890. Considering the 4890 beats the 285 a lot now, it should be clear enough what this dragging on and on of Fermi could actually mean.

The 2900 wasn't that much worse than the 8800; it was just so late, and therefore so far behind driver-wise, that it would never get close enough when it mattered. Any reason why Fermi should be different? Or do you expect it to miraculously appear with mature drivers too?
November 27, 2009 7:55:42 PM

Has anyone got a definite date for the release of the 5 series refresh?