Nvidia renames 2xx parts to 3xx

invisik

Distinguished
Mar 27, 2008
2,476
0
19,810
If they made the gtx260-gtx275 a midrange card for the gtx 3xx series, with dx11 support, higher clocks, and better power efficiency, I personally wouldn't mind.
 

4745454b

Titan
Moderator
If true, wow. I've been saying for a while that I don't see how Nvidia will handle the move to DX11 well. They never shrank the GT200 down to mainstream parts, and they never supported DX10.1. Their chips are already quite large, and they'd need to get even larger to add in what DX11 requires. It's been proposed that Nvidia will only support DX11 at the high end, with their midrange/low-end parts being renamed G92/G94 parts.

If their yields are bad and they're just going to rename old parts as new parts to give themselves a full product lineup, then DX11 for the high end only might well be true. Worrisome times for Nvidia.
 

randomizer

Champion
Moderator
I think this deserves to be reposted considering the subject of the thread.

nv_recycling.png
 

turboflame

Distinguished
Aug 6, 2006
1,046
0
19,290


Well then it wouldn't be a rebrand because that requires more than a new sticker.
 

Zen911

Distinguished
Jan 22, 2008
88
0
18,630


well, this is technically not possible. dx11 is a superset of dx10.1, which isn't even supported by those cards. to make a card able to run dx11 you need hardware modifications to make full use of its features (tessellation, for example).

now, keeping in mind that the gtx 3xx series will be marketed as dx11 cards, it doesn't make sense to ruin that with a gtx 3xx card that doesn't support dx11, even if it's a mid-range card. it wouldn't stand a chance against AMD's mid-range dx11 cards.
 

invisik

Distinguished
Mar 27, 2008
2,476
0
19,810
With a little hardware modification (dx11 support), wouldn't they be able to take the gtx 2xx series and rename it to the gtx 3xx series, or would the card have to be completely redesigned?
 

Zen911

Distinguished
Jan 22, 2008
88
0
18,630


The HD2xxx through the HD4xxx had a built-in tessellator, but a non-functioning one due to the lack of support on the API side (DX10), and even then those cards won't support the full dx11 feature set. A "little hardware modification" might be the right phrase in that case, but for the gtx 2xx it wouldn't be the proper way to describe it. I'm not sure it would have to be completely redesigned either, but in both cases this is far from being a rebranding.
 

4745454b

Titan
Moderator
Depends. If Nvidia has a tessellation engine in the G92 and simply never turned it on, then a "simple" tweak to make it DX11-compliant and active is all it would take. If, however, they never included one, then the chip would have to be redesigned to include what's needed to support DX11. If they do rename G92 parts as 300 cards, then the lower 300 series won't support DX11.
 
MX up the cards is all I have to say

If this is true, and we have DX10 still dragging its way through this gen, we get what we deserve

I am/will be extremely discouraged if this comes about. It may be time to ditch PC gaming and find greener pastures, and no, not consoles either
 
'rant'
OK, I've heard enough, seen enough, watched enough, and basically, if we're stuck with DX10, and this is what we're given by the supposed sales leader in discrete gfx cards, it's nothing but a shame.
Anyone who supports this action, defends it, well, you'd better kiss nVidia goodbye, because the only thing standing between LRB and everything else is the DX model.
This is my personal opinion, and some may disagree, everyone may, but it's the way I see things trending.
I don't like it either, but that's the way it goes. The only option each of us has to move gaming along is through our pocketbooks.
They have plans for us: cloud gaming, all SW-driven, eventually no discrete cards at all as we know them.
You may find yourself in a few years not being able to own a game, just a download.
It may not have better gfx, but it's good enough, it sells, everyone makes money, so... why not?
Maybe the enthusiasm has died in PC gaming, I just don't know, but things like this, rehashing old DX models that aren't even currently up to date, let alone the new one coming, well, to me, it speaks volumes.
I hope I'm wrong, I hope these rumors are wrong too, and we see PC gaming improving in the future, but I'm not so sure anymore. People are resigned to the way things have always gone, and yes, by a lot of claims from many here on Tom's, it'll be over 4 years to a new adoption.
In the end, if that's moving forward, and if grudgingly inching forward is acceptable to people willing to spend hundreds of dollars per year, and literally thousands over that 4-year stretch, one has to ask: is it worth it?

If people accept this, they'll get what they deserve
end rant
 
Maybe you need to ask them what cards they use, or possibly they're just giving their opinions.
Always post a link if you have one; at least then people can form a better opinion one way or the other
 
The latest rumors on true DX11 parts have nVidia shipping small quantities of G300 around the second week of November, while rumors of ATI's x2 product arriving around that same time have been going around for a while now.
So it may mean no top-end unit for nVidia at all this time around, not even for long enough before they'd have to drastically cut their prices by $200
 

rambo117

Distinguished
Jun 25, 2008
1,157
0
19,290

yep, this very well could be the end of nvidia.. and even though i don't like them, we all kinda need them on the market, or else AMD/ATI has free rein over the graphics card market.
i mean, nvidia still hasn't even gotten to gddr4 yet and AMD is already at gddr5.

things are lookin pretty grim...
 

turboflame

Distinguished
Aug 6, 2006
1,046
0
19,290


GDDR4 offers little benefit over GDDR3, so it makes sense that Nvidia is skipping it.
 

rambo117

Distinguished
Jun 25, 2008
1,157
0
19,290

not necessarily. it's clocked a lot higher than gddr3 and can OC much more smoothly, not to mention it's more power efficient
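For context on why those memory clocks matter: peak memory bandwidth is just effective clock times bus width. A quick illustrative sketch in Python (the GTX 285 and HD 4870 figures below are the commonly quoted specs, used here purely as example inputs):

```python
def bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s:
    transfers/s (effective clock) * bits per transfer (bus width) / 8 bits per byte."""
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

# GTX 285: GDDR3 at ~2484 MHz effective on a 512-bit bus
print(bandwidth_gb_s(2484, 512))   # ~159 GB/s

# HD 4870: GDDR5 at ~3600 MHz effective on a 256-bit bus
print(bandwidth_gb_s(3600, 256))   # ~115 GB/s
```

The point of GDDR5's higher effective clocks is that AMD can get comparable bandwidth on a narrower (and cheaper) bus than Nvidia's GDDR3 designs need.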
 

darkvine

Distinguished
Jun 18, 2009
363
0
18,810


I highly doubt that nvidia isn't releasing a high-end card. However, WHEN they release it is another story. I think if they drag their feet and we see a 5890 X2 (or even just crossfired normal ones) before we see a 395, nvidia is going to lose this round. Too many hardcore gamers and computer builders will switch to ATI; some will do it saying it's just to tide them over till the 395, but then they'll see how powerful and smooth ATI runs lately and may just make the switch.

They could lose a lot of the "fanboy high end" buyers with a move like that.

 

vmardegan

Distinguished
Jul 31, 2008
83
0
18,640
Well, I may be the only one thinking this way, but here it goes...

I do not think that nVidia or ATI is really bringing a game-changer card in the next generation. The reason behind this is that today most producers are targeting multi-platform development so they can reach a greater part of the consumer market.

In fact, some of the new games are coming to consoles first and to PC later. In addition, I don't remember any PC-exclusive game other than Crysis that pushed hardware beyond its limits. I know that some RTSs and MMORPGs came out in the meantime for PC only, but I would not consider them game changers or a disruptive leapfrog in hardware/software technology.

If the above is correct, and knowing that even today's video cards are already far more powerful than consoles, I don't believe developers are just waiting for super-duper DX11 cards to bring something new to the market.

Honestly, I think that we will only have a leap-change in games when we have a refresh in the console market, which, in my opinion, is far from happening. Current consoles require an HDTV and a home theater to show their full strength, and I don't believe that the majority of homes, in any region of the world, are massively HDTV-ready today.

In the end, I believe that future PC games will have the core of the software coded for the consoles, but running with some extra features and higher resolution, until something really drastic happens in the market.

The only thing (in my mind) that could push PC hardware a little further is 3D gaming. Since you basically need double the performance to run a game in 3D versus "2D", that would justify increasing hardware performance by a factor of 2 or more. However, this technology is not widely supported yet, still has some flaws, does not work with the current installed base of hardware (regular LCD monitors), and has an expensive entry cost. But anyone who hasn't tried nVidia's shutter glasses yet should check them out. It's really impressive and truly adds something that consoles can't do today.

I'm a proud owner of almost all gaming platforms available today (Xbox 360, PS3, Wii, an E8400 with 8800GT SLI hooked up to a 50" DLP TV with a hi-def receiver and the 3D Vision glasses, and another i7 with GTX 275 SLI on a 24" monitor), and I can see advantages in every one of them. But unfortunately, when it comes to PC gaming today, I find myself asking why all that tremendous processing power is not being put to work as it should...
 
OK, I say this. The only way devs will change is through money, our money.
If they insist on throwing crap our way, then ditch them and their backwards games.
Just wait till they drop in price; don't buy them till they do, wait 3 months.
That'll get their attention.
I'm tired of this. We want better? We need to somehow demand better, and delaying their planned early high-buck return on investment on these platform-only-style sales is one way to do so
 

darkvine

Distinguished
Jun 18, 2009
363
0
18,810
While most games come out across the board, the PC version is always different, especially when the game came out on consoles first. They always have better graphics, usually (but not always) have higher multiplayer player counts (until M.A.G. comes out for the PS3, that is), and almost ALWAYS have something different about them, such as harder difficulties, different multiplayer maps, more unlockables, more DLC, etc. Same thing that happens between the PS3 and 360 most of the time.

And I think the next gen of cards' main "ground-breaking event" is DX11. Even if it isn't going to be used for a while, there is a possibility that with some of its features we'll see some great things added to gaming. If tessellation is actually used correctly, you could see far greater detail in graphics.

Then there is Crysis. Even though I don't think it's all that amazing as a game, good but not amazing, it sparked something. So many people ran out and bought that game; so many people who didn't play PC games went out and upgraded their PCs or bought new ones just for Crysis.

That is all it takes to sell amazing hardware and games. If one game comes out that truly pushes the limits, then people buy hardware to play it, and since they have all that high-end gear they buy other games that look good as well. For instance, most people had upgraded for Crysis, so they also got Far Cry 2. That is how PC gaming has always been, and always will be. A handful of games that really need it push the rest.

Cards very rarely bring the game changers; it is the devs that do that.