boudy

Distinguished
Mar 13, 2009
498
0
18,790
http://www.fudzilla.com/index.php?option=com_content&task=view&id=12857&Itemid=34

They don't have any other ideas, so instead they'll be coming out with the GTX 295 on one PCB instead of two bolted together.

Well, I guess that just means more money for ATI. lol jk



EDIT: btw, I just made the topic "Poor Nvidia" so that all the Nvidia fanboys would say "OH NO, WHAT'S HAPPENING TO NVIDIA?!", and all the ATI fanboys would say "Oooh, time to beat down those Nvidia fanboys", giving me both sides of the graphics card enthusiasts :D.
 
I'm thinking it's more that the sandwich design is EOL'd permanently. They'll be using the newer GDDR5 in their next gen, so it wouldn't really make sense to use two PCBs anymore. Besides, they're cutting costs every way they can, and a two-PCB solution is more costly.
 

boudy

Distinguished
Mar 13, 2009
498
0
18,790
I seriously wonder if this is going to be a tad more power efficient than the two-PCB design... just wondering.


So should we expect to see some sort of alteration to the 4870X2 in response?
 

4745454b

Titan
Moderator
How is this poor Nvidia? They are (finally) coming out with a single-PCB GTX 295. This is a good thing for all involved. It should use less power and be cheaper to make.

The only part that sucks is they aren't introducing anything new/faster. With the issues the TSMC 40nm process has, I don't see them making a 40nm chip any time soon, if at all. I really get the feeling that the GTX 295 is it until they come out with their DX11 part. After all, why make a faster DX10 part if it will be passed by the DX11 part only a few short months later?

My other question is: if they NEVER developed a DX10.1 card, will their DX11 card be worthwhile? MS came up with the original DX10 spec, then dumbed it down after Intel and Nvidia whined. Neither company has built the things required for DX10.1 (AKA "real" DX10), while AMD has been toying with it since they built the Xenos so many years ago. I almost feel that Nvidia is going to fall far behind once we move to DX11, unless MS dumbs it down for them again. Time will tell.

For the sake of balance, I don't see AMD coming out with anything either. They will release the 4890, but that is possibly all we will see from them for DX10/10.1. I do believe we will see DX11 cards from them first; it's a smaller step from 10.1 than from 10.
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
I think the worldwide economy and this "turn to green" wave are going to slow advancement altogether. I wouldn't be surprised if these companies start moving towards a 12-15 month cycle instead of the current 9 months or whatever it is.
 

boudy

Distinguished
Mar 13, 2009
498
0
18,790


By the looks of it, we should be seeing ATI's first 40nm chip this year :).
 
zzf is right, the 4770 is right around the corner, and it makes sense as well: going to 40nm with R700 and getting ~75% power usage, with possibly a slight dip in perf, on the cheap. The gate widths were getting too small going to 40nm, if I recall; http://www.altera.com/literature/wp/wp-01058-stratix-iv-40nm-process-node-custom-logic-devices.pdf check fig. 2. It's what we're facing in all things silicon: raise the core speeds and leakage goes up; take measures to prevent the leakage and it slows the core speeds.
Anyways, 40nm will be here soon, and we'll see how high those clocks go. ATI releasing a mid-level chip first shows there are problems with 40nm IMHO, but with LRB around the corner, neither nVidia nor ATI is going to be sitting on their hands.
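For what it's worth, here's a rough back-of-envelope sketch (Python, with an assumed capacitance scaling and made-up voltages, so don't read the numbers as RV740 specs) of why a straight 55nm-to-40nm shrink can land near that ~75% power figure before clocks even change:

```python
# Very rough back-of-envelope, not official numbers: classic dynamic-power
# scaling, P_dynamic ~ C * V^2 * f. Leakage is ignored here, which is exactly
# the part that gets worse as the process shrinks.

node_scale = 40.0 / 55.0      # linear feature-size ratio for a 55nm -> 40nm shrink
cap_scale = node_scale        # assume switched capacitance scales with the shrink
v_old, v_new = 1.26, 1.20     # hypothetical core voltages, purely for illustration
f_scale = 1.0                 # same clocks, to isolate the process effect

p_scale = cap_scale * (v_new / v_old) ** 2 * f_scale
print(f"Estimated dynamic-power ratio after the shrink: {p_scale:.2f}")
# Prints ~0.66, in the same ballpark as the ~75% figure above once leakage and
# unchanged board/memory power are added back in.
```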
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
The 4770 is probably the best card they have coming out of this. But isn't there a "4850" rework coming that doesn't require a power plug, or am I getting this mixed up with the 4770?
 
Sorta. They're the "same". I've heard the 4770 is not only die-shrunk but has fewer transistors as well, plus the narrower bus. Maybe what you're thinking of is an R700 refresh across the whole line, which has been a rumored release? Q3 I think it is, while R800 isn't due till Q4, but to me that makes no sense, though there have been rumors.
IMHO, ATI needs to get their mobile parts out, and the 4770 is the result of that.
 
Going by transistor density alone from here (good article): http://translate.google.com/translate?sourceid=navclient&hl=en&u=http%3a%2f%2fwww.pcgameshardware.de%2faid%2c679763%2fReale-Chipgroessen-von-G80-GT200-und-GT200b-nachgemessen%2fGrafikkarte%2fNews%2f
There's only a small difference in size between the G92b and the 48xx series, so shrinking the 4xxx series allows for power savings, and it allows for a perf increase as well. To me, ATI has divvied that up pretty evenly with the 4770 going to 40nm. ATI has always held the process lead, or mostly as of late, and is reaping the benefits of it; especially when these mobile parts come out, they'll blow away nVidia's mobile cards.
nVidia needs to increase their transistor density IMHO, as it's killing their yields, and trying to go with one large monolithic core vs. multi plays in the opposite direction as far as I can tell.
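To put some rough numbers behind the density point (Python again; the transistor counts and die sizes below are the widely quoted approximate figures from memory, not anything measured in that article, so take them as ballpark):

```python
# Quick transistor-density comparison with approximate public figures only.
chips = {
    "RV770 (HD 4870, 55nm)": {"transistors_m": 956, "die_mm2": 256},
    "GT200 (GTX 280, 65nm)": {"transistors_m": 1400, "die_mm2": 576},
}

for name, c in chips.items():
    density = c["transistors_m"] / c["die_mm2"]  # million transistors per mm^2
    print(f"{name}: ~{density:.1f}M transistors/mm^2 on a {c['die_mm2']} mm^2 die")

# The bigger, lower-density die is the one that hurts yields: defects per wafer
# are roughly constant, so the share of good dies drops fast as die area grows.
```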
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
Yea, some of those laptops are pretty stupid. If your laptop needs to be plugged in to game for more than 10 minutes, then it completely eliminates the whole purpose of owning a laptop.
 
That's why the 4770 is a great alternative, as it'll give full perf with decent power usage. You won't sacrifice anything, be it IQ, portability, or perf. nVidia has stretched the G92 too far, though I'm thinking they really didn't expect the perf that came from R700 and thought they could ride this one out. I remember reading this from a good source back in February last year, when no one really knew how the R700 would perform, and since I do favor ATI, it was a bit disheartening hearing this, but things did change, and quickly.
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
Or they could do a hybrid: making it portable for work and basic use by using the integrated GPU, and plugged in for heavy gaming. I mean, you could game in power-saving mode, but it would need to be older games :) since the video card downclocks.
 

roofus

Distinguished
Jul 4, 2008
1,392
0
19,290
Single-PCB 295... what took them so long! I have no fondness for multi-GPU cards (a la compact CF/SLI designs), but everything on one PCB makes a lot more sense.
 

boudy

Distinguished
Mar 13, 2009
498
0
18,790


If I'm not mistaken, the 4770 will be replacing the 4850, and the 4750 will be replacing the 4830... is that right?


Single-PCB 295... what took them so long! I have no fondness for multi-GPU cards (a la compact CF/SLI designs), but everything on one PCB makes a lot more sense.

The 4870X2 was on one PCB from the beginning :)



They have come out with the 4860 (it's 40nm :D), so maybe that will help all of you laptop gamers.
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
No, the 4870 512 was one PCB; the 1GB came after the 4870 X2 ;) :lol:

So think of the GTX 260 as the 4870 512, and the GTX 275 as the 4870 1GB.

Oh, and I question the thread title as well :) It's not like it's a bad idea for a card; it costs less than the GTX 280 and looks like it might perform better, with both tested at stock.


:hello:

To OP, I saw your edit lol, so you want a flame fest and not an informative thread?

Good luck:) :pt1cable: :pt1cable: :pt1cable: :pt1cable:
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
I've seen the 4860 pop up on a few charts, but I've yet to actually see it in stock anywhere. ATI is getting just as bad as Nvidia when it comes to these card names. I mean, is there really a need to wedge a card between the 4850 and 4870? No, there isn't.
 

jennyh

Splendid


I agree there isn't a need; however, if Nvidia releases a card at every 10MHz step, then ATI has to follow suit.
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
Why does ATI have to?

They've been fine with the 4850 and 4870 so far, so why are you defending their wrong move?

It's been a plus for them that they haven't followed Nvidia; everyone agreed that's what made them look innovative in the public's eye. Now they do it, and they're still angels?

You can't close your eyes and be blind to the fact that ATI is making a wrong move as well with that card.

It really sounds stupid when you blame a company for what the OTHER company is doing.

"Well, Intel's doing it, so that means I have to do it."

It works out worse for the second company that does it. When Nvidia did it, everyone was pissed and I agreed too, but seeing ATI do it makes me think they know it pisses us off and just don't care.

So stop blaming the green because you like red, and vice versa.