4xxx series specs

DarthPiggie

Distinguished
Apr 13, 2008
647
0
18,980
Unimpressive specs. I doubt the RV770s will carry that much memory. AMD will cut it in half due to a lack of GDDR5 chips. All of a sudden these cards aren't looking too impressive, and now that AMD won't be getting the jump on Nvidia they were supposed to, it won't have as large an impact.
 

DarthPiggie

Distinguished
Apr 13, 2008
647
0
18,980

You know, it's funny that there are more speculation threads about the RV770s than the GT200s. I think people know the GT200s will be ghastly fast, but they probably cannot afford such expensive products.
 
The 4870 will have that memory, and be clocked at least 100 MHz faster. It's not twice the 9800GTX like the GTX280 might be, but it'll be a good bit faster: faster than anything that's been released, ever, until the GTX 280. So it's not too bad at $350. Look for the GTX280 to be a killer in heat, power and fps, oh, and your wallet. But it will be a monster.
 


Nice find.
Now if only I could read it....

But seriously, thanks for making me feel old...
 

ro3dog

Distinguished
Mar 20, 2006
243
0
18,680
AMD has something up their sleeve, given how they are challenging Nvidia. AMD's mainstream parts come out alongside Nvidia's top of the line. The 4870x2 will be stronger than the populace is saying, and their tactics are sound. The next step for Nvidia would be a GTX260x2. Just my 2 pennies.
 

Amiga500

Distinguished
Jul 3, 2007
631
0
18,980



Might walk over the RV770... probably won't do much walking over the R700, though.

And since the GTX280 is gonna be such a power-hungry space heater, Nvidia simply cannot double up at this time.


AMD have the right approach from a multitude of angles*; Nvidia are pressing on like Intel did with the NetBurst arch.

For something that is intrinsically parallel in nature, it really should be relatively easy to scale up GPU performance with core count (if and when developers start to support it properly).
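To put a rough number on that intuition, here's a minimal Amdahl's-law sketch. The parallel fractions and GPU counts below are made-up illustrative values, not measurements from any card: only the parallelizable share of the frame work scales with extra GPUs, so driver and game support really is the limiting factor.

```python
# Back-of-the-envelope Amdahl's law sketch for multi-GPU scaling.
# 'parallel' is the assumed fraction of frame work that can be split
# across GPUs; the rest (driver overhead, sync, serial game logic)
# cannot. The fractions below are illustrative guesses, not data.

def speedup(gpus: int, parallel: float) -> float:
    """Ideal speedup from `gpus` GPUs when only `parallel` of the work scales."""
    return 1.0 / ((1.0 - parallel) + parallel / gpus)

for p in (0.95, 0.99):            # assumed parallel fractions
    for n in (1, 2, 4):           # single chip, X2 card, two X2 cards
        print(f"{n} GPU(s) at {p:.0%} parallel: {speedup(n, p):.2f}x")
```

Even at 95% parallel, two GPUs top out around 1.9x, which is why scaling past a well-supported X2 card is harder than it looks on paper.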



* 2 heat sinks on 2 smaller chips work better than 1 on one big one.
* 2 simpler, smaller chips mean yields improve (see the sketch below)
* common chips mean partners can be quicker to respond to market demands
* simpler common chips lower design cost
* simpler common chips lower design time
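On the yield bullet, a toy defect-density calculation shows why die area hurts so much. This assumes the classic Poisson yield model, and the die areas and defect density below are hypothetical round numbers, not actual GT200/RV770 figures:

```python
import math

# Toy Poisson yield model: Y = exp(-A * D0), with A = die area and
# D0 = defect density. The numbers below are hypothetical round
# figures for illustration, not actual GT200/RV770 data.

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Expected fraction of defect-free dies."""
    return math.exp(-area_mm2 * defects_per_mm2)

D0 = 0.002                       # assumed defects per mm^2
big, small = 500.0, 250.0        # one monster die vs. a half-size die

print(f"{big:.0f} mm^2 die:  {die_yield(big, D0):.1%} yield")
print(f"{small:.0f} mm^2 die: {die_yield(small, D0):.1%} yield")
# Yield falls off exponentially with area, so two half-size dies
# salvage far more working silicon per wafer than one big die.
```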



Also, the key thing for AMD/ATI is that a smaller chip built to operate in tandem with others (shared memory, etc.) will be much easier to integrate for Fusion.
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280


Hardware development is ahead of software development at the moment.

Most game designers have barely gotten things like SLI and quad-core CPU support implemented in their games successfully.

I really think that, as others have mentioned, the new video card wars are going to be more about excellent-performing single cards, lower heat production and lower power consumption, and less about SLI/Crossfire power setups (although the option is still there for the hardest of the hardcore).

Nvidia's next gen could be a serious deal breaker because of this. I really think the PCB we have seen on the net is more than likely a fake; either that, or Nvidia is getting a bit stupid. Maybe Nvidia is making 10.5" video cards with obscene power requirements its standard, but that hardly means it's sensible.

Anyway, the GT280 is supposed to be a new-gen card. I would hope the GT280 will be like the G80 was: a precursor to an evolution.

G80 evolved into G92, and G92 has comparable performance for significantly less cost. 8800 GTS 512 vs. the G80-based 8800 GTX, anyone? The performance is very comparable, but the 8800 GTS 512 is way, way cheaper.

So maybe the GT280 will evolve into a GT290 (the comparable evolution at a lower cost).

It's kind of funny thinking about this, but it's like the 9000 series outside of the 9600 GT doesn't even exist to most people. Sure, some people will go with 9800 GTX 3-way SLI or 9800 GX2 quad-SLI, but a large portion of even the enthusiasts are still much keener on 8800 G92 cards like the GTS 512.

I think of all the cards I've heard spoken about on this particular forum lately, the GTS 512 has been the most highly regarded overall.
 
And that has opened things up for the G92b. What these cards will be is anyone's guess. Mine would be that they'd open up the bus and use the die shrink for higher clocks, giving some great cards: improved G92s, with the two monsters on top competing mainly with the 48xx parts. The low end looks really good at this point, IMO, and may open doors for more ambitious game development, since anyone can afford these "lower-end" cards.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
I still wonder why people care about Nvidia cards. It's not like Intel's quads vs. AMD's quads. It's more like a monster GPU at the high end beats an ATI card slightly in some games, but not in all. Also, while both ATI and Nvidia seem to be dodgy with drivers regarding Futuremark Vantage, it's Nvidia who gets dodgy with drivers for popular game demos.

Correct me if I'm wrong, but Nvidia fudged water quality in the Crysis demo to get better benchmarks just before the game arrived, then corrected it when the game actually arrived. With the TWIMTBP program, Nvidia should have drivers down for many games way before release, but they're still struggling with DX10 and don't have DX10.1 support.

Though the 9600gt is probably the best card for the money Nvidia has, a few reviews relied upon Nvidia's background boosting of the PCIe bus on cherry-picked Nvidia chipset boards to get a few more fps than a 3870, with the same happening between SLI 9600gts and a 3870x2.

Is 4-10 fps more in a popular FPS the holy grail that makes people buy Nvidia's aging tech over ATI's innovations? Sure, AMD flubbed it with the X2900xt, but back then, Nvidia's 8800gts 320 and 640 weren't up to Vista and DX10 either. Nvidia didn't make a comeback until the 8800gt, 8800gts 512 and 9600gt. ATI made a comeback with the much better bang for the buck 3850 and 3870 first.

Plus there's AVIVO, which probably means less to the gamers here than sheer fuzzy framerates (are Nvidia images still blurred like in the 7xxx series days?). All in all, people seem to act like there's a megachurch of Nvidia that they must worship at, all the while Nvidia puts out hasty, poorly designed cards to "beat" ATI one more time, like the 9800gx2.

If developers did their job and didn't take whatever Nvidia's offering to get into that TWIMTBP program, then they'd work with both ATI and Nvidia to ensure that their games ran smoothly on both companies' cards, under both operating systems, and under both Crossfire and SLI.

Then, it would be a matter of technological innovation, which ATI's had hands down for many years and which Nvidia can only counter with the rants from Huang, aimed mostly at Intel because he's worried about Larrabee (IMHO, he shouldn't be -- he should be worried about Fusion in the notebook and entry level market).

Will the G280 beat the 4870? Probably, but it might not beat the 4870x2. They leapfrog each other, ATI and Nvidia do, but when it comes to tech that prepares for the future, it's ATI that has it, and Nvidia that relies upon the trickle-down effect of overpriced cards few will own to maintain its rep.

Yes, I'm an ATI fan, but when I ditched the old P4 Northwood on an i865 board with an AIW Radeon 9800 Pro, I first went with an Nvidia 6100 chipset and a 7600gs, and was so unimpressed with the image quality that within a year I moved to a cheap 690 board and got a 3870x2 instead of an 8800gts 512.

I'm tired of Nvidia "beating" ATI. I want to see the same level of innovation from Nvidia that I've seen from ATI. Then, I'd have a choice in which card to buy, which motherboard chipset to buy and which multi GPU setup to support. So far, Nvidia's flubbing it big time.



 

Annisman

Distinguished
May 5, 2007
1,751
0
19,810
Wow, that was well said. Very good ideas and facts to back them up. I, for one, am ashamed to own an Nvidia graphics card. Believe me, if I wasn't going for the "best of the best" video card out there, I would be using an ATI product. My favorite card was an x1950xtx, good old ATI.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
Well, your definition of best of the best seems to be based on sheer framerates in some games, not on the value of the cards overall. Back when I had a 7600gs, I hated PureVideo. Plus, the 7xxx series had blur issues that gave those cards a boost in framerates but hurt image quality.

I was going to keep that budget PCIe 8x motherboard and get an 8800gts 320, but that card's performance wasn't there. Nvidia did better than ATI's 2900, but by the time the 3870 arrived, Nvidia's minimal framerate advantage wasn't there under Vista and DX10.

So, you don't have the best of the best. You have the best marketed. Enjoy your card and pine away for the G280 you may not be able to afford and let that influence your decision to buy whatever midrange card Nvidia comes out with next.

IMHO, when you count image quality, AVIVO video playback capabilities, and overall thermals and performance, ATI wins in the 3xxx series and will do well enough in the 4xxx series to show that they're innovating, while Nvidia stagnantly puts out monster cards that, like the 8800 Ultra, do not give performance equal to the cost.

I want to see Nvidia innovate and not just juggle model numbers. Let their "best of the best" actually earn the title for once. Nvidia managed to stay on top in sales back when they had the lousy FX series, so I kind of view them as the Intel of GPUs: they innovate when they are forced to, and they pawn off inferior products on the market when they can get away with it.

At least ATI goes for the gold, even if they end up with the silver.
 
I agree with most of what you said, except this one thing: FPS matters. In innovation, there really isn't any comparison between ATI and nVidia. nVidia is still on DX10. nVidia still charges for their video playback software, while AVIVO comes with ATI cards. nVidia is still on an old process (65nm). I could go on. But nVidia is No. 1 in FPS, and that does matter. This gen, we will see ATI come to the table from top to bottom with competing products, which will have all those innovations that are either unobtainable on nVidia products or come with a certain extra fee. I'm looking forward to it. It's good for everyone: ATI, nVidia (since they need to be pushed) and us, the consumers. I think it's also good because, with Intel creeping ever closer with what is (in Intel's own words) a dead solution, the gfx card, we need this.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
Okay, I'm at work and can't research everything right now, but does a 10 fps difference matter all that much? Huge differences matter, I know, but there aren't major differences between, say, a 9600 and a 3870.

A friend had just an IGP for WoW and I gave him my old 7600gs, so now he gets 65 fps when he plays. But he started playing LOTR Online and noticed that he's not getting the image quality I do, or the framerates. So now he's thinking of an 8800 series card and Vista for DX10 mode. I'm not sure he'll get the image quality, though he'll get the framerates.

Huge fps differences sometimes exist between generations, and definitely exist between entry level, mainstream and performance. Then there are site benchmarks, which are often influenced by high-end CPUs most people don't have when they buy the cards. Still, there aren't major differences between ATI's and Nvidia's fps at equal price points, regardless of who wins in which game.

LOTR Online must have optimized their game in some of the updates, because I got basically 3870 performance at first (people on their boards tell me the game didn't support SLI or Crossfire all that well), but it's improved greatly the past month. I might get slightly better performance with a 9800gx2 but would it make a difference in my gameplay? Probably not.

There's a point where games are playable at high settings, with the ultra settings being set for future hardware. So if the games are playable under both ATI and Nvidia, where's the need for that 10 extra fps? Especially if something's compromised, like the quality of the water in Crysis?

Have you heard anything more about how both ATI and Nvidia are fudging Futuremark Vantage results with their drivers? When I get an 8750, then I'll give it a try, but not before then. I hate my CPU bringing down 3DMark scores.
 
I've been on these forums awhile, and the best answer I can give is this: futureproofing doesn't exist. Never has. More forward-looking? Yes, that's the right kind of thinking. If you are currently happy with your HW and don't want newer, more challenging games, then I'd agree 100% with you. But that's not the idea of most people here. They want something more forward-looking, or something faster, just in case. There are games out now that limit your eye candy, resolutions, etc., so sometimes those extra few fps make a huge difference, especially when you're forward-looking. Cheating in a benchmark is wrong, period. It's pure deception, by either or any company. Once again, as someone who's been around for awhile, I place very little stock in bungholio marks anyway, as there are just too many combinations from rig to rig to actually make a real comparison, and what's it really got to do with how well my games are playing? So sometimes those fps can make all the difference, and sometimes none at all, but being prepared to handle that next killer game is very important.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780


I agree about futureproofing. By the time DX10.1 is supported, the 4xxx series will be outshone by the 5xxx series. Sadly, if Nvidia doesn't support it, then game companies won't. It's a matter of that bribery program. I wonder if it's as actionable as the Intel rebate program for OEMs? AMD's legal department should look into it.

I'm happy with my 3870x2, and since the 4870x2 won't be out till the fall, that's a reasonable 8 months or so of usage. I wish more games supported Crossfire and SLI, so that dual-GPU cards would benefit more, but that's laziness in a market where companies assume their games will be played a couple of times and then ditched for the next big title.

Me, I keep going back to games I love for years. With LOTR Online, I'm lifetime, so I can always go back to it and enjoy. Sometimes I think that game developers view their work as entirely disposable, something a kid beats in 15 hours and then never plays again.



How is Nvidia more forward-looking? Sure, they're marginally faster. Now that I have time to look up specs, I can see that the 9800gx2 beats the 3870x2 by a wide margin in Crysis: 15.5 fps (42.5 vs. 27, but the difference with FSAA is only 3 fps).

Not by as wide a margin in World in Conflict: 52.9 fps from a 9800gx2 vs. 48.5 with a 3870x2. When FSAA is on, it's the 10 fps difference I talked about.

Maybe because I play CRPGs like Oblivion, The Witcher, LOTR Online, etc., I just don't get the fps argument when cards are within 10 fps of each other. When I first started playing LOTR Online, Bree filled with players was a slideshow. Now it's not (thank Catalyst 8.4, or just Turbine updates, or both!). The other two games were quite playable, even where benchmark sites gave the nod to Nvidia cards for sheer framerates (but then again, not by much).
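As a back-of-the-envelope way to see when a 10 fps gap matters, here's a tiny frame-time sketch. The first two pairs are the review numbers quoted above; the 90-vs-100 pair is a made-up high-framerate case for contrast:

```python
# Frame-time view of fps deltas: the same fps gap shrinks in real
# terms as the baseline rises. The first two pairs are the review
# numbers quoted above; the 90 vs. 100 pair is made up for contrast.

def frame_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given framerate."""
    return 1000.0 / fps

for slow, fast in ((27.0, 42.5), (48.5, 52.9), (90.0, 100.0)):
    saved = frame_ms(slow) - frame_ms(fast)
    print(f"{slow:>5} -> {fast:>5} fps: {saved:4.1f} ms less per frame")
```

The same nominal fps gap shrinks to almost nothing in per-frame terms once the baseline is already smooth, which is roughly the "playable is playable" point being made here.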

http://www.tomshardware.com/reviews/nvidia-geforce-9800-gx2-review,1792.html

When a 3870 gets 24 fps in Crysis and an 8800gt gets 25.8, then what's the true advantage of the Nvidia card beyond marketing and "megachurch of Nvidia" loyalty?

In that benchmark suite, the 3870x2 beats a 9800gtx 33.1 to 31.9, but it doesn't influence those here who talk about framerates because of that Nvidia loyalty. It's considered a fluke, or something that new drivers will fix.

http://www.tomshardware.com/reviews/nvidia-geforce-9800gtx-review,1800-7.html

In this benchmarking session, the 8800gt gets 5 fps more than the 3870 (rather than the 1.8 fps in the session above), but the 3870 clearly beats the 9600gt 38.1 to 34. So, 5 or 10 fps mean something when it's Nvidia, but are irrelevant when it's ATI? I don't think you believe that, but I read posts by people who do.

Even as far as it goes, the 9600gt beats its intended competitor by 1.9 fps. So, how does that make Nvidia truly faster, and how does it make them more forward-looking? It makes them better marketers, IMHO.

http://www.tomshardware.com/reviews/nvidia-geforce-9600-gt,1780-13.html

There's a world of difference between AMD CPUs and Intel CPUs. If there were ATI chipsets for Intel, then I might consider switching. As is, AMD's slightly lesser CPU performance meets my price point better than Intel's. That is, the Intel CPUs I can afford don't do better than my AMD CPUs. So, I'd rather buy AMD chipsets and CPUs and put the difference into ATI GPUs.

However, with Nvidia vs. ATI it's quite different. At every price point, both the mainstream I usually buy and the high end I bought last February, ATI matches Nvidia at usually slightly lower prices with better DX10 drivers, better image quality and AVIVO.

So, why does Nvidia have the lion's share of the market? I hate cheating too, which is why I hated it when ATI did it in 2003 and seems to be doing it with Futuremark Vantage. Yet it's Nvidia who cheats more often in benchmarks, getting a glassy-eyed "wow" from fans who insist that the 3xxx series is a failure and that anything Nvidia puts out is innovative because it doesn't waste time on features found on ATI cards, features game developers won't support until Nvidia says it's time to.

IMHO, just as gamers ditched Intel during NetBurst, so too must gamers ditch Nvidia. That will only happen when gamers realize they have more at stake than just competition bringing down the price of their cards.

IMHO, though futureproofing is not possible, ATI's forward-looking approach, from DX10.1 to dual GPUs on one PCB, is what gamers need. Not the 10 or fewer fps that an Nvidia card provides while blurring image quality and fudging the rendering that the game developers intend for gamers to see when they fire up the latest title.

I still predict that Nvidia will "win" the next round too, even though their cards won't actually do any better than ATI's. People bought Nvidia when the Radeon 9800 Pro and XT were winners, they bought Nvidia when the X1900XT beat the best of the 7xxx series, and they buy Nvidia even when a 3850, 3870 or 3870x2 is a better deal overall vs. a 9600gt, an 8800gt or a 9800gtx.

I just think that Nvidia's winning by marketing, not by technology or by CEO leadership. Sad, really. All the people who wanted Ruiz to quit were right to criticize, but Huang's rants at Intel deserve criticism too, and I don't see leadership at Nvidia, nothing that will take them beyond the age of monster GPUs. So maybe that's why they're gearing up for a different market, allied with VIA against Atom.