JAYDEEJOHN :
Ive been on these forums awhile, and the best answer I can give is this. Futureproofing doesnt exist. Never has. More forward looking? Yes, thats the right kind of thinking. If you are currently happy with your HW, and dont want newer games, more challenging games, then Id agree 100% with you.
I agree about futureproofing. By the time DX10.1 is widely supported, the 4xxx series will be outshone by the 5xxx series. Sadly, if Nvidia doesn't support it, then game companies won't. It's a matter of that bribery program. I wonder if it's as actionable as the Intel rebate program for OEMs? AMD's legal department should look into it.
I'm happy with my 3870x2, and since the 4870x2 won't be out till the fall, it's within a reasonable 8 months or so usage. I wish more games supported Crossfire and SLI, such that dual GPU cards would benefit more, but that's laziness in a market where companies think their games won't be played more than a couple of times and then ditched for the next best title.
Me, I keep going back to games I love for years. With LOTR Online, I'm lifetime, so I can always go back to it and enjoy. Sometimes I think that game developers view their work as entirely disposable, something a kid beats in 15 hours and then never plays again.
JAYDEEJOHN :
But thats not the idea of most people here. They want something more forwards looking, or something faster, just in case. Theres games out now that limits your eyecandy/resolutions etc so sometimes those extra few fps makes a huge difference, especially when youre forward looking.
How is Nvidia more forward looking? Sure, they're marginally faster. Now that I have time to look up specs, I can see that the 9800gx2 beats the 3870x2 by a wide margin in Crysis: about 15 fps (42.5 vs. 27), though the difference with FSAA is only 3 fps.
Not by as wide a margin in World in Conflict: 52.9 fps for the 9800gx2 vs. 48.5 for the 3870x2. With FSAA on, it's the 10 fps difference I talked about.
Maybe because I play CRPGs like Oblivion, The Witcher, LOTR Online etc. I just don't get the fps argument when they're within 10 fps. When I first started playing LOTR Online, Bree filled with players was a slideshow. Now it's not (thank Catalyst 8.4 or just Turbine updates, or both!). The other two games were quite playable, even where benchmark sites gave the nod to Nvidia cards for sheer framerates (but then again, not by much).
http://www.tomshardware.com/reviews/nvidia-geforce-9800-gx2-review,1792.html
When a 3870 gets 24 fps in Crysis and an 8800gt gets 25.8, then what's the true advantage of the Nvidia card beyond marketing and "megachurch of Nvidia" loyalty?
In that benchmark suite, the 3870x2 beats a 9800gtx 33.1 to 31.9, but that doesn't sway the people here who talk up framerates, because of that Nvidia loyalty. It's written off as a fluke, or something that new drivers will fix.
http://www.tomshardware.com/reviews/nvidia-geforce-9800gtx-review,1800-7.html
In this benchmarking session, the 8800gt gets 5 fps more than the 3870 (rather than the 1.8 fps in the session above), but the 3870 clearly beats the 9600gt 38.1 to 34. So, 5 or 10 fps mean something when it's Nvidia, but are irrelevant when it's ATI? I don't think you believe that, but I read posts by people who do.
Even so, the 9600gt beats its intended competitor by only 1.9 fps. So how does that make Nvidia truly faster, and how does it make them more forward looking? It makes them better marketers, IMHO.
http://www.tomshardware.com/reviews/nvidia-geforce-9600-gt,1780-13.html
There's a world of difference between AMD CPUs and Intel CPUs. If there were ATI chipsets for Intel, I might consider switching. As it is, AMD's slightly lower CPU performance meets my price point better than Intel's: the Intel CPUs I can afford don't do better than my AMD CPUs. So I'd rather buy AMD chipsets and CPUs and put the difference into ATI GPUs.
However, with Nvidia vs. ATI it's quite different. At every price point, both the mainstream cards I usually buy and the high end I bought last February, ATI matches Nvidia at usually slightly lower prices, with better DX10 drivers, better image quality and AVIVO.
So why does Nvidia have the lion's share of the market? I hate cheating too, which is why I hated it when ATI did it in 2003 and seems to be doing it again with 3DMark Vantage. Yet it's Nvidia who cheats more often in benchmarks, drawing a glassy-eyed "Wow" from fans who insist that the 3xxx series is a failure, and that anything Nvidia puts out is innovative because it doesn't waste time on features found on ATI cards, features game developers won't support until Nvidia says it's time to.
IMHO, just as gamers ditched Intel during Netburst, so too must gamers ditch Nvidia. That will only happen when gamers realize they have more at stake than just competition bringing down the price of their cards.
IMHO, though, futureproofing is not possible. ATI's forward-looking approach, from DX10.1 to dual GPUs on one PCB, is what gamers need, not the 10 or fewer fps that an Nvidia card provides while blurring image quality and fudging the rendering of what the game developers intended gamers to see when they fire up the latest title.
I still predict that Nvidia will "win" the next round too, even though their cards won't actually do any better than ATI's. People bought Nvidia when the Radeon 9800 Pro and XT were the winners, they bought Nvidia when the X1900XT beat the best of the 7xxx series, and they buy Nvidia even when a 3850, 3870 or 3870x2 is a better deal overall than a 9600gt, an 8800gt or a 9800gtx.
I just think Nvidia is winning by marketing, not by technology or CEO leadership. Sad, really. The people who wanted Ruiz to quit were right to criticize, but Huang's rants at Intel deserve criticism too, and I don't see leadership at Nvidia, nothing that will take them beyond the age of monster GPUs. So maybe that's why they're gearing up for a different market, allied with Via against Atom.