
GTX280 or HD4870 x 2 in CF ?

Last response: in Graphics & Displays
May 29, 2008 9:02:19 PM

Just 3 more weeks till we get some benchmarks and find out!!!


May 29, 2008 9:22:42 PM

I'm really hoping for a serious increase in performance over the 9800GTX & GX2. I have a feeling the new 512-bit memory bus on the GTX280 will be the largest step up from previous Nvidia models. Since I have an Nvidia board, I really have no choice but to hope for the best from them, although they have seriously disappointed me lately. ATi has really built some serious momentum ever since the hype of the 3870X2. No doubt the benchies will be competitive!
May 29, 2008 9:28:55 PM

im placing the bet now the twice-as-expensive gtx 280 will be better :p 
May 29, 2008 9:31:18 PM

spuddyt said:
im placing the bet now the twice-as-expensive gtx 280 will be better :p 


Sure as hell hope so.
I have been saving ever since I got my GTS 640 (at release) for their next top card.

Just a few more weeks....
May 29, 2008 9:46:47 PM

Yeah, same. I have $900 saved up for my new card, case, and cooler, so I could OC to the sky on my existing setup :) 
May 29, 2008 9:56:22 PM

My bet (as the drum rolls): the 4870X2 will have more wins over the GTX280. I believe there is a reason AMD is waiting to launch the 4870X2. This feels like Christmas.
May 29, 2008 10:05:43 PM

Yes, but overall (IMO) I believe a GTX 280 X2, if it existed, would have the highest performance, although I don't know how they would implement cooling on such a monster card. The 4870X2 would probably be better bang for the buck than the GTX280, and most likely perform better, although I can see the GTX280 taking the lead in games that don't support SLI/Crossfire. As for me, I'm looking for a card in the $300-400 range ($400-500 CAD), such as the GTX 260 or the 4870, and I'm hoping it will give me more than 50% better performance than my 8800GTS. Considering that the 9800GTX is 30% higher on average, I could see this being true.

Thanks, Nico
Anonymous
May 29, 2008 11:23:03 PM

Stop calling it the 9900 series... it's been posted several times that the new naming scheme is GTX 280 (GTX) and GTX 260 (GTS)...

Reading "9900" is kind of annoying... ANYWAY

I hope Nvidia's card will win (don't know why), but I'll get the top card from either company whenever prices come down a bit... so I'm not too excited yet.
May 30, 2008 12:30:01 AM

ATI all the way; we need to rein Nvidia in a little. Plus I think the ATI cards will perform close to the GTX260/280 and be much cheaper. Feels like ATI's second coming. No, I am not an ATI fanboy like the Nvidia fanboys here; I've never owned an ATI card, even through the 7 series B.S. I'm just getting tired of Nvidia screwing us over with high prices and sitting on their as- butts while GPU performance stands still. I don't care if you won't buy an ATI card for whatever reason; we just need the competition to reduce prices and, most of all, INCREASE FUTURE CARDS' PERFORMANCE!
May 30, 2008 2:14:39 AM

Well, what can Nvidia do if there is no competition? I'm sure if you owned a company you would want as much money as possible. The GTX 280 is most likely going to be a big leap and so will the 4870, but I think Nvidia will come out on top. ATI is only hitting the mid range because they can't fight Nvidia head to head just yet, plus they don't have a lot of spare cash for R&D.
May 30, 2008 2:31:16 AM

I think it is a sink or swim moment for ATI and, like blood raven, I believe we need them to swim to lower costs for us, the consumers, and in turn increase competitiveness across the board.

Cost to performance ratios have been comparatively dormant for too long now, and on that note, I think older games such as WoW and EQ2 have had far too many backwards-compatibility issues with the newer cards. I do hope that this isn't an issue for the next generation of cards; drivers that actually work on release day would be a blessing.

My two cents.
May 30, 2008 2:37:13 AM

Well, I doubt there'll ever be a GTX280-GX2, just like there was never a G80-GX2 or R600-X2; talk about cards that would've needed their own generators.
But I'm certain we'll see the HD4870X2 within days of the GTX280's launch, even if you can't buy one 'til much later (remember, these guys compete by leaking previews and rumours against each other).

Also, the failure to launch anything real to replace the G80 while blaming it on ATi means they'd better launch something well developed and pretty flawless, because they've had all this time to do so. If it's not a home run right away, then it means they never were ready to replace the G80 regardless of competition.

It'll be interesting to see what happens over the next few weeks, but I have a feeling both sides are going to have big surprises/benefits as well as big disappointments.
May 30, 2008 2:57:47 AM

Gargamel said:
I think it is a sink or swim moment for ATI and, like blood raven, I believe we need them to swim to lower costs for us, the consumers, and in turn increase competitiveness across the board.

Cost to performance ratios have been comparatively dormant for too long now,...


What are you talking about?

Seriously, increase competitiveness?
It's now more competitive than it was in the GF7/X1K days and GF6/X8 days, and far more so than the early G80 era and early R600 era.

And as for cost/performance ratios, they've never been better. The GTS-512 has been selling for around $200 for a month (now under $180 after rebate), the HD3870-GDDR4 for $150 ($130 after rebate); now compare that to the GTX at $600 for most of its life, the GTS-640 at $400+, the GTS-320 at $300+, the X1900XTs at $300+ and X1900XTXs at $400+ for most of their lives, and the GF6800GT at $300+ until the end.
Seriously, we've never had it this good !!

90% of the Ultra's performance for 1/3 of the MSRP, when have we ever had a card value like that before?
The only area that you could argue there is limited competition is the eWang cards, and even then there's huge pressure from the cards below them.
Sure, the Ultra's not cheap, and the GX2 and X2 aren't cheap, but no one talks about cost/performance for those cards anyway, and even then you can get an X2 for $300.

Man it's never been better except for the 'unlocking' era of cards.
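For what it's worth, that "90% of the performance for 1/3 of the MSRP" claim can be put in rough perf-per-dollar terms. A toy calculation (the relative-performance figure is the post's rough estimate, not a measured benchmark):

```python
# Toy perf-per-dollar comparison. The relative performance numbers
# are rough estimates from the discussion, not measured benchmarks.
ultra = {"perf": 1.00, "price": 600}   # 8800 Ultra, launch-era MSRP
gts512 = {"perf": 0.90, "price": 200}  # GTS-512, ~90% of the Ultra's speed

for name, card in (("Ultra", ultra), ("GTS-512", gts512)):
    per_100 = card["perf"] / card["price"] * 100
    print(f"{name}: {per_100:.3f} performance points per $100")

# Value multiple: how much more performance per dollar the cheap card gives
ratio = (gts512["perf"] / gts512["price"]) / (ultra["perf"] / ultra["price"])
print(f"GTS-512 value multiple: {ratio:.1f}x")  # 2.7x
```

By these numbers the GTS-512 delivers about 2.7 times the performance per dollar of the Ultra, which is exactly the point being made above.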
May 30, 2008 3:11:06 AM

I was hoping Nvidia would price the GTX 280 at $499.99 and the GTX 260 at $399.99.
That would be a reasonable price, but they'll probably charge $600 for the GTX280 and $450 for the GTX260.
May 30, 2008 3:49:50 AM

Yeah, that would've been nice, but it was bound to be expensive with it being a huge chip, lotsa memory (remember basically minimum 1GB due to the interface and ROP count), and complex PCB.

Also I think nV wants to try and re-establish the $600 MSRP they used to have after all the recent bargains (which don't make them money if they have low yields).

Funny thing is that the INQ kind of commented on this just today, with Charlie confirming the yield and complexity issues, but also saying nV might lowball their launch prices to create an artificially low MSRP (which of course will then be out of stock everywhere) to draw demand away from the other cards. I doubt it, but if you're looking for a deal, look to your local e-tailer on launch day, because if they pull that trick again it may be around for a day and then gone for weeks.
May 30, 2008 7:29:24 AM

Well, I just can't wait to see some real test results. Does anyone know what the video encoding/HD capabilities of the new cards will be? Any real improvements or noticeable additions?
Mactronix
May 30, 2008 8:38:39 AM

I'm just sad for those of us that bought the 3870s and 8800GTs and paid $300 for them, when now they're on sale for as low as $150 or $200. Just hope the price war goes on so we can all benefit from the great deals that are coming.
May 30, 2008 9:28:17 AM

spuddyt said:
im placing the bet now the twice-as-expensive gtx 280 will be better :p 


It will be "better" but it might still be a failure, if leaked e-mails are true:

http://www.theinquirer.net/gb/inquirer/news/2008/05/29/...

Despite the flames Nvidia fans directed at the article, the essential concept makes sense: the GTX280 at 65nm runs into yield, design and price issues, and Nvidia won't have viable cards until a die shrink.

IMHO, the days of monster cards are going fast and this might be Nvidia's last summer of love from gamers who just want an extra 10 fps in Crysis.

Sadly, not only does Huang talk trash about his competitors, and deny the value of dual GPU on one PCB cards, but Nvidia also relies upon promotions to get game companies in line that are as dodgy as Intel's old OEM rebate program.

Yes, it should be faster, but all that gets them is the crown, if the part isn't even available at its already overly high MSRP.

darkdisciple said:
Just 3 more weeks till we get some benchmarks and find out!!!


The real competition will arrive with the 4870X2 in late summer or fall. Since a 4870 doesn't offer a real performance gain over my 3870X2, I'm waiting for the next dual-GPU-on-one-PCB card.

It will be fun to see if triple SLI GTX280's can match two 4870x2's in CrossfireX. My bet's on the CrossfireX but then again, I am an ATI fan and have been since the Radeon 8500.

Will the GTX280 be Nvidia's next FX5800? Only time will tell, but I bet they release dodgy drivers that affect image quality (Crysis demo water style) to make a bare win in the benchies.

If the linked article from Charlie is right, then Nvidia's in the same situation with 65nm monster cards that AMD was in with 65nm quads: the thermals and performance aren't there versus the competitor, and the company has to rely on fan loyalty and a 17% or so increase over the last gen.

Intel failed technologically once and still won in the marketplace. Nvidia failed with the FX series and stayed ahead. It seems there's just too much dislike of AMD and ATI for Nvidia to really go away. As long as the fanboys set aside their $600 to $900 for cards that are only a smidgen better than ATI's mainstream in a few games, I think Nvidia will survive.

I'd prefer to see them survive as just Via's partner against the Atom, but that's because I really don't like corporate CEOs talking the way Huang has been lately. I like competition that brings both innovation and prices that make consumers and investors happy, but I'm tired of cults of personality and fanboy marketing empires.
May 30, 2008 10:38:47 AM

But you do know that you just quoted the INQ?
May 30, 2008 10:45:43 AM

invisik said:
Well, what can Nvidia do if there is no competition? I'm sure if you owned a company you would want as much money as possible. The GTX 280 is most likely going to be a big leap and so will the 4870, but I think Nvidia will come out on top. ATI is only hitting the mid range because they can't fight Nvidia head to head just yet, plus they don't have a lot of spare cash for R&D.


Man, I agree with you. I'm tired of Nvidia ripping us off with their "new" generation cards. I'm the kind of person who tries to find the best bang for the buck, and right now I'm keeping my eye on ATi because they deliver almost the same performance as Nvidia (I have an 8800 GTS 320 MB G80). Let's hope that ATi keeps their prices low and performance high.
May 30, 2008 10:51:32 AM

That article from the INQ did not lie, it only sensationalized the actual truth: the GTX200s are going to have problems. The only question is whether they will have more of them than the 4xxx series.
May 30, 2008 1:04:08 PM

I definitely think AMD will have better yields, but Nvidia is going to convincingly take the performance crown until the 4870X2 comes. Then maybe there will be competition at the ultra high end.
May 30, 2008 1:14:07 PM

http://www.nordichardware.com/news,7819.html

Quote:
Apparently, the previous rumor of $600 wasn't enough, because the slide says $649 for the GTX 280 and $449 for the GTX 260.


Hmm, I wonder how much that's going to be here in Europe... more likely 650€... :/ 
May 30, 2008 2:03:45 PM

Of late, the INQ has been more right than wrong.
May 30, 2008 2:10:39 PM

Well, the GT200b is cheaper, so the price will fall to $600 when Nvidia gets its node shrink ready.
So it's expensive, we know that already. It is fast, so it's not any kind of FX5800... It's the fastest GPU in the world for a while. It's possible that the 4870X2 comes close, but the GTX280 will be the fastest single chip around for a while. But it seems ATI is going in the right direction with cheaper chips. Even if this is the case, Nvidia just needs a new 8800 GT type goldmine and they are financially fine even if the GTX280 is not an economically sensible product.

It's even more interesting to see whether the GT300 is a still bigger chip, and whether ATI's next generation really moves further toward multi-chip design.
May 30, 2008 5:08:32 PM

hannibal said:
Well, the GT200b is cheaper, so the price will fall to $600 when Nvidia gets its node shrink ready.


That's on the assumption the shrink will go without a hitch. It might as long as they don't tweak stuff, but nothing's guaranteed.

Quote:
It is fast, so it's not any kind of FX5800... It's the fastest GPU in the world for a while.


So was the FX5800U at some DX8 stuff and OGL.

Quote:
Even if this is the case, Nvidia just needs a new 8800 GT type goldmine


The kind you buy from some guy in Las Vegas; remember, they lost money on the GF8800GT.

Quote:
and they are financially fine even if the GTX280 is not an economically sensible product.


The GTX280 is only an economically sensible product if it contributes to the bottom line. The funny thing is that its biggest benefit will likely not be direct, but the halo effect if it is the top performer. It may lose tens of millions of dollars, but the effect on nV's other cards might be enough to make it worth it, like a PR/ad campaign.

Quote:
It's even more interesting to see whether the GT300 is a still bigger chip, and whether ATI's next generation really moves further toward multi-chip design.


Too far out to tell really, but it's unlikely that the true heir to the GTX2xx line will be similar to what it is now. And both ATi and nV will likely have to compete with Intel at that point in time.
May 30, 2008 6:01:17 PM

Remember when everyone thought that maybe the G280 was 55nm? I think FUaD wrote about nVidia doing both 65 and 55nm in hopes that at least one would work, going 65 first but having 55 right behind it. It's all FUD and INQ though, so who really knows, but I'm pretty certain that ATI's much smaller dies are going to go a lot easier than the huge monolithic G280.
May 31, 2008 9:43:19 AM

JAYDEEJOHN said:
I'm pretty certain that ATI's much smaller dies are going to go a lot easier than the huge monolithic G280


It should, and at least less is going to be wasted when something goes wrong...
May 31, 2008 9:51:42 AM

TheGreatGrapeApe said:

The kind you buy from some guy in Las Vegas; remember, they lost money on the GF8800GT.


Didn't know that... I really thought it was a good and profitable product. So there was so much pressure from ATI that they had to keep the price low?

Quote:
The GTX280 is only an economically sensible product if it contributes to the bottom line. The funny thing is that its biggest benefit will likely not be direct, but the halo effect if it is the top performer. It may lose tens of millions of dollars, but the effect on nV's other cards might be enough to make it worth it, like a PR/ad campaign.


Yep, that's very true. Flagship products have always been more like a good PR campaign to support the sales of lower-end products.

Do you have any idea how well ATI's low-end products sold when the 9700 and 9800 were out, and how Nvidia managed when the 8800 series was (and still is) at the top? It would be interesting reading. How much does it really matter?


May 31, 2008 10:22:24 AM

From what I understand, the G92b's will have to fend off the 4xxx series, because even after the die shrink the G2xx die will still be as large as the G80's, which were hot cards that still drew a lot of power. So in a physical sense, it'll be a while before they can even compete top to bottom. And then they have this huge die to sell. I mean, the die in the G280 is one of the largest ever made, and yes, more to go wrong, but all processor makers include some overhead just in case, which is why after they get the bugs out and the yields go up, they have extra "room" for the refresh.
May 31, 2008 10:22:48 AM

Ycon said:
But you do know that you just quoted the INQ?


Sure, but I recommended a shaker of salt with that snack. When I link to the Inquirer, it's always with a caveat. Still, in a world where fanboys go by biased sites that skew towards one company or the other, it's nice having a total rumor mill that disrespects everybody equally and never gets invited to the marketing parties.

That way you can bet that disgruntled insiders under NDAs will leak info that's at least partially correct, as opposed to marketing mishegoss that hides whatever inconvenient truths are out there.

TheGreatGrapeApe said:
The GTX280 is only an economically sensible product if it contributes to the bottom line. The funny thing is that its biggest benefit will likely not be direct, but the halo effect if it is the top performer. It may lose tens of millions of dollars, but the effect on nV's other cards might be enough to make it worth it, like a PR/ad campaign.


The halo effect has always bothered me. It's an irrational response to marketing at its best. I had a Radeon AIW 9800 Pro and it was great. The Radeon 9200 wasn't great; it was a modified last-gen 8500 (though it often worked better than Nvidia's FX 5200).

What bothers me more than the halo effect is the way Nvidia tweaks drivers for demo benchmarks of soon-to-be-released games. The Crysis demo water comes to mind; I wonder how many cards that sold? Also, the 9600 in SLI on specific Nvidia chipset boards with a boosted PCIe x16 bus "beating" a 3870X2.

ATI hadn't done anything like that for quite a while, but Nvidia kept it up, and first benchmarks sell cards faster than the halo effect; just because an 8800 Ultra wins the day, no gamer should trust an 8600GT in DX10. Benchmarks that aren't transparent about how boards or cards are tweaked, or that are done with outdated drivers cherry-picked to make the other company's cards look like losers -- all things Nvidia has done recently.

Now both ATI and Nvidia are reported to be doing dodgy things with Futuremark Vantage, with drivers that show one particular card in each lineup having amazing performance that doesn't fit the pattern of the rest of their cards.

I hate that kind of stuff.
May 31, 2008 2:40:30 PM

hannibal said:
Didn't know that... I really thought it was a good and profitable product. So there was so much pressure from ATI that they had to keep the price low?


Yeah, and yields from the G92 were rather poor according to nV's CFO.

I'm sure that without pressure from ATi they would've charged more for them, and likely would've profited more, but it ended up losing money for them due to a combination of both issues.

Quote:
Do you have any idea how well ATI's low-end products sold when the 9700 and 9800 were out, and how Nvidia managed when the 8800 series was (and still is) at the top? It would be interesting reading. How much does it really matter?


Yeah, I don't know, but the FX sucked at that time and the FX5200 still sold well. The R9000 of course was an odd fish in that it wasn't like the R9700, but then again neither was the MX series, which was still being sold.

I'm certain that the X1900 helped sell a bunch of X1600s despite the GF7600 being a bit better at the time. Then with the GF8800, the HD2600 is arguably slightly better in its laptop form, but the GF8600M still gets the halo effect. On the desktop I'd argue it's tougher (due to higher clocks), where the GF8600 is a tighter race, and most people look at the AA results even though these cards aren't really meant for AA, so it'd be tough to say what's a halo effect and what's just a well-researched purchase.
May 31, 2008 2:51:07 PM

yipsl said:

The halo effect has always bothered me. It's an irrational response to marketing at its best. I had a Radeon AIW 9800 Pro and it was great. The Radeon 9200 wasn't great; it was a modified last-gen 8500 (though it often worked better than Nvidia's FX 5200).


Yeah but that's the market place unfortunately.
No one educates themselves about their purchases despite having the ease of the internet.

The floptimization is a bit annoying, although I don't care about the Vantage issue as much since it is still rendering correctly. The thing that bothers me is something I've mentioned before, and Charlie even mentioned himself: either company will do a launch where they cut the price to get attention but never intended to ship enough product, just so that all the launch reviews say card X had better price/performance, despite the fact that it never really was available for that price until long afterwards, when it was on the way out.

Both companies take advantage of these tactics and reviewers need to get more critical about it.
May 31, 2008 4:14:10 PM

I wonder how bad the yield is for the GT200 chip, as Nvidia had to use 65nm at first as a backup plan to the 55nm. Would the 55nm really be that much cooler, given the massive size of the GT200?
May 31, 2008 5:05:49 PM

You're looking at roughly a 20% die shrink from the original, plus hopefully a better process at the end, so it should make a good difference in yields as well as heat. It depends on what nVidia does with it. You could even shrink the transistor count if you get really good yields, eliminating a little overhead, which would let you attain slightly higher clocks and save on thermals, or push clocks a lot higher and keep the higher thermals, depending on whether the new process allowed for it and it was needed at the time due to competition.
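As a rough sanity check on that shrink figure (a back-of-the-envelope sketch: ideal area scales with the square of the linear feature size, and the ~576 mm² GT200 die size is the commonly reported figure, taken here as an assumption):

```python
# Rough die-area scaling for a 65nm -> 55nm optical shrink.
# Ideal area scales with the square of the linear feature size;
# real chips shrink less because pads and analog blocks don't scale.
old_nm, new_nm = 65, 55
area_ratio = (new_nm / old_nm) ** 2   # shrunk area / original area

print(f"ideal area ratio: {area_ratio:.2f}")      # ~0.72
print(f"ideal area saved: {1 - area_ratio:.0%}")  # ~28%

# Applied to a ~576 mm^2 GT200-class die (reported figure,
# treated here as an assumption, not a datasheet value):
print(f"shrunk die: {576 * area_ratio:.0f} mm^2")
```

Real shrinks save less than the ideal ~28% because not everything on the die scales, which is roughly where the 20% estimate above lands.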
May 31, 2008 5:15:20 PM

Yeah, that's pretty annoying, I must admit. I had only had nVidia cards until my first ATI Radeon HD2600XT. I just loved it. I saw the image quality improvement with the naked eye, so it's not just bogus, and since then I can call myself an ATI fan. I'm starting to hate nVidia for the driver tweaking stuff, but they've been doing that for a long time, not only for the Crysis benchmark. All the 3DMarks are examples of how nVidia collaborates with developers so their cards have a little advantage.

Another thing is that nVidia is sponsoring so many games that it's just obvious there will be some bias when it comes to performance.
I know that business is a dirty word, but that's the reality.

I am sure if ATI could do that, they would too, but on a short stack it's difficult to place bets.

If you are a game developer and, for example, nVidia sponsors a game with $1 million while the whole game budget is $5-6 million, then it's a LARGE sponsorship, so the developer will make every possible enhancement for the nVidia architecture and not bother at all with ATI (well, maybe just a little, so the game doesn't crash and such). Then ATI is left behind and needs to optimize their drivers for the actual game.

The world is not perfect at all after all :) 
May 31, 2008 5:32:13 PM

ATI is partly to blame for this as well. They have new people on this currently, so we should see improvements in this area. Just like Intel's influence in the CPU arena, nVidia has its influence. I hear a lot of Intel fanbois claiming this and that about Intel's Larrabee, but I remind them that nVidia already has this influence in the GPU market, so this is going to be interesting. Assassin's Creed has given up on DX10.1 and won't reimplement it. I think the heat's going to be on the devs once Intel enters the fray, and things will improve. That, and we need a better console, because these newer cards outpace current consoles so much that it inhibits game development itself, so nVidia's influence is sometimes welcome too.
May 31, 2008 6:17:03 PM

GTX280 or 4870CF? ....

We'll know when Computex starts on Monday.

I'm dying to read Thunderman's comments on this though :p 
May 31, 2008 8:57:47 PM

In the long run, the big problem could be that Intel kills both ATI and Nvidia. For now there at least seems to be a fluctuating balance... but fortunately that seems to be far in the future.
They have such big production capacity and resources. A monopoly is never a good thing.
June 1, 2008 3:59:04 AM

Back to all this talk of Nvidia's lies: you can't forget the way they blurred the image quality on all the GF7xxx cards for better frame rates, because the X1800, X1900 & X1950 were thrashing the 7800, 7900 & 7950.
June 1, 2008 4:03:25 AM

Shouldn't ATI do the same now? If I'm to pick, I wouldn't pick a card that offers 10% better image quality but 20% less frame rate. So, in short: performance above quality.