
GTX 350 vs 4870X2

July 21, 2008 3:45:44 PM

Which one would run the almighty Crysis at higher frame rates at the highest settings possible?

This is the battle between ATI's fastest card and Nvidia's fastest card.
I'd go with the 350 all the way. With the GTX350 we could probably finally run Crysis at 60fps or better at the highest possible settings on DX10.


July 21, 2008 3:52:28 PM

Nvidia, and it's not called the GTX 350 :D 

Don't know exactly how fast, but Crysis runs smooth @ 1080p Very High.
July 21, 2008 3:58:46 PM

I won't believe it until the card is actually out. Consider the number of times fake claims have been made about "future" cards' performance.
July 21, 2008 4:17:49 PM

Nice piece of fiction this is. Somebody make this a sticky.
July 21, 2008 4:34:28 PM

What the heck is a "GTX350"?!?!?!?!
July 21, 2008 4:36:25 PM

How about GTX350 vs 5870x2 ;-)
It's better to compare two dream cards than one dream card and one real...
July 21, 2008 4:37:12 PM

hannibal said:
How about GTX350 vs 5870x2 ;-)
It's better to compare two dream cards than one dream card and one real...


The 5870X2 won't exist until 2010, while Nvidia plans to launch the 350 this year.
July 21, 2008 4:39:33 PM

The 5870 is due Q1 2009, so I would guess the 5870X2 will also be Q1, or Q2 at the latest.
What is the GTX350 and when is it expected?
July 21, 2008 4:40:07 PM

Quote:
while Nvidia plans to launch 350 this year



^^where did you hear that from?
July 21, 2008 4:46:17 PM

I'm going to make my own video card out of peas and carrots. It will be faster than your dream cards after it is overclocked.
July 21, 2008 4:47:33 PM

The carrots draw too much power though - celery is a better choice.
July 21, 2008 4:50:08 PM

Celery doesn't overclock as well; it gets soggy at 6GHz. Besides, I'm going for performance, screw the environment.
July 21, 2008 4:52:50 PM

What about Fusion version 2? It will have 4 built-in 6870 GPU cores with 4 Phenom 3 cores! The next step up from there will be freaking Skynet chips, man! Then it's Game Over, man!
July 21, 2008 4:53:03 PM

Yes, I agree, the 5870 is due Q1 2009 based on announcements. The GTX350 is still a myth; no announcements yet. The 4870X2 is due next month, but engineering samples already exist as of this time, and no doubt it is the current fastest card available.
July 21, 2008 5:00:24 PM

What happened to Fusion version 1?
July 21, 2008 5:10:42 PM

don't ask....
July 21, 2008 5:12:58 PM

The rumored GTX350 is so impractical there is no way it will be made anytime soon.
July 21, 2008 5:29:16 PM

Quote:
The rumored GTX350 is so impractical there is no way it will be made anytime soon.


It's so impractical Nvidia probably has engineering samples that come with an onboard PSU and require it to be plugged directly into the wall socket.

Actually, they'll probably just buy out all the 4870X2s, slap on a few stickers, and charge us 900 USD for it. The Nvidia fanbois will drown them in praise for making such an amazing card, and Crysis will release a new patch which optimizes gameplay on ATI cards. All of a sudden the mythological GTX350 will hit 60 FPS in Crysis on Very High, and the ATI 4870X2 will do the same.

The ATI people will laugh and laugh because they paid $400 less, and the Nvidia fanbois will be blown away by the performance of their new "Nvidia" card.

/truestory
July 21, 2008 5:40:02 PM

Should we use nuclear power for these cards, or wait until cold fusion is perfected?

Seriously, comparing two cards that don't exist is quite difficult and meaningless.

Is the 4870 x2 out yet? I haven't heard anything about it.
July 21, 2008 5:43:47 PM

:lol:  some people like playing Star Wars
July 21, 2008 5:50:35 PM

kelfen said:
:lol:  some people like playing Star Wars


LikeD... missing a key letter there.

And this is still the most amusing thread I've read in quite a while. People like concrum help you remember that even retards can access the internet.

July 21, 2008 6:03:40 PM

cokenbeer said:
And this is still the most amusing thread I've read in quite a while. People like concrum help you remember that even retards can access the internet.

Stick around here and you will be reminded of that fact on a daily basis :ange: 
July 21, 2008 6:25:59 PM

homerdog said:
Stick around here and you will be reminded of that fact on a daily basis :ange: 


ATI vs Nvidia and Intel vs AMD will always prove that idiots are getting on the internet.

Maybe they made it too EZ...

Damn you Intel for releasing teh PIII which "improved" the internet LOL
July 21, 2008 6:46:19 PM

I blame Al Gore for that invention he calls the internet.
July 21, 2008 7:02:02 PM

The GTX 350 has a hamster wheel attached to the card by the PCIe plugs. I've had the card a week and gone through 5 hamsters.

I guess urine and electricity don't mix. My house smells of burnt hair.
July 21, 2008 7:47:33 PM

szwaba67 said:
I'm going to make my own video card out of peas and carrots. It will be faster than your dream cards after it is overclocked.


You green veggie fanbois make me puke. Team Red (meat) FTW!
July 21, 2008 8:19:52 PM

Yeah, I've got a direct feed from the power grid. Gets my ATvidia PheForce Radore Duo X4 HD 9955.02 to 10 GHz easy. And it just takes a water main from Antarctica to cool the whole case. But with global warming and all, I have to add a bit of liquid nitrogen every so often. Crysis runs buttery smooth @ 2560 x 1600, 8xAA, 16xAF. I feel sorry for all you noobs.
July 21, 2008 8:42:27 PM

1haplo said:
The GTX 350 has a hamster wheel attached to the card by the PCIe plugs. I've had the card a week and gone through 5 hamsters.

I guess urine and electricity don't mix. My house smells of burnt hair.


I'd use mice... they are more powerful than hamsters... maybe even rats :o 
July 21, 2008 9:13:23 PM

This guy is just polluting the forums. I just got a brand new 6870 X856... 2THz core, 270GB memory @ 15e100Hz... and 10 trillion shaders. It only uses 2 megawatts of power, not too bad, as the electric company just gives you a massive service instead of your tiny 100 amp panel. Maybe this will play Crysis... Maybe I should start a hypothetical garbage post about it and start a big fire... -100 to Concrum.
July 21, 2008 10:20:34 PM

Are there moderators on these forums?
July 21, 2008 10:23:26 PM

jcorqian said:
Are there moderators on these forums?


No. They are too busy storing jiggawatts to power their GTX 350s.
July 21, 2008 10:28:39 PM

^ Can't wait to get my flux capacitor installed in one of my extra PCI slots.
July 21, 2008 10:35:50 PM

^^Wouldn't want to get hit by an inverse tachyon beam while installing one of those; it stings, badly :/ 
July 21, 2008 10:37:13 PM

At least here in Canada, it will cut down on heating costs.
July 22, 2008 1:07:35 AM

S3 Excalibur in Multi-Chrome FTW.

Powered by dual infinite improbability drives (is that infinity squared or infinity x 2? Never sure how Multi-Chrome works).
July 22, 2008 3:38:27 AM

We don't even know if this "350 GTX" exists or not. To me the specs sound way off, too good to be true. Even if it's true, why would NVIDIA name it "350 GTX" instead of "380 GTX"? Who knows?
July 22, 2008 3:40:22 AM

I still think my 6870 X856 PWNS ALL. Crysis on 100 30" monitors. Ya, I got 'em, better than my Tri-SLI GTX 350... I forgot to mention I have a few of those too. I only get 2 30" screens with Crysis playing at smooth frames... :D 
July 22, 2008 11:21:21 AM

mathiasschnell said:
Yeah, I've got a direct feed from the power grid. Gets my ATvidia PheForce Radore Duo X4 HD 9955.02 to 10 GHz easy. And it just takes a water main from Antarctica to cool the whole case. But with global warming and all, I have to add a bit of liquid nitrogen every so often. Crysis runs buttery smooth @ 2560 x 1600, 8xAA, 16xAF. I feel sorry for all you noobs.


I have a space shuttle regularly bringing in shipments of ice crystals that I buy off ET on Jupiter. With that I can bring core temps down to a 165C idle.

My GTX 350 was a bargain too... made a deal with Nvidia and it only cost me a hundred grand and a dead ATI marketer.
July 22, 2008 12:36:53 PM

Ape, does the box have that relaxing "Don't Panic" printed on it? ;) 
July 22, 2008 12:41:11 PM

Ape, a double improbability drive will manage to power an improbable GFX card, I guess. :p  (Zaphod would think so. :p )
July 22, 2008 12:50:26 PM

Ape, but you should be careful with it:

"An earlier attempt at using the improbability drive, Starship Titanic, was also mentioned. In theory, the infinite improbability drive would make it infinitely improbable that anything would go wrong. It was not successful, however, ending in a "Spontaneous Massive Existence Failure." This was because, in these earlier times when the nature of improbability was less well understood, it was not appreciated that any event that is infinitely improbable will, by definition, occur almost immediately."
July 22, 2008 3:48:50 PM

Actually it's the manual that has Don't Panic printed on the back; the box has a few lines of Vogon poetry to keep away the mildly curious.
July 22, 2008 3:59:01 PM

Well, I just got my Matrox Beamer for my rig. It's powered by 4 zero point modules. It actually doesn't play games and such; it just beams whatever I like onto the screen. Talk about reality!!!! Woooo HHHHooooo Matrox FTW!!!!
July 22, 2008 4:09:11 PM

TheGreatGrapeApe said:
Actually it's the manual that has Don't Panic printed on the back; the box has a few lines of Vogon poetry to keep away the mildly curious.


I'm ready to play Crysis, but dude, where is my towel??? Did I leave home without my towel?
July 22, 2008 4:49:45 PM

szwaba67 said:
I'm going to make my own video card out of peas and carrots. It will be faster than your dream cards after it is overclocked.


Oh yeah? I'm going to make mine out of spinach, and it will be faster and more powerful than anything mankind can dream of, and it shall be called "Popeye".
July 22, 2008 4:50:25 PM

Oh dude! Never leave your towel behind!!!!!!

BTW, I wish at least one manual had those printed on the cover. Really.
July 22, 2008 6:08:27 PM

Yeah I think it's best policy, especially for tech, Don't Panic... and always have a towel. :sol: 
July 22, 2008 6:12:14 PM

I just contacted Chuck Norris; he will be powering my next video card....

"There is no 'ctrl' button on Chuck Norris's computer. Chuck Norris is always in control."

chucknorrisfacts.com