Phenom II 955 and 965

March 19, 2010 11:42:25 AM

Phenom II 955 and 965

Which will perform better in games?


March 19, 2010 3:36:05 PM

Both. The only difference between them is 200MHz, which you can get from the 955 by raising the multiplier.

Now, remember that gaming performance also depends on the GPU and RAM, not only the CPU.
March 19, 2010 3:44:17 PM

They are the same core, with the 965 being one multiplier step higher than the 955. OC them and there won't really be a difference (other than that the 965 might get a slightly higher max OC).
March 19, 2010 8:28:40 PM

The Phenom II 965 only has a 6.25% higher clock speed than the Phenom II 955, and real-world results won't even show a 6% gain in most tasks, since performance doesn't scale linearly with clock speed. I would say in most cases you won't see more than a 1-2fps difference.
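To put a rough number on that (just a back-of-the-envelope sketch; the 60fps baseline and the share of frame time spent on the CPU are assumptions, not measurements):

    # rough sketch: what a 6.25% CPU clock bump might buy in a partly CPU-bound game
    clock_955, clock_965 = 3.2, 3.4           # GHz, stock clocks
    clock_gain = clock_965 / clock_955 - 1    # 0.0625 -> 6.25%
    fps = 60.0                                # assumed baseline frame rate
    cpu_share = 0.3                           # assume ~30% of frame time is CPU work
    frame_ms = 1000 / fps
    cpu_ms, gpu_ms = frame_ms * cpu_share, frame_ms * (1 - cpu_share)
    new_fps = 1000 / (gpu_ms + cpu_ms / (1 + clock_gain))
    print(round(new_fps - fps, 1))            # ~1.1 fps, in line with the 1-2fps estimate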
March 20, 2010 3:37:33 AM

As everyone said earlier, the 965 is, effectively, a factory OC'ed 955. Just go into the BIOS, raise the multiplier by 1, and boom! You have a 965.

Most games scale with the GPU instead of the CPU. So a slightly better graphics card will give you better performance as opposed to a slightly better CPU.
March 20, 2010 4:05:53 AM

Both the 955 and 965 are Black Edition, which means they have unlocked multipliers.

The 965 is no more than a 955 with the default multiplier increased by one.
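To spell out the arithmetic (a quick sketch, assuming the stock 200MHz reference clock these chips use):

    # Phenom II core clock is just reference clock x multiplier
    ref_mhz = 200                 # stock reference (HT) clock
    mult_955, mult_965 = 16, 17   # stock multipliers
    print(ref_mhz * mult_955)     # 3200 MHz -> the 955's 3.2GHz
    print(ref_mhz * mult_965)     # 3400 MHz -> the 965's 3.4GHz
    # so one multiplier step on the unlocked 955 gives exactly the 965's stock clock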
March 20, 2010 6:26:42 AM

king game said:
Phenom II 955 and 965

Which will perform better in games?


The answer is: both.

Which games are you playing? If you're on a limited budget, skip the quad-core, go dual-core, and use the extra cash to get a better video card.
March 20, 2010 3:03:52 PM

One good thing is that right now Newegg has combo deals on the 965, so you'll save back the extra cash you spend on it. The deal I chose was:

1. AMD Phenom II X4 965 Black Edition Deneb 3.4GHz Socket AM3 125W Quad-Core Processor Model HDZ965FBGMBOX - Retail
Item #: N82E16819103727
Return Policy: CPU Replacement Only Return Policy

2. GIGABYTE GA-790FXTA-UD5 AM3 AMD 790FX SATA 6Gb/s USB 3.0 ATX AMD Motherboard - Retail
Item #: N82E16813128415
Return Policy: Standard Return Policy

3. AMD Gift - Call of Duty Modern Warfare 2 Coupon - Retail
Item #: N82E16800995090
March 20, 2010 4:08:07 PM

That's a nice combo. How much is it, though? If there are combos making the 965 a similar price to what you'd pay for a 955, there's no reason not to. :-)
March 20, 2010 5:38:15 PM

Didn't they fix the 965 so it has the same wattage (TDP) as the 955 at stock? That's a plus, but still not enough to warrant it unless you plan on max overclocking or running at stock.
March 20, 2010 5:51:58 PM

Yes, Raidur - the 955 and 965 have the same 125W TDP at stock settings. How they managed to do that, or if they just claim it, I do not know... But both the 955 and 965 "M"BOX versions are C3-Stepping, so that's not it.
March 20, 2010 5:56:37 PM

I'm guessing that they also improved the 955, but just kept the voltage the same and didn't change the TDP rating. Really, they were just able to clock them at a higher speed without increasing voltages. So most likely that means you could probably undervolt the 955 at stock settings with the C3 stepping and still run at the spec speeds.
March 21, 2010 8:32:36 AM

The original 965 C2 had a TDP of 140W, but it never really ran all that much hotter than a 955 C2 (125W TDP). The C3 steppings have much-improved IMCs (memory controllers), so RAM can run 1600-1800MHz stable. Also, the C3 steppings use less voltage for higher clocks, meaning lower temps.

Between the 955 and 965, the 955 uses a stock voltage of 1.35V for 3.2GHz (up to 3.6GHz on stock voltage) and the 965 uses a stock voltage of 1.4V for 3.4GHz (up to 3.7-3.8GHz on stock voltage). The 965 C3 has an easier time getting 4-4.2GHz stable, though I haven't read too much about the 955 C3, so it could be the same.

If money isn't an issue, I would go for the 965 C3, simply because it's a much better overclocker and has a higher potential to reach 4+GHz stable. If you want to save money, go for the 955 C3 and use the spare $25 to get yourself an aftermarket CPU cooler or spend more on your GPU, which is more important for gaming.

Just remember to look at the model number like RazberyBandit said, and make sure it ends with "....MBOX" instead of "....IBOX", as the "IBOX" versions are the C2 stepping. Here are links to the C3 steppings of the 955 and 965:

955 C3 stepping Link

965 C3 stepping Link
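If you want a quick way to double-check a part number, here's a tiny illustrative sketch (the rule of thumb from above being that the C3 retail boxes end in MBOX and the C2 ones in IBOX):

    # illustrative sketch: spot the C3 stepping from the boxed part number
    def looks_like_c3(model: str) -> bool:
        return model.strip().upper().endswith("MBOX")

    print(looks_like_c3("HDZ965FBGMBOX"))   # True  (MBOX pattern, i.e. C3)
    print(looks_like_c3("HDZ965FBGIBOX"))   # False (IBOX pattern, i.e. C2)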

Hope I helped! :D 
March 21, 2010 2:13:05 PM

I plan on getting a 5870 or 5850 because I use a 24" screen now and my GPU struggles in Crysis.

At 1280x1024 I can play Crysis perfectly.

But I think this should be good:

955 + HD5870

But I will only upgrade after a few months.
March 21, 2010 10:12:03 PM

king game said:
I plan on getting a 5870 or 5850 because I use a 24" screen now and my GPU struggles in Crysis.

At 1280x1024 I can play Crysis perfectly.

But I think this should be good:

955 + HD5870

But I will only upgrade after a few months.

Sounds good to me, but I wouldn't upgrade anything else in the next few months once you have your 955+HD5870; that's already overkill for most games. Unless you've got a lot of money, it should last you at least a year or two before you have to upgrade.
March 22, 2010 6:29:35 AM

Man with Metro 2033 out it makes me damn glad to have my overpowered GPU setup. That damn game is beautiful.

Makes me start to want my 5870 crossfire setup now that my current ones get pwned during extreme lighting/explosions in metro. (Crysis = pwnt DX9 High 1920x1080 60+FPS, 8XAA 30-50FPS)

Hopefully I'll be happy enough when I sell my 4870 for a 2nd 4890. We'll see... =/ It'll be hard to convince myself to spend $800 because I know I won't be happy going to just a single 5870.

955+5870 = Prime gaming setup these days.
March 22, 2010 6:39:07 AM

king game said:
Phenom II 955 and 965

Which will perform better in games?


It depends on your graphics card. If you have an NVIDIA card you might want to enable PhysX to take stress off the CPU, so the graphics card will take some of the workload. But as far as your question goes, both will game fine; I've seen some old P4 3.20GHz chips keeping up with some dual-cores.
March 22, 2010 8:08:02 AM

SHANEMARINES said:
It depends on your graphics card. If you have an NVIDIA card you might want to enable PhysX to take stress off the CPU, so the graphics card will take some of the workload. But as far as your question goes, both will game fine; I've seen some old P4 3.20GHz chips keeping up with some dual-cores.

PhysX doesn't do wonders. Sure, it helps for older CPUs/GPUs, but keep in mind it only helps in games that support PhysX. If the person already has a high-end Nvidia/ATI GPU, it's not gonna help at all.
March 22, 2010 8:15:32 AM

Raidur said:
Man with Metro 2033 out it makes me damn glad to have my overpowered GPU setup. That damn game is beautiful.

Makes me start to want my 5870 crossfire setup now that my current ones get pwned during extreme lighting/explosions in metro. (Crysis = pwnt DX9 High 1920x1080 60+FPS, 8XAA 30-50FPS)

Hopefully I'll be happy enough when I sell my 4870 for a 2nd 4890. We'll see... =/ It'll be hard to convince myself to spend $800 because I know I won't be happy going to just a single 5870.

955+5870 = Prime gaming setup these days.


Used to be 955+4890 just 6 months ago. -sigh- The times go by too fast sometimes. Does Metro 2033 really use up that much power? For a game to make a 4870 xfire setup stutter, just seems impossible.
March 22, 2010 8:55:03 AM

He's got a mixed CrossFire setup - 4890 + 4870. It should actually outperform a dual-4870 setup slightly. Either way, I too find it hard to believe he's got problems running any game... But, Metro 2033 is new so the driver package / Crossfire profile may need some tuning. As a single 4890 owner, and with their availability dwindling, I've wondered about adding a 4870 for CrossFire myself. I'm just reluctant, as I'd rather have two 4890's.

I view PhysX as somewhat of a gimmick. There really aren't that many titles that support it. Those that do will only show major performance gains if there's a dedicated PhysX card, as running the physics and the game's normal graphics simultaneously on a single GPU can cause performance hits.

Xbit-Labs did a nice comparison between the eVGA GTX-275 Co-Op and the HD5850. The 275 Co-Op is built similar to a GTX-295, but the 2nd GPU is actually a GTS-250 G92b GPU that only runs PhysX. If the game doesn't support PhysX, then the 2nd GPU does nothing. The GTX-275 GPU within it is underclocked some, too, so its performance isn't the same as a standard one. Anyway, here's the link:
http://www.xbitlabs.com/articles/video/display/evga-gtx...
March 22, 2010 10:05:05 AM

Ah, yeah, I know he was running a 4890/4870 combo, but it's closer to a 4870 crossfire setup than anything else, so that's what I said.

I've got a single 4890 too and it's the Sapphire Toxic edition (960 core/1050 mem at stock), so I don't want its 1025/1111 OC clocks to go down to a 4870's lower clocks. It wouldn't do it justice and I would have wasted the extra money I spent on this card over the stock/less factory-OCed cards. Hopefully I'll be able to find another Toxic on a forum or eBay IF I ever need that second card.

Good job bringing up the dedicated PhysX setups. I forgot about them entirely, but they are the exception where PhysX will actually show a significant improvement in games. The only drawback is that you need at least a dedicated 8800/9800GT/GTX250 to effectively help out the higher-end GPUs.
March 22, 2010 10:20:30 AM

I wish my XFX 4890 would OC that high. It's a ZDEC model with 875/975 stock settings. The highest successful OC I've managed is 930/1090, which is decent, but it's nowhere near your Sapphire's speeds. I've yet to bother with voltage or BIOS modding it... XFX is pretty liberal with their warranty, permitting overclocking and 3rd-party cooler installations. However, I'm pretty certain actually modding the BIOS would void it.
March 22, 2010 10:31:49 AM

The thing is my Toxic runs at a default stock voltage of 1.4Vcore/2.1Vmem, so I'm probably using more voltage than you are. I've tried running it at 1.5Vcore/2.3Vmem and it can OC as high as 1060/1150, but I just didn't like how my memory's temps would reach 90°C+ during load.

I was thinking about using the Atomic (1000/1050) BIOS, but I haven't learned how to do GPU BIOS flashes yet. Many people flash their GPU's BIOS, and it's possible to reflash it back to its original BIOS in case you were thinking about sending it back under warranty or for some other reason.
March 22, 2010 10:43:06 AM

They'll both perform equally well. That's why the 955 is the better buy of the two. All CPUs have to do these days is be fast enough not to bottleneck the GPUs when it comes to gaming. Even the Athlon II X4 2.6GHz is fast enough not to bottleneck almost all GPUs so I wouldn't worry about it. :sol: 
March 22, 2010 10:49:07 AM

LOL. Mine runs at 1.3125V core/1.2V memory, according to GPU-Z. Time to download RBE from techpowerup so I can actually see the numbers within the BIOS for myself, finally. I might even tune-up the fan's settings to something more aggressive. I've grown tired of manually setting it to 50-70% while gaming to keep it under 80C.

TPU's Radeon BIOS Editor
March 22, 2010 3:50:00 PM

kokin said:
PhysX doesn't do wonders. Sure, it helps for older CPUs/GPUs, but keep in mind it only helps in games that support PhysX. If the person already has a high-end Nvidia/ATI GPU, it's not gonna help at all.


I was just putting a thought out there because I have a pretty high-end card (a GTX 260), but I only have an AMD Athlon II dual-core @ 3.0GHz.
March 22, 2010 6:55:20 PM

PhysX only works on a few specific games, and doesn't really do all that much. Most games are gonna be more GPU-limited than CPU-limited today anyway. PhysX is really mostly a gimmick: it does cool things on a couple of titles, but nothing that you can't do without. With prices and performance levels out there right now, ATI cards are a much better buy (for sure until Fermi releases). For the question here, the 955 or 965, PhysX makes zero difference for the CPU; they both have plenty of power for CPU physics calculations.
March 22, 2010 9:14:04 PM

flyinfinni said:
PhysX only works on a few specific games, and doesn't really do all that much. Most games are gonna be more GPU-limited than CPU-limited today anyway. PhysX is really mostly a gimmick: it does cool things on a couple of titles, but nothing that you can't do without. With prices and performance levels out there right now, ATI cards are a much better buy (for sure until Fermi releases). For the question here, the 955 or 965, PhysX makes zero difference for the CPU; they both have plenty of power for CPU physics calculations.


I like PhysX because my GPU is built on a smaller nanometer process than my CPU, and the smaller process is better at handling heat under program stress.
March 22, 2010 9:36:30 PM

Interesting. I'm not sure how that works, but I would guess you have a fairly unique situation then, compared to most of us.
March 22, 2010 9:51:43 PM

flyinfinni said:
Interesting. I'm not sure how that works, but I would guess you have a fairly unique situation then, compared to most of us.


I should have edited it because, the fact is, I'm another (CRYSIS FANBOY). Crysis is one of the only games that really takes full advantage of NVIDIA PhysX and Windows Vista/7 DirectX 10. What a BeAuTiFuL game if you can run it properly.
March 23, 2010 3:11:26 AM

SHANEMARINES said:
I should have edited it because, the fact is, I'm another (CRYSIS FANBOY). Crysis is one of the only games that really takes full advantage of NVIDIA PhysX and Windows Vista/7 DirectX 10. What a BeAuTiFuL game if you can run it properly.

That's why we're all drooling for the release of Crysis 2. :) 

@Razbery, your GPU runs pretty hot! :o  I have never seen my load temps over 65°C (aside from Furmark), and that's with fans set to auto (38-40%). I usually use 60% fan speed for gaming and see temps in the high 50s °C.
March 23, 2010 3:32:35 AM

SHANEMARINES said:
I should have edited it because, the fact is, I'm another (CRYSIS FANBOY). Crysis is one of the only games that really takes full advantage of NVIDIA PhysX and Windows Vista/7 DirectX 10. What a BeAuTiFuL game if you can run it properly.


What the hell are you on? Crysis does not use PhysX; they used their own proprietary physics that ran on the CPU.
physics != PhysX

NVidia PhysX list, notice how Crysis is not on there

BTW, Metro 2033 is GPU-intensive; I have 2 x 4870 and play on high to get good fps.

@OP, if you're still looking at this thread, get the 955, it's just as fast.
March 23, 2010 4:20:52 AM

mindless728 said:
What the hell are you on? Crysis does not use PhysX; they used their own proprietary physics that ran on the CPU.
physics != PhysX

NVidia PhysX list, notice how Crysis is not on there

BTW, Metro 2033 is GPU-intensive; I have 2 x 4870 and play on high to get good fps.

@OP, if you're still looking at this thread, get the 955, it's just as fast.


Actually, Crysis is right there in my NVIDIA Control Panel settings for PhysX. I mean, if you've ever played the game, you will notice a huge difference when running the GPU/CPU benchmark built into Crysis's Bin32 folder. Also, if you disable/enable PhysX in the NVIDIA Control Panel, then go in-game in Crysis, press the "~" key, and type r_displayinfo 1, there will be a huge difference between having PhysX enabled and disabled. I'm sorry, I don't really like to argue, but I'm pretty sure I'm right about this one; my whole computer goal has been to run the original Crysis at DX10, 1920x1080, Very High settings. I finally almost got there with a:

GTX 260
AMD Athlon II X2 @ 3.0GHz
2 x 1GB DDR2 plus 2 x 512MB DDR2, all four DIMMs in dual channel = 3GB.
Gigabyte MA770-UD3

So I'm really happy with my build so far.
Anyway, if I am wrong I'm sorry; I just want to lay out the facts as I have experienced them through Crysis. Thanks, though, for the post.


Update: I did some research and you're right and I'm wrong (Crysis does not support PhysX). I'm wondering if somehow the GTX 260 is still taking some of the CPU workload?

Because I know when I ran two GTS 250s in SLI, they never stood a chance against my one GTX 260 that I bought from Best Buy.

I'm sorry, I was wrong.
March 23, 2010 12:15:28 PM

Sorry for the hi-jacking, King Game.

kokin said:
That's why we're all drooling for the release of Crysis 2. :) 

@Razbery, your GPU runs pretty hot! :o  I have never seen my load temps over 65°C (aside from Furmark), and that's with fans set to auto (38-40%). I usually use 60% fan speed for gaming and see temps in the high 50s °C.

Damn... Now I not only have OC envy, but cooling envy, too. And yeah, I know mine's hot. If I set the fan to auto, I'm lucky if it ever spins up to 50% under load, and at idle it sits still. My card's reference cooler doesn't have the usual "squirrel-cage" fan. Instead, it has a 7-bladed fan of the same size. I dunno if that really makes a difference, though.

I've been tempted to take it apart and apply my own thermal grease to see if that helps at all. I've got AS Ceramique, ZEROTherm ZT-100, and Cooler Master Thermal Fusion at my disposal... I suppose it would make for a nice comparison of the three. LOL I've also considered installing an aftermarket cooler, such as the Thermalright T-Rad2 GTX w/ its accompanying VRM2. XFX's warranty terms permit me to do this and the TIM swap, but it's quite a hassle and expense to go through, which in the end might not yield significantly better temps. Though, barring a major installation error on my part, I seriously doubt they could get any worse.

I sometimes find myself wishing it would just die. With the HD4890 now in end-of-life status, it shouldn't take long for XFX's reserve of replacement 4890's to disappear. A lack of 4890's means it would become possible to get upgraded to a replacement 5830/5850. If this were to happen, I wouldn't accept anything less than a 5830 as a replacement, and I'd fight like crazy with them for a 5850.

My brother bought a pair of BFG 6600GT's when they came out. In the years since, they've been replaced under warranty twice - first time with a pair of 7600GS's, and the 2nd time with a pair of 8600GT's. BFG really stood by their product. He's had the 8600GT's for over 2 years now, all the time hoping they'll die again so he can get another free upgrade. LOL
March 23, 2010 2:29:14 PM

Shanemarines - 2 x GTS 250 should be way more powerful than a GTX 260... not sure how that's working, but maybe your SLI wasn't working properly with the GTS 250s.
March 24, 2010 12:45:53 AM

RazberyBandit said:
Sorry for the hi-jacking, King Game.


Damn... Now I not only have OC envy, but cooling envy, too. And yeah, I know mine's hot. If I set the fan to auto, I'm lucky if it ever spins up to 50% under load, and at idle it sits still. My card's reference cooler doesn't have the usual "squirrel-cage" fan. Instead, it has a 7-bladed fan of the same size. I dunno if that really makes a difference, though.

I've been tempted to take it apart and apply my own thermal grease to see if that helps at all. I've got AS Ceramique, ZEROTherm ZT-100, and Cooler Master Thermal Fusion at my disposal... I suppose it would make for a nice comparison of the three. LOL I've also considered installing an aftermarket cooler, such as the Thermalright T-Rad2 GTX w/ its accompanying VRM2. XFX's warranty terms permit me to do this and the TIM swap, but it's quite a hassle and expense to go through, which in the end might not yield significantly better temps. Though, barring a major installation error on my part, I seriously doubt they could get any worse.

I sometimes find myself wishing it would just die. With the HD4890 now in end-of-life status, it shouldn't take long for XFX's reserve of replacement 4890's to disappear. A lack of 4890's means it would become possible to get upgraded to a replacement 5830/5850. If this were to happen, I wouldn't accept anything less than a 5830 as a replacement, and I'd fight like crazy with them for a 5850.

My brother bought a pair of BFG 6600GT's when they came out. In the years since, they've been replaced under warranty twice - first time with a pair of 7600GS's, and the 2nd time with a pair of 8600GT's. BFG really stood by their product. He's had the 8600GT's for over 2 years now, all the time hoping they'll die again so he can get another free upgrade. LOL


Wow, your brother is one lucky guy! As for temps, as long as it's stable and nothing is overheating, I think it's not worth the investment and time. I've replaced my 4890's TIM with Arctic Silver 5 and I actually ended up increasing temps by 1 or 2C. Who would have thought the stock TIM was that good already? In your case, I would try replacing the TIM on your card. The process itself is pretty simple, and if you take your time you have nothing to lose. There are tons of videos and tutorials if you're ever thinking about trying it out.
March 24, 2010 1:38:58 AM

I've replaced GPU coolers with aftermarket ones before, so I'm confident I can do it. I already have my ArctiClean, TIMs, rubber gloves, isopropyl alcohol, coffee filters, finger cots, and cotton swabs ready to go from the last build ;) 

The only problem I have with replacing the TIM lies with the "pads" for the memory chips and VRM. Temps within GPU-Z show they are indeed the parts that are getting the hottest. (VRM gets over 100C almost constantly under load, and the memory can hit 90C unless I crank up the fan.) I realize that if the core is kept cooler, the entire heatsink will be cooler, thus lessening all my temps. However, I'm curious if there's another means to get the memory chips and VRM to make contact with the cooler other than reusing the same pre-applied pads...

Any thoughts?
March 24, 2010 2:58:46 AM

Not sure, actually. I just keep the same pads, but I noticed my memory goes up to 90C+ when it didn't go that high before. I even tried to clean the pads; too bad we can't directly connect them to the heatsink.
March 24, 2010 4:06:47 AM

flyinfinni said:
Shanemarines - 2 x GTS 250 should be way more powerful than a GTX 260... not sure how that's working, but maybe your SLI wasn't working properly with the GTS 250s.


I only got a small jump in the Crysis Bin32 GPU benchmark:

When running the 1 GTS 250: Average frames were 23.43

When running 2 GTS 250's SLI: Average frames were 31.25

Crysis Spec's:

DX10, 1920 x 1080, AA-Off, VeryHighSpecs.

When running 1 GTX 260 1GB: Average frames were 44.63

Sorry if this seems weird to some, but SLI didn't help that much because running two cards made one PCI Express x16 slot downclock to x8 instead of the regular x16. I was very disappointed!

I took the GTS 250s back to Best Buy and bought one PNY GTX 260, and I just love my GTX 260!
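For what it's worth, the scaling those averages work out to (a rough check using only the numbers above):

    # SLI scaling implied by the quoted Crysis benchmark averages
    single_fps, sli_fps = 23.43, 31.25
    print(round(sli_fps / single_fps, 2))   # ~1.33x, i.e. only about a 33% gain from the second card
    # well short of ideal SLI scaling, so something was clearly holding it back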
March 24, 2010 4:13:13 AM

They're pretty much the same, but the 965 has a 1x higher multiplier, or 200MHz more, which can easily be achieved by upping the multiplier on the 955 since it's a Black Edition processor.
March 24, 2010 4:15:07 AM

Mudit Sathe said:
They're pretty much the same, but the 965 has a 1x higher multiplier, or 200MHz more, which can easily be achieved by upping the multiplier on the 955 since it's a Black Edition processor.


Agreed!
March 24, 2010 5:50:22 AM

Mudit Sathe said:
They're pretty much the same, but the 965 has a 1x higher multiplier, or 200MHz more, which can easily be achieved by upping the multiplier on the 955 since it's a Black Edition processor.

That's been said many times at the beginning of the thread, but the 965 C3 is known to OC higher than the 955 C3.
March 24, 2010 5:58:33 AM

SHANEMARINES said:
I only got a small jump in the Crysis Bin32 GPU benchmark:

When running the 1 GTS 250: Average frames were 23.43

When running 2 GTS 250's SLI: Average frames were 31.25

Crysis Spec's:

DX10, 1920 x 1080, AA-Off, VeryHighSpecs.

When running 1 GTX 260 1GB: Average frames were 44.63

Sorry if this seems weird to some, but SLI didn't help that much because running two cards made one PCI Express x16 slot downclock to x8 instead of the regular x16. I was very disappointed!

I took the GTS 250s back to Best Buy and bought one PNY GTX 260, and I just love my GTX 260!

This just means your motherboard doesn't operate at 16x/16x, but rather 16x/8x; likewise, my motherboard has two 16x slots and two 8x slots. Even so, 8x doesn't lose much performance, if any.

Tom's did an article testing 16x/16x and 8x/8x on an i5-750 and two 5850s, and their testing showed a 5-7% decrease in performance between 16x crossfire and 8x crossfire. I've tried putting my 4890 in one of my 8x slots and I was still getting the same FPS as at 16x, though benchmark scores were about 1-5% lower.

Something must have been wrong with your SLI setup, because those FPS look kind of funny. Unless those 250s were only 512MB, but even that wouldn't matter with AA/AF off.
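And just to put that 5-7% figure in fps terms (a rough sketch; the 60fps baseline is an assumption, not a measurement):

    # what a 5-7% x8-vs-x16 penalty looks like, assuming ~60fps at x16
    fps_at_x16 = 60.0
    for penalty in (0.05, 0.07):
        print(round(fps_at_x16 * (1 - penalty), 1))   # 57.0 then 55.8 fps, only a few frames lower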
March 24, 2010 3:15:06 PM

kokin said:
This just means your motherboard doesn't operate at 16x/16x, but rather 16x/8x; likewise, my motherboard has two 16x slots and two 8x slots. Even so, 8x doesn't lose much performance, if any.

Tom's did an article testing 16x/16x and 8x/8x on an i5-750 and two 5850s, and their testing showed a 5-7% decrease in performance between 16x crossfire and 8x crossfire. I've tried putting my 4890 in one of my 8x slots and I was still getting the same FPS as at 16x, though benchmark scores were about 1-5% lower.

Something must have been wrong with your SLI setup, because those FPS look kind of funny. Unless those 250s were only 512MB, but even that wouldn't matter with AA/AF off.


First thing I bought was:

http://www.bestbuy.com/site/XFX+-+ATI+Radeon+HD+5770+1G...

It didn't perform as well as I would have liked. I really did like, though, that ATI actually spent the time putting in HDMI so I could hook it up to my 50-inch plasma and have sound! It EVEN HAD DISPLAYPORT; it's about time someone follows APPLE a little.

Second thing I bought:

http://www.bestbuy.com/site/PNY+-+NVIDIA+GeForce+XLR8+G...

I'm not sure what happened. I enabled SLI and the SLI indicator and selected the game in the NVIDIA Control Panel. The game was Bad Company 2, and I was running it worse than my friend with his:

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

It was crap I didn't understand, so I took the GTS 250s back to Best Buy again!

Third thing I bought was:

http://www.bestbuy.com/site/PNY+-+NVIDIA+GeForce+XLR8+G...

I love this card, that's all I'm going to say, and when Nvidia finally gets past the 295 series I'm jumping on it! This card performs well at high resolutions with details maxed out, and works great running as a single card!

If I somehow didn't hook the SLI up right, I'm sorry for misleading you all, but I was positive it was showing an increase; just nothing that made you say WOW I'M SO HAPPY I SPENT THAT EXTRA 150-200 DOLLARS! lol...

March 24, 2010 4:32:45 PM

Shane, the important thing is you're happy with what you've got now. Though, for added performance in Crysis, you might wanna try unlocking or overclocking that CPU, as Crysis is still quite CPU-intensive.
March 24, 2010 5:14:58 PM

Razbery is right. As long as you're happy, then it's okay. It just seems weird to me that a single GPU is able to outperform SLI GPUs just one model below. Don't forget to turn on the eye candy with AA/AF! :D 
March 24, 2010 5:33:34 PM

kokin said:
Used to be 955+4890 just 6 months ago. -sigh- The times go by too fast sometimes. Does Metro 2033 really use up that much power? For a game to make a 4870 xfire setup stutter, just seems impossible.


RazberyBandit said:
He's got a mixed CrossFire setup - 4890 + 4870. It should actually outperform a dual-4870 setup slightly. Either way, I too find it hard to believe he's got problems running any game... But, Metro 2033 is new so the driver package / Crossfire profile may need some tuning. As a single 4890 owner, and with their availability dwindling, I've wondered about adding a 4870 for CrossFire myself. I'm just reluctant, as I'd rather have two 4890's.

I view PhysX as somewhat of a gimmick. There really aren't that many titles that support it. Those that do will only show major performance gains if there's a dedicated PhysX card, as running the physics and the game's normal graphics simultaneously on a single GPU can cause performance hits.

Xbit-Labs did a nice comparison between the eVGA GTX-275 Co-Op and the HD5850. The 275 Co-Op is built similar to a GTX-295, but the 2nd GPU is actually a GTS-250 G92b GPU that only runs PhysX. If the game doesn't support PhysX, then the 2nd GPU does nothing. The GTX-275 GPU within it is underclocked some, too, so its performance isn't the same as a standard one. Anyway, here's the link:
http://www.xbitlabs.com/articles/video/display/evga-gtx...


It could have something to do with me only using 2GB of RAM, but on XP I don't see it as an issue.

You guys really need to check out this game. It's the first game I have found that beats Crysis's extreme textures (minus character skin, like faces and hands).

Another poster on this thread also mentioned having to lower settings to 'high' on a 4870x2.

This game really eats up the GPU. Look up some screen shots if you haven't. The lighting and texture depth is simply amazing. I hope more games continue to release that push higher end cards to the limit.

Oh yeah, and my setup is only slightly faster than 4870 1GB crossfire. My fps doesn't get any better with a higher-clocked 4890, but its performance per clock is slightly better than a 4870's. I keep both cards at 800/1000.

Tbh I'm dying to get a 2nd 4890 so I can take these cards to the 1GHz club. I just hate how damn hot these 4890s run! It out-temps my overclocked 4870 by 8C even at idle. Ugh. Not to mention the fan is always spinning twice as fast as my 4870's to keep up.
March 24, 2010 6:37:14 PM

It's somewhat relieving to see I'm not the only one struggling with higher 4890 temps and high fan speeds to compensate for them... LOL
March 24, 2010 7:11:14 PM

Yeah, mine runs at 60C IDLE... I don't even want to think how hot my case is going to get when I replace my 4870. Luckily I have some massive airflow. (HAF 922)
March 24, 2010 8:19:01 PM

RazberyBandit said:
Shane, the important thing is you're happy with what you've got now. Though, for added performance in Crysis, you might wanna try unlocking or overclocking that CPU, as Crysis is still quite CPU-intensive.


True, I mean my CPU is only an AMD Athlon II X2 @ 3.0GHz. I'm just saving up for the:

AMD Phenom II X4 965 Black Edition Deneb 3.4GHz 4 x 512KB L2 Cache 6MB L3 Cache Socket AM3 140W Quad-Core Processor

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

!