Quad CPUs not optimized on ATI... yet?

November 29, 2008 2:20:49 PM

According to this http://www.behardware.com/news/10005/dx10-radeon-geforc...
ATI's current drivers don't take advantage of quad CPUs yet. That may explain what we've witnessed recently with the BB2 drivers from nVidia, and the recent benchmarks showing lower performance with ATI vs nVidia on the new i7 chips.

"According to Intel, 25 to 40% of the CPU load in a game is linked to Direct3D and the graphics driver. The threading of this load is not therefore insignificant and it would be more than useful for AMD to look at the question as soon as, to at least get on a comparable level with NVIDIA. We are now planning to update our Core i7 test, which we carried out with a Radeon HD 4870 X2 at first, this time using a GeForce GTX 280."

So this may explain a few of the driver performance increases we've seen lately, and we can look forward to future ones as well.
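
To picture what "threading this load" means, here is a rough, purely illustrative C++ sketch. It is not ATI's or NVIDIA's actual driver code, and the Command and DriverWorker names are made up for the example; it just shows the general producer/consumer pattern a multi-threaded driver can use: the game thread only queues draw commands, and a worker thread on another core does the expensive validation/translation that a single-threaded driver would otherwise do on the same core as the game.

Code:
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

// Stand-in for a recorded draw call coming from the game thread.
struct Command {
    int id;
};

// Hypothetical helper, not a real driver class: it owns one worker thread
// that drains a queue of commands submitted by the game/render thread.
class DriverWorker {
public:
    DriverWorker() : done_(false), worker_([this] { Run(); }) {}

    ~DriverWorker() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_one();
        worker_.join();   // worker drains whatever is left, then exits
    }

    // Called on the game thread: cheap, just enqueue and return.
    void Submit(Command cmd) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(cmd);
        }
        cv_.notify_one();
    }

private:
    // Runs on a second core: the expensive per-draw-call work that a
    // single-threaded driver would do on the game thread instead.
    void Run() {
        for (;;) {
            Command cmd;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return done_ || !queue_.empty(); });
                if (queue_.empty()) return;   // shutting down, nothing left
                cmd = queue_.front();
                queue_.pop();
            }
            // Pretend this is the D3D/driver share of per-frame CPU work.
            std::cout << "worker translated command " << cmd.id << "\n";
        }
    }

    std::mutex mutex_;
    std::condition_variable cv_;
    std::queue<Command> queue_;
    bool done_;
    std::thread worker_;
};

int main() {
    DriverWorker driver;
    for (int frame = 0; frame < 3; ++frame)
        driver.Submit(Command{frame});   // game thread returns immediately
}   // destructor waits for the worker to finish the queued commands

A real driver is far more involved, but the principle is the same: the 25-40% of CPU time the article attributes to Direct3D and the driver can be spread over spare cores instead of competing with the game on one.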


November 29, 2008 4:41:49 PM

That would explain why 2 4870 X2s and 3 GTX 280s perform pretty close until you slap an i7 in, and then the 3 GTX 280s destroy the 2 4870 X2s. Damn, I was toying with the idea of 3 GTX 280s for some amazing theoretical performance! I suppose there wouldn't be that much of a difference if this is true and gets fixed.
November 29, 2008 4:49:01 PM

I wonder, then: could this spell driver/GPU performance increases for all quad-core users, even those outside of the i7 chips?
November 29, 2008 5:09:52 PM

Probably - the main issue is threading, not specific i7 optimization.
Anonymous
November 29, 2008 5:39:42 PM

I so hope so... then all the quad-naysayers and the DUAL FOR LIFE!!! people (not too many of them) can't continue to recommend dual cores when quad cores are becoming so cheap.

I especially hope it improves single-card performance... but I doubt it.
November 29, 2008 6:48:58 PM

^^Well, the test was made with single cards, so it really should...

interesting article, nice find :) 
November 29, 2008 7:09:02 PM

And it appears that, whether it's some particular code or the actual way the i7s do their threading, that's what we may be seeing in the multi-card setup performance numbers. I've been wondering about this for a while, and so far no one's done a lot of benching on it.
November 29, 2008 8:23:19 PM

Quote:
I so hope so... then all the quad-naysayers and the DUAL FOR LIFE!!! people (not too many of them) can't continue to recommend dual cores when quad cores are becoming so cheap.

I especially hope it improves single-card performance... but I doubt it.


Sorry pal, I won't stop recommending dual cores until the quad cores can compete with the E5200 @ $60-80! I mean come on, a CPU that costs half the price of the cheapest decent quad core and can easily overclock to heights that let it outperform that quad in most applications! Though if you have the money, then go ahead! Quad cores are future-proof, but an E5200 is just plain amazing for the price/performance. Also, I can kill every quad core, and perform on par with i7, with my much cheaper E8600 and a good overclock! Come on, give me one thing that is more fun than a CPU stable @ 5.0 GHz on air (barely, I will admit)!
November 29, 2008 8:27:38 PM

The_Blood_Raven said:
Come on, give me one thing that is more fun than a CPU stable @ 5.0 GHz on air (barely, I will admit)!


Ummm. Here's one.

http://www.ferrariusa.com/home.php

:pt1cable: 
November 29, 2008 8:35:53 PM

Since we've pretty much reached the pinnacle of clock speeds, or rather, each improvement we see becomes smaller and smaller, the big improvements will come from IPC (though less and less there as well) and multi-core/multi-threading.

It's like when people went from black-and-white TVs to color TVs. A lot of stations didn't broadcast in color yet, the color sets were poor at true color, mixing the color into the image often gave fuzzy results, and color accuracy wasn't that close, whereas B&W sets were cheap and the picture quality was superb.

We've seen this before, when single cores were king and the newer duals were clocked lower, didn't perform as well, and cost much more. It's up to the individual how often they upgrade and what they use their rigs for. For gaming only, since it's still so single-threaded at this time, having a dual still holds its worth.
November 29, 2008 9:05:13 PM

I absolutely love the idea of quads; it's just that at the moment they offer no gaming benefits over duals, they cost more, they draw more power, they are less stable, and they are harder to overclock (and don't clock as high).
November 29, 2008 9:34:12 PM

Because they understand how it works, maybe? ATI will release a new driver set and it will be back to how it was 2 weeks ago, lol.
November 29, 2008 9:34:15 PM

Possibly they already have with Big Bang2?
November 29, 2008 9:36:27 PM

They should have used older nVidia drivers to see if this is when it was implemented. And possibly this means it works on a game-by-game basis, depending on what each game allows?
November 29, 2008 9:38:57 PM

The new drivers are sweet. I didn't gain in 3DMark, but in games like UT3, Fallout 3, and Crysis it is noticeably better. I am running a dual core so it isn't as relevant for me, but it is their best driver release in a while.
November 29, 2008 9:46:52 PM

Is this in DX10 only? What I'm getting at is, if it's better on DX10, then that supports what's being said or claimed in the article, or at least shows better results vs DX9. It may not be exclusive to quads; who knows?
November 29, 2008 9:57:50 PM

The 180.48 drivers were released the same day as the i7 was. Initially, I don't think there was an XP driver release, so I am inclined to agree with you on the DX10 point. All their high-end cards benefited from it, so from what I gather it was intended for overall card performance rather than being focused specifically on the i7.
Anonymous
November 29, 2008 10:13:48 PM

Blood Raven has a point :na:

But I mean... for the people who can spend a little bit more than $60... hehe