
If I have a really weak CPU

January 10, 2009 6:19:03 PM

and get like 1,500-2,000 on the 3DMark06 CPU score, what graphics card is so good that even with such a bottleneck I'll get above 10k overall in 3DMark06? Because my overclocked 9800 GT only gets me to around 9,000. By the way, I only have a dual core at 2.4 GHz.


January 10, 2009 6:41:03 PM

Your graphics card pushes out more frames per second at lower resolutions. That can cause a "CPU bottleneck". One way to alleviate it with a weaker CPU is to play at higher resolutions. That way, the graphics card pushes out fewer fps and the CPU can keep up.
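
If it helps to picture it, here's a rough toy model (Python, with invented per-frame millisecond figures, nothing measured from a real CPU or card): whichever stage is slower per frame sets the frame rate, and raising the resolution mostly adds to the GPU's share of the work.

Code:

# Toy model of why resolution moves the limiting factor; all numbers are invented.
def fps(cpu_ms, gpu_ms):
    """The slower of the two per-frame stages sets the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 12.0            # hypothetical CPU time to prepare one frame (~83 fps cap)
GPU_MS_PER_MPIXEL = 7.0  # hypothetical GPU render time per megapixel

for width, height in [(1024, 768), (1280, 1024), (1600, 1200), (1920, 1200)]:
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height / 1e6)
    limiter = "CPU" if CPU_MS > gpu_ms else "GPU"
    print(f"{width}x{height}: {fps(CPU_MS, gpu_ms):5.1f} fps ({limiter} limited)")

With those made-up numbers, the frame rate is pinned at the CPU's ~83 fps at the lower resolutions no matter how fast the card is; only at the higher resolutions does the GPU become the limiter.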

Unfortunately, there are no quick and easy charts matching particular graphics cards to particular CPU's. What processor do you have and what resolution are you gaming at?

If your performance in games is okay, then don't worry about 3DMark scores, because they're partly based on two CPU tests. So, a weaker CPU will affect the final score.

When I had an Athlon X2 4600+ with my 3870x2, I'd only get around 9,000 in 3DMark06. When I upgraded to a triple core 8750, the score bumped up to around 12,500 or so. When I get a Phenom 920 in a couple of weeks, it will be higher.

Yet, it won't mean a thing. 3DMark06 is not a perfect benchmark. The frame rates you get in games at your monitor's native resolution, with settings that suit you, are what really matter, not the CPU-limited 3DMark06 score.
January 11, 2009 12:38:54 AM

Quote:
Where are your links though?

Where is your technical link about GPU-CPU communication?


I'm not sure what you mean here. If you mean a discussion of CPU/GPU balance then The Inquirer has a test from last May:

http://www.theinquirer.net/inquirer/news/459/1042459/le...

http://www.theinquirer.net/inquirer/news/757/1019757/cp...

I see on an older Tom's Hardware thread you had problems with the concept, so I'll quote mactronix, who had a good explanation:

Quote:

This thing about resolution helping to cause bottlenecks with a CPU is a fact; I spent quite a while proving how it works on this forum in the past. It is much less likely to happen with a modern CPU; the older single core processors, when teamed with a faster modern GPU, will suffer from it more. The CPU can only calculate physics so fast. Now, if the card is running at say 1680 x 1050 and is working a bit to do so, then lowering the res means it has less to do, so it will run faster. The assumption is that the CPU is close to flat out keeping up in the first place, so if the card is asking for frames faster than the CPU can do the calculations, then you get a restriction on the FPS because the GPU is waiting for the CPU. That's not exactly what is happening technically, but that's the basic idea. I really don't see why people have a problem understanding it.


http://www.tomshardware.com/forum/253966-33-will-bottle...

There's no easy formula to use. No "Tom's Bottleneck Charts" to refer to. It can vary from game to game and resolution to resolution, but overall, weaker CPU's bottleneck systems with high end graphics cards at lower resolutions. Even a good CPU can be a bottleneck as more cards are added. Some games are reported as CPU bound (high AI etc) and other games are GPU bound. Even some very good Intel quads at stock were bottlenecks with 4870x2 cards at resolutions lower than 1920 x 1080. I'm sorry if I can't provide you with a more detailed technical discussion. Maybe one of the regulars who's actually in engineering can explain it better.
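
To put mactronix's "GPU waiting for the CPU" wording into something concrete, here is a minimal simulation sketch (timings invented; a cartoon of a two-stage pipeline, not how any real driver or engine actually works):

Code:

# Cartoon of a CPU -> GPU frame pipeline; all millisecond figures are invented.
def simulate(cpu_ms, gpu_ms, frames=1000):
    cpu_done = 0.0   # when the CPU finishes preparing the current frame
    gpu_free = 0.0   # when the GPU finishes rendering the previous frame
    gpu_idle = 0.0
    for _ in range(frames):
        cpu_done += cpu_ms                          # CPU prepares the next frame
        gpu_idle += max(0.0, cpu_done - gpu_free)   # GPU sat waiting this long
        gpu_free = max(cpu_done, gpu_free) + gpu_ms # GPU renders once the frame is ready
    return frames * 1000.0 / gpu_free, gpu_idle / gpu_free

# Hypothetical weak CPU (15 ms per frame) paired with a fast card:
for label, gpu_ms in [("low res ", 6.0), ("high res", 18.0)]:
    fps, idle = simulate(15.0, gpu_ms)
    print(f"{label}: {fps:5.1f} fps, GPU idle {idle:.0%} of the time")

With the made-up numbers, dropping the resolution leaves the card idle well over half the time and the frame rate stuck at what the CPU can prepare, which is exactly the situation being described above.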




January 11, 2009 6:31:58 AM

That's what I have, an X2 4600+, and I try playing games at 1600x1200.
January 11, 2009 6:43:14 AM

You can't get a card that will help in this case; the problem is in the way 3DMark06 works, it's very CPU reliant. People complained about this when it first came out: the CPU matters to it a lot more than the GPU does. This kind of thing is why using something like 3DMark06 for general benchmarking of a system is a bad idea. It's OK for establishing a baseline to compare any improvements against, say after a GPU or CPU upgrade.
Personally I think actual use is a better indication of how a PC is performing: play some games, and as long as it's giving you a good experience, it's doing its job fine. Who cares if someone else has a thousand more 3DMark points than you do :) 

Mactronix (yes that one) :D  (cheers yips) :hello: 
January 11, 2009 7:57:51 AM

Do you think I should install Vista x64, since I only have the 32-bit version, and will I see an improvement in games?
January 11, 2009 7:56:19 PM

Pershing121 said:
Do you think I should install Vista x64, since I only have the 32-bit version, and will I see an improvement in games?


That I'm not sure about. Driver support for Vista-64 has improved, but I haven't tried it yet.

Depending on the game and the resolution, more memory on the GPU helps, but more system memory available will help mostly by avoiding having to use the hard drive's swap file space as frequently.


January 11, 2009 9:42:09 PM

Quote:
Well, I will again disagree with everyone and say that is a CPU limit, not a bottleneck. I said it in another thread just now, but what you describe is a CPU limitation and IMO an ideal situation.

Basically there are no links to back up what you say. IMO you are using the wrong definition of bottleneck; you refer to it in relation to one component instead of a system.

Just my opinion, but it will not change, and I will try to provide people with the best real world advice I can.



+1, that's the way I see it as well
January 12, 2009 2:20:50 AM

Well, it's going to depend on what games you want to play at 1600x1200. Some games are more CPU bound than others, after all. As everyone has already stated, 3DMark isn't a perfect benchmark, but it's a good baseline. If you're worried that your CPU is limiting your performance, you may want to overclock it. Of course, your ability to do that is going to depend greatly on your motherboard and a bit on your particular CPU sample.
January 12, 2009 6:18:54 AM

@ strangestranger

While I'm happy for us to agree to disagree on this, and I also agree that those who don't properly understand the issue misuse the term "bottleneck", I would like to point out that I'm not talking about a restriction, which I see as something that could be rectified by a small overclock. I'm talking about a physical queue of requests that is causing a performance issue.
By the way, as you are insisting that posts need links to be valid, where are yours supporting what you are saying?

Mactronix
January 12, 2009 3:31:45 PM

Quote:
Very well,

I will just link to the CPU charts, good enough for the purpose.

Supreme Commander

No huge difference between procs, and for that game it won't make a big deal as long as it's smooth.



1. There is a significant frame rate difference between the Athlon X2 7750 @ 21.88 and the i7 965E @ 35.20. Those are average frame rates, and I'm quite sure the minimum frame rates on the X2 7750 would hamper playability, even if the average was playable.

2. The highest clocked Windsor X2, at 3.2, still only gets 26.67. Though it has a lower IPC than the Phenom based X2 7750, its clocks make up for it somewhat, but not all that much. It's only a measly 5 fps at most.

3. Supreme Commander uses more than 2 cores, so not only do higher IPC processors do better, but triples and quads do better than duals from the same company. Thus, an Intel quad clocked similarly to an Intel duo would do better in Supreme Commander and an AMD quad would do better than a triple, which would do better than a dual.

This example proves the point that overclocking a weak CPU only helps somewhat. The really weak CPU's that people still game on are not on 3rd quarter CPU charts anyways. Though the next version will add new processors like the Phenom II 940, they won't go back and add all the Windsors or Allendales.

It's better to balance a CPU with a GPU, but there is no hard and fast rule. My experience is that a 3870x2 was too powerful for a Windsor core X2 at 2.4, but a Phenom 8750 is fine. My gut feeling is that a 4870 1 gig is too powerful for an 8750, but a Phenom II 940 would be fine.

You argue that a limitation is fine, and you dismiss links showing that GPU's are hindered from doing their job without the right CPU at the right resolution (The Inquirer wanted to test out Nvidia's claim that CPU's don't matter). The evidence is clear that the same GPU gets higher framerates with more CPU power.

At any rate, I hope we can agree that 3DMark06 isn't much of a benchmark compared to actual games. Perhaps you just think that 21 fps average is no significant difference compared to 35? Heck, though they don't have the 8750, the 8650 gets almost 10 fps difference over the similarly clocked Kuma dual core. 10 fps difference is a big deal when even the best setups get under 36 fps.
January 12, 2009 3:43:39 PM

@ strangestranger

How does linking to Tom's charts help at all? All you are showing is that a better CPU will give more frames. We all know that. What you seem to be saying is that there is no such thing as a "bottleneck", just degrees of restriction.
Well, the thing is, the industry refers to these restrictions as bottlenecks; look up some reviews, all the hardware site pros use it.
I myself am well aware of the difference, and as I said before it's not something that happens very much these days, so it's a bit moot anyway.
Mactronix
January 12, 2009 8:08:48 PM

Yes, and I don't disagree with you, but again, as I have said more than once on this thread and the other similar thread: I'M NOT TALKING ABOUT A SMALL RESTRICTION, I'M TALKING ABOUT A PHYSICAL PROCESS QUEUE THAT ENDS UP CAUSING A PERFORMANCE ISSUE.
Sorry to shout, I know it's rude, but your last couple of answers have just been you describing a restriction.
Instead of repeating yourself, why don't you tell me what your definitions of the two are and how they are different?

Mactronix
January 13, 2009 1:13:42 AM

To me, you could use the term bottleneck, as in, it bottlenecked before jamming. That makes perfect sense for the term. It's constrained, more than just restricted. To me, this makes more sense.
January 13, 2009 1:43:54 AM

yipsl said:
1. There is a significant frame rate difference between the Athlon X2 7750 @ 21.88 and the i7 965E @ 35.20. Those are average frame rates, and I'm quite sure the minimum frame rates on the X2 7750 would hamper playability, even if the average was playable.

2. The highest clocked Windsor X2, at 3.2, still only gets 26.67. Though it has a lower IPC than the Phenom based X2 7750, its clocks make up for it somewhat, but not all that much. It's only a measly 5 fps at most.

3. Supreme Commander uses more than 2 cores, so not only do higher IPC processors do better, but triples and quads do better than duals from the same company. Thus, an Intel quad clocked similarly to an Intel duo would do better in Supreme Commander and an AMD quad would do better than a triple, which would do better than a dual.

This example proves the point that overclocking a weak CPU only helps somewhat. The really weak CPU's that people still game on are not on 3rd quarter CPU charts anyways. Though the next version will add new processors like the Phenom II 940, they won't go back and add all the Windsors or Allendales.

It's better to balance a CPU with a GPU, but there is no hard and fast rule. My experience is that a 3870x2 was too powerful for a Windsor core X2 at 2.4, but a Phenom 8750 is fine. My gut feeling is that a 4870 1 gig is too powerful for an 8750, but a Phenom II 940 would be fine.

You argue that a limitation is fine, and you dismiss links showing that GPU's are hindered from doing their job without the right CPU at the right resolution (The Inquirer wanted to test out Nvidia's claim that CPU's don't matter). The evidence is clear that the same GPU gets higher framerates with more CPU power.

At any rate, I hope we can agree that 3DMark06 isn't much of a benchmark compared to actual games. Perhaps you just think that 21 fps average is no significant difference compared to 35? Heck, though they don't have the 8750, the 8650 gets almost 10 fps difference over the similarly clocked Kuma dual core. 10 fps difference is a big deal when even the best setups get under 36 fps.


Why don't most developers develop games with the GPU in mind and GPU powered, instead of so many of them relying on people having fast CPUs to get good fps? I mean, does Intel pay them money or something?
January 13, 2009 5:35:20 AM

From what I can gather, more games are GPU limited than CPU limited. CPU limited games would involve massive AI hits for example. GPU limited games would involve draw distances and the number of polygons a model has affecting framerate.

CPUs work with all the components on the motherboard. No component stands alone. Just as an FSB allows for slower memory access than placing the memory controller on the CPU, the ability of a CPU to process data and send it to the GPU affects the frames per second.

Higher clocks and higher IPC, as well as more cores in many games (or in multitasking while gaming), directly affect the GPU's ability to push those pixels on screen. Most modern processors do not have classic bottlenecks, but there are limitations to how many frames per second a GPU can display at any resolution. Those limitations are partly due to the quality of the GPU (i.e. clocks, memory bandwidth and the amount on the card). They are also due to the quality of the CPU.
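
Purely as an illustration of that last point (an Amdahl's-law-style toy with invented per-frame times, not measurements from any game or CPU discussed here), extra cores only shrink the part of the CPU's per-frame work that actually threads, and past a certain point the GPU sets the pace anyway:

Code:

# Invented numbers: how core count changes the CPU's per-frame time,
# and where the GPU takes over as the limit.
SERIAL_MS = 6.0     # hypothetical per-frame work that stays single-threaded
PARALLEL_MS = 12.0  # hypothetical per-frame work that splits across cores
GPU_MS = 10.0       # hypothetical render time per frame on a fixed card

for cores in (1, 2, 3, 4):
    cpu_ms = SERIAL_MS + PARALLEL_MS / cores
    frame_ms = max(cpu_ms, GPU_MS)   # the slower stage sets the pace
    limiter = "CPU" if cpu_ms > GPU_MS else "GPU"
    print(f"{cores} core(s): CPU {cpu_ms:4.1f} ms/frame -> {1000 / frame_ms:5.1f} fps ({limiter} limited)")

Which is roughly the shape of the dual vs. triple vs. quad argument above: the first extra cores show up on screen, then the card becomes the limit.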

I understand what stranger is trying to say, but I disagree with him that bottlenecks don't exist just because most people misunderstand the term. This isn't just a matter of how person A vs. person B perceives a game as playable; there are technical limitations.

As for the CPU vs. GPU argument, both Intel and Nvidia debated that in their tiff over Larrabee and GPGPU this past year. The Inquirer, as snarky and rumor filled as it often is, did a few tests to determine if either side was right. I linked to those tests in an earlier post.
January 13, 2009 6:01:08 AM

But still, they should be releasing games for the average consumer, not only for those who have expensive CPUs from self-built computers or heavy gaming lineups. I mean, most people have like a dual core 2-something GHz AMD or Intel, not quads. Save those games for like 1-2 years from now.
January 13, 2009 6:06:38 AM

Quote:
Haha, I do not even think we are disagreeing. You are arguing about when a CPU bottleneck can be said to be happening; I am arguing that when most people say it is, it actually isn't, it is just an irrelevant limitation.

If I am incorrect, correct me, but seriously, I have been going round in circles on here and elsewhere for years, why should I stop now.



You know what, I think you are right :lol:  I myself have been trying to explain to people that it's fine to buy a GPU even if, shock horror, you may not actually see all of the FPS it's capable of providing at peak performance.

Mactronix :) 
January 13, 2009 7:37:47 AM

I mean, look at Blizzard and the Diablo/Starcraft series: all of them are almost the same technology no matter how many years apart, except for gameplay. Why can't other PC developers rely more on gameplay than graphics and processing power, so more people could enjoy their games without sacrificing?
January 13, 2009 10:29:44 AM

Ideally, games need to scale well. Some games do, others do not. Scaling allows people to play with decent-looking low settings, but also "future proofs" the game so newer graphics cards do better in framerate and detail.

Take Oblivion as an example. One of the devs working on the magic system had a Radeon 9800 XT in his work PC. When I first played the game, I had a Radeon 9800 Pro on a P4 AGP system. Then, I upgraded that to a Radeon X1600 Pro. Then to a 7600GS on an Athlon X2 PCIe system. Now I have a 3870x2 on a Phenom X3 system that I'll replace with a 4850 or 4870.

With each GPU and CPU upgrade Oblivion improved. I went from medium to high to ultra high settings, from no AA and AF to max settings. Many people don't play "old" games, but I'll even play games from 1995 in DOSBox. So, when I go back and play Oblivion or The Witcher every year or so, it eventually maxes out.

MMOs scale well too. LOTRO can run on a 690G's integrated graphics. It looks better on a 780G's integrated graphics, better still on a 3650, on a 3870, on a 4650, a 4830, a 4870... get my point?

Developers don't want to leave anyone out. They have to code for a minimum level of graphics, but they also code for the best graphics cards available at development, often with some room to spare.
January 13, 2009 11:30:39 AM

Pershing121 said:
but still they should be releasing games for the average consumer, not only those who have expensive cpu's form self built computers or those of heavy gmaing lineups. I mean most people have like a dual core 2 something gigs amd or intel not quads. Save those games for like 1-2 yrs form now.


That's why games have video settings options. The best hardware has access to the best graphics. Sub-par hardware simply needs to use lower settings, instead of forcing "Very High" and complaining about 22 FPS.

Seriously, if everybody waited two years to jump on the bandwagon (note, quads have been out two years already, and the i7 supports 8 threads at once), you would have a situation where Crysis would be as graphically detailed as BF2142. Granted, having an optimized engine to allow a greater range of hardware to run games would be great, but with few exceptions, a $150 4850 runs most everything out there at or near max, and a $200 Q6600/E8700 runs everything without issue.

As for RAM, that's simply a Vista issue; I run everything maxed with 2GB on XP SP3 with no issues whatsoever.
January 13, 2009 3:44:27 PM

Where's the Crysis for CPUs? They could make one. It wouldn't affect Intel or AMD the way the real Crysis has affected the graphics community. They could stuff AI, physics etc. into a game that'd easily choke any CPU made, but they know the CPU makers are much more limited in their growth potential than GPUs are. I think it'd change people's attitudes about quads vs. duals. Maybe that's exactly what the CPU competitors need to spur innovation and get those multi-core CPUs selling. Software usually leads hardware in gaming, and if they want to fully expand multithreading in games, maybe it's something they need to at least look at.
January 13, 2009 4:18:34 PM

Crikey, can you imagine that, a game properly multithreaded to take advantage of all 8 threads available on an i7 :bounce:  :bounce:  :bounce:  So that's two threads each for a set of quad GPUs with a Lucid Hydra GPU load balancer optimising the flow :ouch:  :ouch:  .
Ohh, I'm getting all excited, have to go now :love:  :pt1cable:  :love: 

Mactronix
January 13, 2009 4:51:47 PM

gamerk316 said:
That's why games have video settings options. The best hardware has access to the best graphics. Sub-par hardware simply needs to use lower settings, instead of forcing "Very High" and complaining about 22 FPS.

Seriously, if everybody waited two years to jump on the bandwagon (note, quads have been out two years already, and the i7 supports 8 threads at once), you would have a situation where Crysis would be as graphically detailed as BF2142. Granted, having an optimized engine to allow a greater range of hardware to run games would be great, but with few exceptions, a $150 4850 runs most everything out there at or near max, and a $200 Q6600/E8700 runs everything without issue.

As for RAM, that's simply a Vista issue; I run everything maxed with 2GB on XP SP3 with no issues whatsoever.


That's BS. Most motherboards don't support a $200 Q6600/E8700; even new store-bought computers mostly come with dual core AM2 motherboards. I'm not talking about those who built their own, because that's a small percentage of PC home users, just like the people who use custom firmware on their PSPs. And Crysis scales horribly.
January 13, 2009 4:55:32 PM

I mean, it (Crysis) looks like total crap on low settings, like worse than the original Far Cry.
January 13, 2009 4:57:52 PM

JAYDEEJOHN said:
Where's the Crysis for CPUs? They could make one. It wouldn't affect Intel or AMD the way the real Crysis has affected the graphics community. They could stuff AI, physics etc. into a game that'd easily choke any CPU made, but they know the CPU makers are much more limited in their growth potential than GPUs are. I think it'd change people's attitudes about quads vs. duals. Maybe that's exactly what the CPU competitors need to spur innovation and get those multi-core CPUs selling. Software usually leads hardware in gaming, and if they want to fully expand multithreading in games, maybe it's something they need to at least look at.


Most companies selling prebuilt computers (which is what the majority of the public buy) don't allow people to upgrade those computers without replacing their motherboards and possibly other components and voiding their warranty, and most use the bare minimum, make people pay a lot for dual cores, and won't even include quads.
January 13, 2009 5:18:04 PM

OK, my point isn't what HP will do, but what AMD and Intel will do. If they phase out duals, what's HP to do? Or Dell? It's the direction they're heading, and it makes a certain amount of sense to do something like this. All it'll take is for one, and only one, supplier to cave and allow for a better BIOS for support, which can be done anyway on a lot of OEM prebuilts. The power envelope is the main concern, and would somewhat limit what you're talking about, but that's about all. In this economy, there are incentives for this activity, even if it doesn't mean buying a whole new rig.
January 13, 2009 5:34:27 PM

They can't phase out duals until the support is there for the quads. Same as XP: they can't get rid of it because there is no support for DX10 and higher. Once it's all supported, there won't be any reason not to get a quad. ATM a decent dual is faster than a quad, but the OEMs put quads in a PC, charge another $200 for it, and Joe Public thinks they are getting something special.

Mactronix
January 13, 2009 5:55:21 PM

That's true, but looking at history, M$ released two OSes almost back to back before we got XP. Now we have Vista and W7, and W7 is set up for MT, as well as the newer DX releases. It'll end this year; next year will be the end of duals, and MT is on its way, with a huge kick-off this year. Look at Valve for instance, they're cutting edge right now.
January 13, 2009 6:18:42 PM

JAYDEEJOHN said:
OK, my point isn't what HP will do, but what AMD and Intel will do. If they phase out duals, what's HP to do? Or Dell? It's the direction they're heading, and it makes a certain amount of sense to do something like this. All it'll take is for one, and only one, supplier to cave and allow for a better BIOS for support, which can be done anyway on a lot of OEM prebuilts. The power envelope is the main concern, and would somewhat limit what you're talking about, but that's about all. In this economy, there are incentives for this activity, even if it doesn't mean buying a whole new rig.


Duuuude, they already phased out single cores I believe, yet many computers are still single core (store-bought of course, like the cheaper models). Just because they phase it out doesn't mean they won't keep using it and people won't keep buying them.
January 13, 2009 6:21:23 PM

How is Valve cutting edge when they won't allow me to change my video settings to DX10 (or Shader Model 4) in the video settings of Team Fortress 2?
January 13, 2009 7:53:32 PM

JAYDEEJOHN said:
That's true, but looking at history, M$ released two OSes almost back to back before we got XP. Now we have Vista and W7, and W7 is set up for MT, as well as the newer DX releases. It'll end this year; next year will be the end of duals, and MT is on its way, with a huge kick-off this year. Look at Valve for instance, they're cutting edge right now.


It's all well and good having an OS that supports MT, but it's the devs that need to support it.
I really don't think there will be any movement until there are at least a couple of must-have games that run noticeably better on more than 2 cores and/or with DX11.
I mean, come on, if you believe what the companies are saying, everyone and his dog is pirating stuff, and those that are not are encoding home movies, just the kind of work quads are made to excel at. If that were the case, then quads would have taken over by now.
It's games that matter, and until the performance is noticeably better than XP and a C2D, that's what the majority of systems will run.

Mactronix