Will CPUs be bottlenecks in future games?

June 2, 2008 4:17:23 PM

I was reading this http://www.anandtech.com/showdoc.aspx?i=3320&p=8 and was surprised to see performance increases at 3.4 GHz that were actually USEFUL in this game. The new GPU cards coming out will surely stomp these cards, and if games get even more CPU demanding, will we see some bottlenecks?
June 2, 2008 4:23:33 PM

JAYDEEJOHN said:
I was reading this http://www.anandtech.com/showdoc.aspx?i=3320&p=8 and was surprised to see performance increases at 3.4 GHz that were actually USEFUL in this game. The new GPU cards coming out will surely stomp these cards, and if games get even more CPU demanding, will we see some bottlenecks?



I do not think so.
June 2, 2008 4:32:21 PM

If AC sees usable gains, and the new cards are coming out soon, and we get more demanding games, it looks like we may need Nehalem quickly.
June 2, 2008 4:46:52 PM

When the new cards do come out, and we finally see Crysis being played at decent fps, I think we may see some real bottlenecking without some serious OCing. That's today. What about this fall, when the new games come out?
June 2, 2008 4:56:28 PM

JAYDEEJOHN said:
If AC sees usable gains, and the new cards are coming out soon, and we get more demanding games, it looks like we may need Nehalem quickly.



There will always be new and improved technologies. Of course you will see improvements, but that does not mean today's CPUs (like a Core 2 Quad over 3 GHz) will become so slow that you won't be able to play nicely a year from now. You can play well with those CPUs for sure. No doubt Nehalem with a 4870 or GTX 280 will beat today's high-end setup, but that is not the point. In short, you will still be able to play games nicely a year from now.
June 2, 2008 5:01:53 PM

Did you read the article? 60 fps is always what a gamer shoots for, and it can't be had on those current CPUs at stock. And we aren't talking Crysis here. I think this is a developing situation, one we will see more and more.
June 2, 2008 5:09:12 PM

JAYDEEJOHN said:
Did you read the article? 60 fps is always what a gamer shoots for, and it can't be had on those current CPUs at stock. And we aren't talking Crysis here. I think this is a developing situation, one we will see more and more.



I did not say stock, did I? Well... my point is not to argue with you.
June 2, 2008 5:25:32 PM

I didn't mean to argue, sorry if I did or came across that way. These are average framerates, showing that with an OC to 3.4 you still get 8% better fps at a playable resolution, using soon-to-be-old architecture. The next games out will undoubtedly have greater demands, pushing those cards. If the CPUs can't bring average fps to 60 in this game, let alone minimum fps, then what will we see from the greater demands put on the CPUs? I think it's a worthy question, and a reasonable one as well. Currently we see the old K8s being a bottleneck in some situations. I have a feeling the "old" C2D will be doing this as well.
June 2, 2008 5:33:57 PM

I guess what I'm saying is, the C2D is starting to show its age, and it's time for something new. C'mon Nehalem.
June 2, 2008 5:37:59 PM

To answer the subject, I would say the current trend will continue, where you have some games that require a lot of CPU and some that ask for lots of GPU. Besides, no one can give you a for-sure answer without lying.
June 2, 2008 5:48:36 PM

True, there's not one answer, as you put it. But there is a trend starting to show up. I'm somewhat forward-looking, but I believe this is going to show up sooner than most people think.
June 2, 2008 8:00:22 PM

I would seriously question any premise derived from that article, especially noting that CPU utilization on the Q6600 never rose above 50% (and that's without even arguing about a console game ported to a PC). And then there's this ....

Quote:
Reading between the lines, it seems clear that NVIDIA and Ubisoft reached some sort of agreement where DirectX 10.1 support was pulled with the patch. ATI obviously can't come out and rip on Ubisoft for this decision, because they need to maintain their business relationship. We on the other hand have no such qualms. Money might not have changed hands directly, but as part of NVIDIA's "The Way It's Meant to Be Played" program, it's a safe bet that NVIDIA wasn't happy about seeing DirectX 10.1 support in the game -- particularly when that support caused ATI's hardware to significantly outperform NVIDIA's hardware in certain situations.

Ubisoft needs to show that they are not being pressured into removing DX 10.1 support by NVIDIA, and frankly the only way they can do that is to put the support back in a future patch. It was there once, and it worked well as far as we could determine; bring it back.


And as far as this ....

Quote:
Yes.

See also http://www.tomshardware.com/review [...] ,1939.html


LOL. That's funny.

The basis for your argument is a single-core 3400+ (Venice) @ 2.2 GHz, an ASRock s939 ULi 1695 chipset, three different vendors' RAM sticks with different timings (run at 166 MHz x2), a 120GB ATA-100 8MB-cache IDE hard drive, with AGP versions of PCIe video cards, with DX10 and Vista Ultimate???

A system so unstable ... it could not be overclocked!
June 2, 2008 8:32:18 PM

I was actually referring to this: "If you're looking to get performance above 60 FPS, it's obvious that the first step is going to be purchasing the fastest CPU you can find. AC definitely supports dual-core processors, and even quad-core may be beneficial in certain situations. However, quad-core CPU usage often stays below 50% of the total CPU potential. Thus, an overclocked dual-core processor appears to be the best choice for maximizing AC performance.

During initial testing, we were a bit surprised to find that SLI didn't seem to improve performance. As this is a "The Way It's Meant to Be Played" game, that would have been another serious blow to NVIDIA's credibility. At the time, we were testing with a stock-clocked Q6600 at 1680x1050 and various graphics settings in order to utilize anti-aliasing. It was only when we began overclocking that we discovered the real culprit was CPU performance. There's CPU overhead associated with CrossFire and SLI, so with slower CPUs at moderate resolutions SLI and CrossFire will actually reduce performance in AC.

With an appropriately fast CPU -- at least 3.0 GHz would be our recommendation -- and running at 2560x1600, SLI and CrossFire are able to show substantial performance benefits. SLI and CrossFire both improve performance by 56%, but we still appear to be at least somewhat CPU limited. Increasing the CPU clock speed to 3.42 GHz on the X38 system (the maximum stable result for this particular system) further improves CrossFire scaling to 62%."
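
To make the CPU-limit point in that quote concrete, here is a minimal sketch with purely hypothetical numbers (the 30 fps single-card rate and the 48 fps CPU ceiling are made up for illustration, not taken from the article): if the CPU can only prepare frames so fast, a second GPU can never push the game past that ceiling, so measured scaling stays well under 2x.

// Hypothetical numbers only, illustrating why SLI/CrossFire scaling stalls
// when the CPU can't feed frames fast enough.
#include <algorithm>
#include <cstdio>

int main()
{
    double singleGpuFps = 30.0;  // assumed single-card rate at 2560x1600
    double cpuCapFps    = 48.0;  // assumed rate at which the CPU can prepare frames

    double idealSliFps = singleGpuFps * 2.0;               // perfect 2x scaling
    double actualFps   = std::min(idealSliFps, cpuCapFps); // clamped by the CPU
    double gainPercent = (actualFps / singleGpuFps - 1.0) * 100.0;

    std::printf("SLI gain limited to about %.0f%% instead of 100%%\n", gainPercent);
    return 0;
}

With these made-up numbers the gain comes out to roughly 60%, which is the same kind of partial scaling the article reports; overclocking the CPU raises the ceiling and the scaling improves.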

June 2, 2008 8:35:09 PM

The 50% was referring to the fact that it's not a quad-optimized game.
June 2, 2008 8:56:47 PM

Actually, it's really becoming the opposite. The processing is one way. The CPU does its thing and then sends the data to the GPU, so the GPU can do its thing. Bottlenecks are software/hardware impediments that limit the flow of data. If the PCIe bus and GPU no longer throttle the CPU, then the bottlenecks are being removed.

So yeah, without a bottleneck, the CPU side of the equation becomes the limiting factor. However, given that we only need so many FPS (say 40) before you cannot see any difference, I wonder what game writers will do to throttle the games.

When I wrote a small DirectX program, it ran so freaking fast (1000 or so FPS) that it tested my computer's cooling systems. So, I throttled it by adding a Sleep() function call.

If 40 FPS is all we need in games, pumping out more just wastes power and produces heat. If the GPU is no longer a limiting factor, game writers will need to throttle the games on the CPU side.
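
Here is a minimal sketch of the kind of throttling described above, assuming a Windows build (Sleep and GetTickCount from windows.h) and a hypothetical renderFrame() standing in for the DirectX draw call; it is an illustration of the idea, not the original program:

#include <windows.h>

// Hypothetical stand-in for the DirectX rendering work.
void renderFrame() { /* draw the scene */ }

int main()
{
    const DWORD targetFps     = 40;               // the "enough" figure used above
    const DWORD frameBudgetMs = 1000 / targetFps; // 25 ms per frame

    for (;;)
    {
        DWORD frameStart = GetTickCount();
        renderFrame();

        // Sleep away whatever is left of this frame's time budget,
        // so the loop doesn't spin at 1000+ FPS and just burn power.
        DWORD elapsed = GetTickCount() - frameStart;
        if (elapsed < frameBudgetMs)
            Sleep(frameBudgetMs - elapsed);
    }
    return 0;
}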
June 2, 2008 9:04:37 PM

Ask any FPS gamer if 40 fps is enough. When you cross the screen as fast as you can and the GPU keeps pace, then the CPU starts to bottleneck and the fps drops to 30, you're dead.
June 2, 2008 9:11:56 PM

On a single card, on any platform, a Core 2 CPU at 3 GHz will not bottleneck gaming performance. But with multi-GPU the CPU will have some serious work to do, as the two GPUs demand more data to process.

And another thing to consider is that AI and physics are done by the CPU, so if there is more physics in the game then the CPU will be more important than it would be in another game.

Crysis is a good example, as proved by its own benchmark and other games: different CPU speeds on the same setup will make a big difference in the FPS numbers.

And don't we all love the graphics inside that game!!! lol
June 2, 2008 9:19:50 PM

The G280 is rumored to be some 30 to 40% faster than that X2 card. What then?
June 2, 2008 9:22:53 PM

But Nehalem is 45% faster per clock than the Penny.

You are comparing the next gen, so I bring you the next gen in CPUs also.

How about that! haha
June 2, 2008 9:26:52 PM

iluvgillgill said:
But Nehalem is 45% faster per clock than the Penny.

You are comparing the next gen, so I bring you the next gen in CPUs also.

How about that! haha


here is what you want to read.

http://www.tomshardware.co.uk/forum/248332-10-intel-neh...

http://www.vr-zone.com/articles/Biostar_Launched_P45_Bo...

http://www.vr-zone.com/articles/GeForce_GTX_280_%26_GTX...

Even though the GTX 280 isn't officially proven by well-known sources, neither is the HD 4xxx, but the performance increase is looking like about 50% for both companies' offerings compared to the current gen.
June 2, 2008 9:30:46 PM

And it'll be out in 2 weeks. When will Nehalem, or a faster C2D, be here? In the meantime, by the time Nehalem does get here, there'll be a refresh, making these G280s even faster. And it appears games are getting more and more demanding.
June 2, 2008 9:36:16 PM

Well, as you said before in another thread, the 55nm parts are what will take Nvidia's offerings to another level.

And everything in the IT world moves on so quickly (it's even quicker than Playboy picking prey), so you can't really wait and get it.

So you either get it and live with it, or just sit back for the rest of your life and wait.............UNTIL INFINITY!!!! lol
June 2, 2008 9:40:26 PM

What would be nice, since unfortunately AMD can't push Intel with their CPUs, is if GPUs would do it and move things along a little faster. Maybe this is what it'll take to get Intel moving faster, though they really haven't missed with their tick-tock schedule.
June 2, 2008 9:43:42 PM

You looking forward to Larrabee? But that's a looooooooooooooooooooooooooooong wait.
June 2, 2008 11:34:22 PM

That's the same link I gave you, jay ><
June 2, 2008 11:42:40 PM

Look at the link numbers, they aren't even the same. Your links go to GPU abilities; mine, if you'd even bothered to check it out, shows a Nehalem benchmark.
June 2, 2008 11:49:10 PM

Bah, I have an E6750 at stock and I play AC with everything maxed at 1680x1050. That game definitely doesn't need 60 fps to be enjoyable. I don't have any other games that are unplayable because of my "slow" CPU.
June 2, 2008 11:50:45 PM

Don't buy Crysis
June 3, 2008 1:52:25 AM

Well, that link is in my thread if you had actually gone and looked. And it's in the same section as well, if you had looked around the forum.

And download Crysis!