mac981

Honorable
May 14, 2012
An i3-2100 would bottleneck a GTX 660 Ti in several games, especially in its maximum frame rates, even in games that are only somewhat CPU-bound rather than heavily CPU-bound. The 660 Ti has the GTX 670's GPU. Do you really expect the i3 to cope perfectly, geekapproved?
 
You have a CPU bottlenecking a video card and a CPU bottlenecking a game confused, son.

Cores don't bottleneck video cards. Clock speed and architecture bottleneck video cards.

Which is precisely why a 3.1GHz i3 throws higher frame rates than a 4.5GHz FX-4100.

http://www.tomshardware.com/reviews/gaming-fx-pentium-apu-benchmark,3120-9.html

The only reason the i5 throws higher frames than the i3 is because it's 300MHz faster.

There is no video card made today that would be held back by a Sandy Bridge i3.
 
I'm not confusing anything. At about 4.5GHz, an FX CPU is almost as fast per core as a roughly 3GHz Sandy Bridge CPU. Older tests such as that one confirm it. Newer tests show that some games' newer patches are better at using several cores/threads efficiently, so more parallel CPUs improve in those games relative to an older patch. New patches for SC2 and WoW, among other games, let them more effectively utilize CPUs with four or more cores, and that is why FX wins more often in newer tests. Stop posting tests from almost a year ago. They are outdated and irrelevant.
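To put rough numbers on that per-core comparison, think of per-core speed as clock frequency times IPC. The IPC figures below are my own illustrative assumptions picked to match the claim, not measured values:

# Rough per-core performance model: work per second = clock * IPC.
# The relative_ipc values are illustrative assumptions, chosen so a
# ~4.5GHz FX core lands near a ~3GHz Sandy Bridge core, as claimed.
sandy_bridge = {"clock_ghz": 3.0, "relative_ipc": 1.5}
fx = {"clock_ghz": 4.5, "relative_ipc": 1.0}

def per_core_perf(cpu):
    # Relative single-core throughput: clock frequency times IPC.
    return cpu["clock_ghz"] * cpu["relative_ipc"]

print(per_core_perf(sandy_bridge))  # 4.5
print(per_core_perf(fx))            # 4.5 -> roughly on par per core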
 
They AREN'T current games. Those tests were old and used outdated versions of the games. How could they possibly be relevant? I don't choose my CPU company by looking at Athlon 64/FX versus P4/PD benchmarks today, so why should I accept outdated performance benchmarks as still true? That doesn't make any sense at all. People often try to prove me wrong with outdated benchmarks, but it never works, because the benchmarks are simply outdated. I will never understand why so many people like to use outdated benchmarks. They aren't relevant because they don't show how the hardware they tested performs in current situations, even in the same tests, simply run on up-to-date versions.

I guess we all know that an i3 is better than an i5 with a slightly lower clock frequency, because a slightly faster-per-core CPU is better than a higher-end but slightly slower-per-core quad-core CPU. We all know that Phenom II X6s are worse than Phenom II X4s for gaming these days.

None of that is true, because most modern games are better threaded than games used to be, including many older games that are continually updated (especially Blizzard games). Tom's own up-to-date tests show this.

I'm not trying to be rude, and maybe I failed at that, but this is ridiculous. Outdated benchmarks are outdated. They aren't relevant in this context and they probably never will be again.

Heck, even in older tests, Tom's showed that an i3 can be a bottleneck in many games, and not just because of core count either. Performance per core is important too, and it always has been for gaming. It always will be, so long as CPU technology is organized into cores and similar structures. It's not always the be-all and end-all, but it is always a factor, and that is unavoidable, just as clock frequency has always been a factor in the performance of anything that has one, and will be so long as we use synchronous electronics.
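If it helps, here's a minimal sketch of what I mean by a bottleneck. The frame times are made up; the only point is that whichever side of a frame is slower caps the frame rate:

# Toy frame-pacing model: each frame needs CPU work and GPU work.
# The slower of the two sets the frame rate (the "bottleneck").
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical numbers: a GPU that renders a frame in 4 ms paired
# with a CPU that needs 10 ms of game logic per frame is CPU-bound
# at 100 FPS, and a faster GPU changes nothing.
print(fps(cpu_ms_per_frame=10.0, gpu_ms_per_frame=4.0))  # 100.0
print(fps(cpu_ms_per_frame=10.0, gpu_ms_per_frame=2.0))  # still 100.0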

I don't need to read up on any of this. These are unavoidable aspects of computing with current technology. They haven't changed since we first made synchronous electronic CPUs, and they won't change so long as we use them, perhaps even longer.

You can argue nonsense all you want, but like I said, this is ridiculous and I'm done arguing over it. If you want to, then go ahead and keep saying that nearly year-old benchmarks of games that have had substantial patches since then are still relevant. This is like saying that benchmarks of AMD's Radeon 7000 cards with the old December driver are still relevant today for gauging graphics card performance, despite that obviously not being true in the least, granted several people have tried to tell me otherwise.
 
http://www.tomshardware.com/reviews/world-of-warcraft-cataclysm-directx-11-performance,2793-7.html

Oh yes, telling the truth is so silly. The DX11 patch for WoW is one such example, although it's not the latest. WoW originally could only use one core well, then two, then three, and with this patch, four, although very fast cores could negate scaling if the graphics couldn't keep up. For example, before this patch, WoW could scale substantially upwards in performance with up to three AMD Phenom II cores, but Intel only needed two cores at a similar clock frequency for similar performance. WoW's low system requirements hide how fairly well-threaded its design is at this point, unless you get a very powerful graphics setup and let the FPS fly, maybe like maxing out a 3D display that can show 120Hz per eye (four times the performance required to max out a 2D 60Hz display).
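To put hypothetical numbers on that kind of core scaling, here's an Amdahl's-law-style sketch. The parallel fractions are guesses just to illustrate how a threading patch moves the CPU ceiling, not measurements from WoW:

# Amdahl's-law-style scaling: speedup = 1 / ((1 - p) + p / n),
# where p is the fraction of CPU work that threads across n cores.
# The p values are hypothetical before-patch / after-patch guesses.
def speedup(p, cores):
    return 1.0 / ((1.0 - p) + p / cores)

for p, label in [(0.5, "before patch"), (0.9, "after patch")]:
    print(label, [round(speedup(p, n), 2) for n in (1, 2, 3, 4)])
# before patch [1.0, 1.33, 1.5, 1.6]   -> extra cores barely help
# after patch  [1.0, 1.82, 2.5, 3.08]  -> quads pull well ahead

# And the display arithmetic from above:
# 120 Hz per eye * 2 eyes = 240 frames/s, versus 60 frames/s for a
# 2D 60Hz display, i.e. 240 / 60 = 4x the rendering throughput.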

Heck, Skyrim also got a CPU patch so that it could utilize newer instructions, and this made it far less CPU-limited than before, so this isn't even the only example of performance patches. A lot of games get them.
 

metal orient

Distinguished
Mar 17, 2011


While I'm with you that patches can make a game engine more efficient and so less processor-bound, that doesn't answer whether or not an i3 will bottleneck a GTX 660 Ti.

Saying that people use outdated benchmarks, when the only example you give is a two-year-old article that doesn't even cover Sandy Bridge CPUs, suggests you do too.
 
An i3 can and will bottleneck a GTX 660 Ti in some games. It doesn't have the performance to keep up with the GK104 even when that chip is missing one memory controller and its related hardware. Also, the old article that I used didn't need to be new, because the point I was using it for was that some games do get patches that change their performance characteristics. If I were using it to prove how the CPUs would perform today, then it would be outdated, but for what I used it for, it was not. It was used as an example, not a benchmark of how the CPUs would perform in those games today.