AMD CPU speculation... and expert conjecture - Page 90
Tags:
-
AMD
-
CPUs
Last response: in CPUs
hcl123
June 12, 2013 6:42:36 PM
And even this 32nm shows it could have more lol
A test of the A10 6700 "Richland" 65W
~40% better perf/power efficiency shown vs. a similarly clocked 5800K
results showing the possibility of above-20% gains for graphics stuff (very much including games; more if the memory controller is used with higher-speed DRAM... I think...)
http://translate.googleusercontent.com/translate_c?act=...
UPDATE:
Also wonder: if an FM2+ (edit, lol) platform is used along with higher-clocked DRAM (which Richland might support better), then the 31% (grain of salt) better graphics/games performance of Richland can materialize
http://wccftech.com/asus-shows-a88xm-pro-fm2-socket-mot...
-
Reply to hcl123
hcl123 said:
8350rocks said:
Yuka said:
palladin9479 said:
Cazalan said:
The compiler issue has been beaten like a dead horse. If you want better benchmarks then stop talking about Intel/AMD and put your foot to the websites doing the reviews.
Promote GCC test suites like this one.
http://www.phoronix-test-suite.com/
There's actually a very easy way to check to see if the ICC was used. Don't search for Compiler strings or other test information, it's trivial to change that. Instead search the binary for the specific machine language that indicates the Intel dispatcher (the part that cheats) is present.
http://www.softpedia.com/get/Programming/Patchers/Intel...
That will search the executable (I'm working to see if I can get it to work with libraries) for the exact instructions used. What the dispatcher essentially does is a sequence of string checks with the result being a binary 1 (GenuineIntel found) or 0 (non-Intel found). If the result is 1, the dispatcher checks the CPUID flags and sets the appropriate code path; if 0, it forces the lowest code path compiled in. The way around that is to change the function to always return 1, which is as simple as changing a few values in a hex editor. Then the dispatcher will always check the flags and choose the proper code path.
I've been checking various programs and I see the Intel dispatcher code in many of them. When I get home I'll post more info on what I found. Suffice to say that "Maxon Cinebench 11.5" reports it's built with MSVC yet contains the ICC dispatcher instructions.
I'll ask friends to bench games with that program. If we get improvements, we should spread this like fire in a dried jungle.
I've always wondered why no one created this program before and went viral. Uhm...
Cheers!
Agreed...imagine if it did go viral...curious to see someone run cinebench 11.5 with this now...just to see what the results would be.
Don't get overexcited. What the ICC dispatcher does in CB11.5 is use AVX on Intel and SSE2 on others (AMD)... IIRC AVX performance on the "BD FlexFPU" is not a big improvement. The interesting part would be to use SSE4.*; then the improvement could easily reach double digits in percentage on FlexFPU... but of course for that you'd have to recompile lol
But then we can say it will not be fair for intel to compare AVX against SSE4... not oranges to oranges... (EDIT)
Depends on the version of ICC used. Older versions were i386 vs SSE2/3. Newer versions depend on whether it's x86 or x64 and what libraries were present. Agner did a big breakdown of the "new" ICC and what code gets dispatched where; his conclusion was that even with the "fixed" ICC there was a ton of code discrimination going on. The best you can currently hope for is SSE2 on non-Intel CPUs, and if they were using ICC 10 (2008) then it's i386 vs SSE4.2.
The differences are in small functions that tend to do lots of brute force lifting, so anywhere from 5~30% (mostly around 10%) improvements are possible. Value performance crowns have been given away for less.
-
Reply to palladin9479
On more Hasfail news:
http://techreport.com/news/24950/intel-removes-modest-f...
Looks like they'll lose the "cheap platform" place they won in TH's Build Marathons. Wonder what gamers on a tight budget will say.
Cheers!
-
Reply to Yuka
What irks me about Haswell is the price rise for the Xeon E3-1230 series SKU
SNB: http://ark.intel.com/products/52271/
TRAY: $215.00
BOX: $230.00
IVB: http://ark.intel.com/products/65732
TRAY: $215.00
BOX: $230.00
Haswell: http://ark.intel.com/products/75054/Intel-Xeon-Processo...
TRAY: $240.00
BOX: $250.00
Right now the Haswell version sells for $270 whilst the IVB version sells for $235 on Newegg.
This was my favourite CPU series because you got HyperThreading like a Core i7 but paid a little more than a Core i5.
(But lost the use of the IGP for a lower TDP)
Intel figured it out.
At $270, I might as well buy a Core i7 for QuickSync.
-
Reply to amdfangirl
@hcl123: i got the idea of hd4k@128 shader alus and haswell gt2@160 shader alus et al from techreport's gt3e review, compared with trinity and kabini specs and thought that those could be comparable.
thanks for the extra insight.
now i wait for analysis from english website. hopefully toms will publish more on haswell (and richland/kabini too...). those guys are strangely quiet. usually the oc/uc analyses, mobile comparos and intel/amd comparos come out after a week or two of the launch...
-
Reply to de5_Roy
noob2222 said:
The 360 is tri-core, the PS3 is single-core (two threads) plus 6 SPUs. That is why most PC games are using few cores. The next consoles will both be eight-core. Moreover, both use lower clocks. Therefore game developers are not going to use one or two cores and ignore the rest (as happens today with most PC games). The first demo of a PS4 game is already using six cores.
Both wrong. The 360 has a tri-core, PowerPC-based CPU capable of running two threads per core.
http://en.wikipedia.org/wiki/Xenon_(processor)
Quote:
Xenon is a CPU used in the Xbox 360 game console. The processor, internally codenamed "Waternoose", which was named after Henry J. Waternoose III in Monsters, Inc. by IBM[1] and XCPU by Microsoft, is based on IBM PowerPC instruction set architecture, consisting of three independent processor cores on a single die. These cores are slightly modified versions of the PPE in the Cell processor used on the PlayStation 3.[2][3] Each core has two symmetric hardware threads (SMT), for a total of six hardware threads available to games. Each individual core also includes 32 KiB of L1 instruction cache and 32 KiB of L1 data cache.
The PS3's Cell has 7 functional SPEs, one reserved for the OS, leaving 6 hardware threads available to games.
So consoles have supported 6 hardware threads since 2006, making your entire argument wrong.
The reason PCs don't use the same amount of CPU resources is because the underlying architecture is so significantly different. Different memory and threading subsystems, and the lower absolute performance of the consoles, force developers to use REALLY low-level code in some places, and a lot of the memory that games use is static; e.g., you know ahead of time where almost every variable is located in RAM, which allows for very fine-tuned performance. You can't do this on a PC, where the entire memory subsystem is virtualized. You also don't have as large an OS, so you get a lot less performance loss due to context switches locking up the CPU every time a kernel thread needs to run.
-
Reply to gamerk316
tuklap
June 13, 2013 7:34:06 AM
Yuka said:
On more Hasfail news: http://techreport.com/news/24950/intel-removes-modest-f...
Looks like they'll lose the "cheap platform" place they won in TH's Build Marathons. Wonder what gamers on a tight budget will say.
Cheers!
Can someone please explain what this means in the link in simple terms and if it matters that much.
"Paying the extra for a K-series product also means giving up support for one of Haswell's key features, the TSX extensions that enable transactional memory. Intel has stripped out the VT-d device virtualization and vPro management features in the K series, as well."
-
Reply to simon12
Yuka said:
On more Hasfail news: http://techreport.com/news/24950/intel-removes-modest-f...
Looks like they'll lose the "cheap platform" place they won in TH's Build Marathons. Wonder what gamers on a tight budget will say.
Cheers!
Gamers on a tight budget would likely get the Athlon X4 750K for $85. Native 3.4/4.0 and unlocked.
The fused off GPU adds area to the die to allow better cooling. Which theoretically should let it overclock past an FX-4350 much easier.
-
Reply to Cazalan
Cazalan said:
Yuka said:
On more Hasfail news: http://techreport.com/news/24950/intel-removes-modest-f...
Looks like they'll lose the "cheap platform" place they won in TH's Build Marathons. Wonder what gamers on a tight budget will say.
Cheers!
Gamers on a tight budget would likely get the Athlon X4 750K for $85. Native 3.4/4.0 and unlocked.
The fused off GPU adds area to the die to allow better cooling. Which theoretically should let it overclock past an FX-4350 much easier.
I got an MSI FM2A85XA-GD65 + 750K + ARES Blue 1600s for $225 equivalent, paired it with an MSI Twin Frozr GTX 650 Ti Boost 2GB card; it games like a champion.
-
Reply to sarinaide
hcl123
June 13, 2013 11:50:30 AM
A review of Richland 6800K
http://translate.googleusercontent.com/translate_c?act=...
As I posted, Richland overclocks quite well: 4.7 GHz stable (HSW was 4.4, no?) at 1.45 V (quite good; I wonder about an FM2+ platform)
http://translate.googleusercontent.com/translate_c?act=...
128W at load taxing the CPU at 4.7 GHz (yes, the marked or rated TDP never was the typical power, not now, not ever)... fantastic
http://translate.googleusercontent.com/translate_c?act=...
Game benchmarks also show a completely different picture from before; the lineup includes a Haswell 4670K
http://translate.googleusercontent.com/translate_c?act=...
EDIT: comparing with a Deneb 965 in CPU tests, it seems AMD has caught up with the old K10's performance with this tweak.
-
Reply to hcl123
juanrga
June 13, 2013 12:54:34 PM
palladin9479 said:
There's actually a very easy way to check to see if the ICC was used. Don't search for Compiler strings or other test information, it's trivial to change that. Instead search the binary for the specific machine language that indicates the Intel dispatcher (the part that cheats) is present.
http://www.softpedia.com/get/Programming/Patchers/Intel...
That will search the executable (I'm working to see if I can get it to work with libraries) for the exact instructions used. What the dispatcher essentially does is a sequence of string checks with the result being a binary 1 (GenuineIntel found) or 0 (non-Intel found). If the result is 1, the dispatcher checks the CPUID flags and sets the appropriate code path; if 0, it forces the lowest code path compiled in. The way around that is to change the function to always return 1, which is as simple as changing a few values in a hex editor. Then the dispatcher will always check the flags and choose the proper code path.
I've been checking various programs and I see the Intel dispatcher code in many of them. When I get home I'll post more info on what I found. Suffice to say that "Maxon Cinebench 11.5" reports it's built with MSVC yet contains the ICC dispatcher instructions.
I suspect something similar happens with x264 (Windows). It is not compiled with ICC but seems to check CPUID and run the optimal path for GenuineIntel.
Yuka said:
I'll ask friends to bench games with that program. If we get improvements, we should spread this like fire in a dried jungle.
I've always wondered why no one created this program before and went viral. Uhm...
Cheers!
I am interested. I could prepare graphics with the before/after scores, tweet about all this and use part of my server-space to upload the graphics. Although I will be very busy next week.
-
Reply to juanrga
Looking at the Linux benchmarks on Phoronix for the FX-8350, it still doesn't really beat Ivy Bridge. It doesn't seem much better than on Windows for single-threaded programs. ICC may be there to throw off AMD performance in a lot of synthetics, but the real-world software difference probably isn't that big.
-
Reply to esrever
juanrga
June 13, 2013 4:47:24 PM
esrever said:
Looking at the Linux benchmarks on Phoronix for the FX-8350, it still doesn't really beat Ivy Bridge. It doesn't seem much better than on Windows for single-threaded programs. ICC may be there to throw off AMD performance in a lot of synthetics, but the real-world software difference probably isn't that big.
On average, the FX-8350 was about 10% behind the i7-3770K in the old benchmarks, while being up to 42% faster than the i7 in some tests. Those benchmarks used a version of GCC (4.7) that lacked some optimizations beyond Bulldozer.
There is some recent benchmarking using version 4.8 of GCC where the FX-8350 (8 threads) is competitive with the i7-3960X (a 12-thread, $1000 chip)
http://www.phoronix.com/scan.php?page=article&item=llvm...
You can even see the cheap AMD chip beating the eXtreme chip in a few tests
http://openbenchmarking.org/embed.php?i=1305170-UT-LLVM...
http://openbenchmarking.org/embed.php?i=1305170-UT-LLVM...
http://openbenchmarking.org/embed.php?i=1305170-UT-LLVM...
I don't know if GCC 4.8 is already extracting all the performance of the FX chips, but seeing it beat a 5x more expensive chip shows, I think, that the FX is a much better chip than usually considered.
-
Reply to juanrga
Kulasko
June 13, 2013 4:53:46 PM
Kulasko said:
I just tested Cinebench 11.5 with and without the Intel Compiler Patcher... To my surprise, it made absolutely no difference in performance...
It won't. That patch is for older versions of ICC; the newer ones add a second check for the family of the CPU. If it doesn't recognize the family then it sets the CPU options variable to 080H, which is basically just SSE2. They talk about it more on those sites I linked, as does Agner. Mostly the ICC patcher will let you know that there is screwy code involved; it's hit and miss on being able to override it.
-
Reply to palladin9479
GOM3RPLY3R
June 13, 2013 6:48:34 PM
8350rocks said:
PS3 had 1080p HD (hence the HDMI output on the unit)... how on earth could you expect a console designed to support 4K resolution TVs would be in 720p?
Well let's see. Take Battlefield 3 on the PS3: the console version is forced to run at 720p, since that's its highest supported resolution! Yes, the system can support 1080p, but if BF3 ran at 1080p on a PS3 it would likely get no more than 20 frames if it's lucky, hence why games don't always run at the best resolution. On the other end of the spectrum, (hypothetically) if we had PC games that required 1080p minimum, but you loved a game and couldn't run it because your PC wasn't good enough and you didn't have the money for a better one, wouldn't you be just a little angry?
I'll also give you a PC example. My friend just showed me a new game that is just now hopping out of its alpha stage into beta, called Planetary Annihilation. By Steam's account, the absolute minimum requirement to run the game on low settings is a dual-core CPU and 4GB of total RAM. My old PC, which I happen to be using right now, has almost the minimum requirements (Core 2 Quad @ 2.5), along with an AMD 5570 that is overclocked to almost the max on stock voltage. I bet I'd be lucky to hold 30 frames on the lowest settings with that.
Into further speculation, the recommended is a quad-core CPU at 3.0 GHz and at least 8 GB of RAM. Computers sold within about the past year have really started to hit this mark of power, and some are surpassing it. However, for those people who wish to keep their old computers from more than a year ago, good luck running the game.
Keep in mind, this isn't even talking about resolutions.
This is all just to prove the point that many consoles are breaking into a very high-end level of power; HOWEVER, you always need to keep in mind the intensity of the game itself, not just your machine. I think it's great that the PS4 has an 8-core AMD processor and graphics that are about on par with a ~GTX 650. But all in all, those consoles are really for people who like to game but aren't about that total gamer/nerd life. If you are a true gamer, you need to invest in a PC for the sake of performance and overall quality.
This is also why I have the Xbox. Most Xbox users are little kids who can't afford crap and think they are all high and mighty with their system when they don't even know what a hard drive is. By the way, the Xbox is the most money-sucking thing I have ever seen. Why do you need to pay for their internet when you are already paying for your own? And with this Xbox One business, it's basically a death wish.
-
Reply to GOM3RPLY3R
GOM3RPLY3R
June 13, 2013 6:59:47 PM
juanrga said:
esrever said:
Looking at the Linux benchmarks on Phoronix for the FX-8350, it still doesn't really beat Ivy Bridge. It doesn't seem much better than on Windows for single-threaded programs. ICC may be there to throw off AMD performance in a lot of synthetics, but the real-world software difference probably isn't that big.
On average, the FX-8350 was about 10% behind the i7-3770K in the old benchmarks, while being up to 42% faster than the i7 in some tests. Those benchmarks used a version of GCC (4.7) that lacked some optimizations beyond Bulldozer.
There is some recent benchmarking using version 4.8 of GCC where the FX-8350 (8 threads) is competitive with the i7-3960X (a 12-thread, $1000 chip)
http://www.phoronix.com/scan.php?page=article&item=llvm...
You can even see the cheap AMD chip beating the eXtreme chip in a few tests
http://openbenchmarking.org/embed.php?i=1305170-UT-LLVM...
http://openbenchmarking.org/embed.php?i=1305170-UT-LLVM...
http://openbenchmarking.org/embed.php?i=1305170-UT-LLVM...
I don't know if GCC 4.8 is already extracting all the performance of the FX chips, but seeing it beat a 5x more expensive chip shows, I think, that the FX is a much better chip than usually considered.
Nice to see you again juanrga. I would like to point out that your argument is both valid and invalid. Did they point out the clock speeds, voltages, and heat? I think you know this already, but for an 8350 to come close to an i7-3770K at stock, at best, it would need some serious tuning (overclocking), which also results in more heat (burns computers). Yes, I know AMD can run hotter, but that's due to the fact that they need more raw material to make more cores, which have to be clocked higher so they can keep pace.
Food for thought: if I cut an FX-8350 in half (not literally, but on a core basis), ran only 4 cores, and underclocked it to 3.4 to compare it to an i5, how would it run? Well, I can tell you now, it will do worse than probably a Pentium 4, and will have less than half of its original performance. Yes, I think it would run much cooler, but it would be absolutely terrible. The main reason that Intel costs more is really that the cores are stronger (mostly from the materials, which happen to not be as heat resistant, but do win on the power of each core).
Now what if we did the vice versa and made the i5 an 8-core processor overclocked to 4.0? I would say that it's easily an Extreme-series CPU and may be the best on the market for performance.
It all comes down to the price. The main argument is between the (no offence) less wealthy people versus the more affluent people. If you have the money, Intel is the better choice (unless you're on a budget, or video editing, or doing things with more OpenCL, or just want to make your computer run very hot). Really, for gaming these days, if you have a Core 2 Quad Q8300 @ 2.5 (my old CPU), as long as you have something like a 680 or 7970 or better for a GPU, then you're fine. ^_^
-
Reply to GOM3RPLY3R
Quote:
Food for thought: if I cut an FX-8350 in half (not literally, but on a core basis), ran only 4 cores, and underclocked it to 3.4 to compare it to an i5, how would it run? Well, I can tell you now, it will do worse than probably a Pentium 4, and will have less than half of its original performance. Yes, I think it would run much cooler, but it would be absolutely terrible. The main reason that Intel costs more is really that the cores are stronger (mostly from the materials, which happen to not be as heat resistant, but do win on the power of each core).
This is logically incorrect.
Clock speed means absolutely squat. The only thing that matters is performance vs. cost, and for some folks, vs. energy usage. There already exists that "cut in half" 8350: it's the 4xxx series, and they're significantly cheaper than an i5. The FX-8350 is $199; its nearest competitor is the i5-3570K @ $214. The FX will win in any benchmark that actually uses its resources. If the application is heavily restricted to 1~2 threads then the i5 will be ahead. The best indication of this is the i3 scores. The i3 has exactly 50% of the CPU resources that the i5 has. If the i3 is scoring anywhere near the i5 (at the same clock speed) then the application's coding is what's limiting your system.
-
Reply to palladin9479
GOM3RPLY3R said:
8350rocks said:
PS3 had 1080p HD (hence the HDMI output on the unit)... how on earth could you expect a console designed to support 4K resolution TVs would be in 720p?
Well let's see. Take Battlefield 3 on the PS3: the console version is forced to run at 720p, since that's its highest supported resolution! Yes, the system can support 1080p, but if BF3 ran at 1080p on a PS3 it would likely get no more than 20 frames if it's lucky, hence why games don't always run at the best resolution. On the other end of the spectrum, (hypothetically) if we had PC games that required 1080p minimum, but you loved a game and couldn't run it because your PC wasn't good enough and you didn't have the money for a better one, wouldn't you be just a little angry?
I'll also give you a PC example. My friend just showed me a new game that is just now hopping out of its alpha stage into beta, called Planetary Annihilation. By Steam's account, the absolute minimum requirement to run the game on low settings is a dual-core CPU and 4GB of total RAM. My old PC, which I happen to be using right now, has almost the minimum requirements (Core 2 Quad @ 2.5), along with an AMD 5570 that is overclocked to almost the max on stock voltage. I bet I'd be lucky to hold 30 frames on the lowest settings with that.
Into further speculation, the recommended is a quad-core CPU at 3.0 GHz and at least 8 GB of RAM. Computers sold within about the past year have really started to hit this mark of power, and some are surpassing it. However, for those people who wish to keep their old computers from more than a year ago, good luck running the game.
Keep in mind, this isn't even talking about resolutions.
This is all just to prove the point that many consoles are breaking into a very high-end level of power; HOWEVER, you always need to keep in mind the intensity of the game itself, not just your machine. I think it's great that the PS4 has an 8-core AMD processor and graphics that are about on par with a ~GTX 650. But all in all, those consoles are really for people who like to game but aren't about that total gamer/nerd life. If you are a true gamer, you need to invest in a PC for the sake of performance and overall quality.
This is also why I have the Xbox. Most Xbox users are little kids who can't afford crap and think they are all high and mighty with their system when they don't even know what a hard drive is. By the way, the Xbox is the most money-sucking thing I have ever seen. Why do you need to pay for their internet when you are already paying for your own? And with this Xbox One business, it's basically a death wish.
How about trying this one on for size...
Your "good PC" with a 3570K can run 4 threads at once... if you went and bought the best compute GPU available to a hardcore gamer for gaming, the HD 7990, you would be able to run 8 threads.
You can run 8 threads... the CPU on the PS4 can run 8 threads by itself... now... that's not the key part though. The GPU on the PS4 can run 64 threads at once.
So let's do some math for a second so you understand what is being discussed about consoles:
PS4 = 72 threads at once (max, likely closer to 65-70 including middleware and OS)
Your "EXCELLENT GAMING PC" = 8 threads.
My question to you is: How do you propose to keep up with 60+ threads with your current hardware when maximum potential from the consoles is achieved?
The answer (whether you agree or not), is simple: You can't.
That's why consoles are generations ahead of current PC technology.
-
Reply to 8350rocks
Found this particularly interesting, and when it's ready I can't wait to test it myself. There is no doubt that software-level constraints are found in many synthetics.
Flyingsuicide.net
-
Reply to sarinaide
juanrga
June 14, 2013 3:09:56 AM
GOM3RPLY3R said:
juanrga said:
On average, the FX-8350 was about 10% behind the i7-3770K in the old benchmarks, while being up to 42% faster than the i7 in some tests. Those benchmarks used a version of GCC (4.7) that lacked some optimizations beyond Bulldozer.
Nice to see you again juanrga. I would like to point out that your argument is both valid and invalid. Did they point out the clock speeds, voltages, and heat? I think you know this already, but for an 8350 to come close to an i7-3770K at stock, at best, it would need some serious tuning (overclocking), which also results in more heat (burns computers).
Hi. The above figures are for the 8350 @ stock.
sarinaide said:
Found this particularly interesting and when it's ready can't wait to test myself. There is no doubt that software-level constraints are found in many synthetics.
Flyingsuicide.net
Very interesting link. Thanks!
-
Reply to juanrga
juanrga
June 14, 2013 3:17:15 AM
palladin9479 said:
FX8350 is $199, its nearest competitor is the i5-3570K @ $214. The FX will win in any benchmark that actually uses its resources. If the application is heavily restricted to 1~2 threads then the i5 will be ahead.
Right; only to add that the i5 is faster (will it continue to be after the FX fix reported above?) if you run one of those applications, wait, close it, run another... Most people run several applications at once, and then they find that their FX chip is faster than their i5/i7.
-
Reply to juanrga
8350rocks said:
How about trying this one on for size...
Your "good PC" with a 3570K can run 4 threads at once... if you went and bought the best compute GPU available to a hardcore gamer for gaming, the HD 7990, you would be able to run 8 threads.
You can run 8 threads... the CPU on the PS4 can run 8 threads by itself... now... that's not the key part though. The GPU on the PS4 can run 64 threads at once.
So let's do some math for a second so you understand what is being discussed about consoles:
PS4 = 72 threads at once (max, likely closer to 65-70 including middleware and OS)
Your "EXCELLENT GAMING PC" = 8 threads.
My question to you is: How do you propose to keep up with 60+ threads with your current hardware when maximum potential from the consoles is achieved?
The answer (whether you agree or not), is simple: You can't.
That's why consoles are generations ahead of current PC technology.
Uh, no. For one, even though there are a couple thousand threads running at a time on a PC system, only a handful are in a "ready to run" state at any given time. You don't NEED to run that many threads at once. Secondly, most tasks aren't time sensitive; if your UI is delayed by 50ns because you have to wait for the thread to get swapped in, guess what? You don't care.
Thirdly, most modern GPUs with some form of compute [everything since the 8000 series from NVIDIA] can offload work from the CPU in some fashion; the PS4 is hardly unique in that regard. Whether you see a performance increase is largely dependent on scale though; I wouldn't bother to offload anything unless it scales to AT LEAST 32 GPU compute units, due to how relatively weak a single GPU compute resource is compared to a single CPU core.
So no, if you installed a full version of Win 7 64 on it, the PS4 would have roughly half the performance of a medium-grade gaming PC.
-
Reply to gamerk316
juanrga said:
sarinaide said:
Found this particularly interesting and when it's ready can't wait to test myself. There is no doubt that software-level constraints are found in many synthetics.
Flyingsuicide.net
Very interesting link. Thanks!
friend of mine
-
Reply to sarinaide
Quote:
I suspect something similar happens with x264 (Windows). It is not compiled with ICC but seems to check CPUID and run the optimal path for GenuineIntel.
Glad you brought this one up:
http://www.behardware.com/articles/847-14/the-impact-of...
Some things to note:
1) x264 is only officially supported to be compiled via GCC, so you can't blame ICC for any performance bias.
2) Performance across all performance profiles flatlines unless assembly optimizations are enabled, which increase baseline performance by at least a factor of 3.
3) Performance profiles show almost zero change in performance, regardless of whether assembly optimizations are used or not. Even going from i686 to corei7-AVX on a 2600k shows virtually no change in performance.
So I'm not seeing any real bias out of x264 based on independent compiler testing. Enabling assembly optimizations improves baseline performance by about 3.7x for both SB and BD (SB has a higher baseline, however), and about 3x for PII. Specific profiles show almost ZERO performance benefit for all three architectures. In short: you can't blame compiler optimizations for any performance differences between AMD and Intel on the x264 benchmark.
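The "check the binary" approach from earlier in the thread can be roughed out in a few lines. CPUID returns the vendor string "GenuineIntel" in three registers as "Genu" (EBX), "ineI" (EDX), "ntel" (ECX), so a dispatcher may embed those 4-byte constants rather than the contiguous string. This is a crude heuristic sketch, not the actual dispatcher-detection logic of the patcher linked above:

```python
# Rough first-pass check for a CPUID vendor-string comparison inside a
# binary. We look for the contiguous "GenuineIntel" string and for the
# three 4-byte register constants separately. Heuristic sketch only:
# the real tool searches for the dispatcher's machine code instead.
def looks_like_vendor_check(path):
    with open(path, "rb") as f:
        blob = f.read()
    contiguous = b"GenuineIntel" in blob
    pieces = all(p in blob for p in (b"Genu", b"ineI", b"ntel"))
    return contiguous or pieces
```

A match only means the binary compares the vendor string somewhere; confirming a biased dispatch path still requires inspecting the code around it.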
-
Reply to gamerk316
lilcinw
June 14, 2013 10:22:00 AM
sarinaide said:
juanrga said:
sarinaide said:
Found this particularly interesting, and when it's ready I can't wait to test it myself. There is no doubt that software-level constraints are found in many synthetics. Flyingsuicide.net
Very interesting link. Thanks!
friend of mine
It seems like he left the crucial details out of his report. What is the 'feature' that he found? What does it do and why does it improve x87 performance so much when it is disabled? Who uses x87 anymore?
It feels like he is hiding something which makes me suspicious of his results.
-
Reply to lilcinw
lilcinw said:
sarinaide said:
juanrga said:
sarinaide said:
Found this particularly interesting, and when it's ready I can't wait to test it myself. There is no doubt that software-level constraints are found in many synthetics. Flyingsuicide.net
Very interesting link. Thanks!
friend of mine
It seems like he left the crucial details out of his report. What is the 'feature' that he found? What does it do and why does it improve x87 performance so much when it is disabled? Who uses x87 anymore?
It feels like he is hiding something which makes me suspicious of his results.
Well, MSVC still compiles to x87 by default. So you'd be surprised how much x87 code shows up when you don't set your compiler switches right.
-
Reply to gamerk316
lilcinw
June 14, 2013 10:31:01 AM
lilcinw said:
sarinaide said:
juanrga said:
sarinaide said:
Found this particularly interesting, and when it's ready I can't wait to test it myself. There is no doubt that software-level constraints are found in many synthetics. Flyingsuicide.net
Very interesting link. Thanks!
friend of mine
It seems like he left the crucial details out of his report. What is the 'feature' that he found? What does it do and why does it improve x87 performance so much when it is disabled? Who uses x87 anymore?
It feels like he is hiding something which makes me suspicious of his results.
He had stated that once they have workable writes, he will ensure they become accessible and the results get published. He also has extremely good contacts, which helps. I don't dispute his findings because he is reputable; let's just see what he comes up with.
-
Reply to sarinaide
8350rocks said:
GOM3RPLY3R said:
8350rocks said:
PS3 had 1080p HD (hence the HDMI output on the unit)...how on earth could you expect a console designed to support 4K resolution TVs would be in 720p?
Well, let's see. Take Battlefield 3 on the PS3: the console version is forced to run at 720p, since 720p is the highest resolution the console version supports! Yes, the system can output 1080p, but if BF3 ran at 1080p on a PS3 it would most likely get no more than 20 frames if it's lucky, hence why games don't always run at the best resolution. On the other end of the spectrum, (hypothetically) if we had PC games that only allowed a 1080p minimum, but you loved the game and you couldn't run it because your PC wasn't good enough and you didn't have the money for a better one, wouldn't you be just a little angry?
I'll also give you a PC example. My friend just showed me a new game that is just now hopping out of its Alpha stage into Beta, called Planetary Annihilation. By Steam's account, the absolute minimum requirements to run the game on low settings are a dual-core CPU and 4GB of total RAM. My old PC, which I happen to be using right now, has almost the minimum requirements (Core 2 Quad @ 2.5), along with an AMD 5570 that is overclocked to almost the max on stock voltage. I bet I'd be lucky to hold 30 frames on the lowest settings with that.
Into further speculation, the recommended spec is a quad-core CPU at 3.0 GHz and at least 8 GB of RAM. Computers sold within about the past year have really started to hit this mark of power, and some are surpassing it. However, for those people who still wish to keep their old computers from, say, more than a year ago, good luck running the game.
Keep in mind, this isn't even talking about resolutions.
This is all just to prove the point that many consoles are breaking into a very high-end level of power; HOWEVER, you always need to keep in mind the intensity of the game itself, not just your machine. I think it's great that the PS4 has an 8-core AMD processor and graphics that are about on par with a GTX 650. But all in all, those consoles are really for people who like to game but aren't about that total gamer/nerd life. If you are a true gamer, you need to invest in a PC for the sake of performance and overall quality.
This is also why I have the Xbox. Most Xbox users are little kids who can't afford crap and think they are all high and mighty with their system when they don't even know what a hard drive is. By the way, the Xbox is the most money-sucking thing I have ever seen. Why do you need to pay for their internet when you are already paying for your own? And with this Xbox One business, it's basically like a death wish.
How about trying this one on for size...
Your "good PC" with a 3570k can run 4 threads at once...if you went and bought the best compute GPU available to a hardcore gamer for gaming...the HD 7990...you would be able to run 8 threads.
You can run 8 threads...the CPU on the PS4 can run 8 threads by itself...now...that's not the key part though. The GPU on the PS4 can run 64 threads at once.
So let's do some math for a second so you understand what is being discussed about consoles:
PS4 = 72 threads at once (max, likely closer to 65-70 including middleware and OS)
Your "EXCELLENT GAMING PC" = 8 threads.
My question to you is: How do you propose to keep up with 60+ threads with your current hardware when maximum potential from the consoles is achieved?
The answer (whether you agree or not), is simple: You can't.
That's why consoles are generations ahead of current PC technology.
If you think the 8th gen of consoles is going to be more powerful than gaming PCs, no offense but you should be in the loony bin. I might get a PS4 for some exclusives, but pretty much everything else is best played on the PC.
-
Reply to montosaurous
GOM3RPLY3R
June 14, 2013 6:24:33 PM
montosaurous said:
8350rocks said:
GOM3RPLY3R said:
8350rocks said:
PS3 had 1080p HD (hence the HDMI output on the unit)...how on earth could you expect a console designed to support 4K resolution TVs would be in 720p?
Well, let's see. Take Battlefield 3 on the PS3: the console version is forced to run at 720p, since 720p is the highest resolution the console version supports! Yes, the system can output 1080p, but if BF3 ran at 1080p on a PS3 it would most likely get no more than 20 frames if it's lucky, hence why games don't always run at the best resolution. On the other end of the spectrum, (hypothetically) if we had PC games that only allowed a 1080p minimum, but you loved the game and you couldn't run it because your PC wasn't good enough and you didn't have the money for a better one, wouldn't you be just a little angry?
I'll also give you a PC example. My friend just showed me a new game that is just now hopping out of its Alpha stage into Beta, called Planetary Annihilation. By steams account, the absolute minimum requirements to run the game on low settings is a Dual Core CPU, and 4GB of total RAM. My old PC, which I happen to be using right now, has almost the minimum requirements (Core 2 Quad @ 2.5), along with an AMD 5570 that is overclocked to almost the max on stock voltage. I bet I'd be lucky to hold 30 frames on the lowest settings with that.
Into further speculation, the recommended is a Quad Core CPU at 3.0 Ghz and at least 8 GB of RAM. Computers that are being sold within about the past year have really started to hit this mark of power, and some are surpassing it. However to those people who wish to still keep their old computers from say more than a year ago, good luck running the game.
Keep in mind, this isn't even talking about resolutions.
This is all just to prove the point of, many consoles are breaking into a very high end level of power, HOWEVER, you always need to keep in mind, the intensity of the game itself, not just your machine. I think its great that PS4 has an 8 core AMD processor and Graphics that are about on par with a ~GTX 650. But all in all, those consoles are really for people who like to game, but aren't about that total gamer/nerd life. If you are a true gamer, you need to invest into a PC for the sake of performance and overall quality.
This is also why I have the Xbox. Most Xbox users are little kids who can't afford crap and think they are all high and mighty with their system when they don't even know what a hard drive is. By the way, the Xbox is the most money sucking thing I have ever seen. Why do you need to pay for their internet when you are already paying for your own. And with this Xbox One business, it's basically like a death wish.
How about trying this one on for size...
Your "good PC" with a 3570k can run 4 threads at once...if you went and bought the best compute GPU available to a hardcore gamer for gaming...the HD 7990...you would be able to run 8 threads.
You can run 8 threads...the CPU on the PS4 can run 8 threads by itself...now...that's not the key part though. The GPU on the PS4 can run 64 threads at once.
So lets do some math for a second so you understand what is being discussed about consoles:
PS4 = 72 threads at once (max, likely closer to 65-70 including middleware and OS)
Your "EXCELLENT GAMING PC" = 8 threads.
My question to you is: How do you propose to keep up with 60+ threads with your current hardware when maximum potential from the consoles is achieved?
The answer (whether you agree or not), is simple: You can't.
That's why consoles are generations ahead of current PC technology.
If you think the 8th gen of consoles is going to be more powerful than gaming PCs, no offense but you should be in the loony bin. I might get a PS4 for some exclusives, but pretty much everything else is best played on the PC.
Totally agree here. Consoles are made for the average consumer, or someone who has a great deal of interest in gaming. You can NOT think you're amazing because you have a console. Consoles are a great way to pass time and/or play with friends, but on a serious note, PC is the way to go.
As for the comment about the consoles, 8350, gamer is completely right. There will be no point where it needs that many threads. It would only really need them for someone who uses it for extensive testing or something, but other than that, it's a waste. And yes, in actuality the PS4 runs strictly on AMD parts, and the reasoning (don't deny these, they are true):
- The games don't need serious power
- The parts can handle more heat (for people who put their consoles in very small/enclosed spaces)
- They are much cheaper than other parts
That being said, really your idea of 'AMD integrity' comes from the failed realization that the games for these systems could probably run at great frames on a Dell with a Pentium 4 @ 2.0 and an AMD 5000 series card.
In terms of your account of the "Consoles Breaking Ahead," that would mean that Intel and Nvidia would probably go out of business, and the consoles would cost well more than $2000.
You, being an AMD fan (no offence), I would think would keep in mind that AMD is, so to speak, a budget-friendly operation. Honestly, I don't know why you can't get it through your dense brain that the AMD family requires parts that can resist more heat from the jumped-up overclocking and the extra cores/material it needs (not to forget the power consumption), so they go cheap with the processing.
Let's say AMD were to keep the exact same clocks/core counts/etc. but use materials on the Intel type of basis; the chips would cost extreme sums of money that would be insane. They would probably perform amazingly, but they'd just be too expensive. That is why they use much cheaper materials, so it doesn't cost as much, which in turn drops performance. Realistically, on an economic basis, they are very good buys, but mostly only if you are OpenCLing (using OCL), video editing, and/or on a tight budget.
-
Reply to GOM3RPLY3R
GOM3RPLY3R said:
... the games for these systems could probably run on a Dell with a Pentium 4 @ 2.0 and an AMD 5000 series card at great frames.
As a former Dell and Pentium owner, there is no way that could run upcoming console games. A faster 2.8GHz Pentium 4 couldn't even handle 360/PS3-era games smoothly.
-
Reply to anxiousinfusion
Yeah, I'd say it will take an FX-6100 and a Radeon 6870 to play upcoming console games at console quality and console frame rates. Last gen, though, is a different story. I'm not saying 8350rocks is completely wrong, however. He has good points that future games will likely use more threads, which will benefit AMD and give the 8xxx line an edge over the i5. And currently, only heavily CPU-intensive games that are lightly threaded show a benefit for Intel over AMD.
-
Reply to montosaurous
GOM3RPLY3R
June 14, 2013 7:57:17 PM
anxiousinfusion said:
GOM3RPLY3R said:
... the games for these systems could probably run on a Dell with a Pentium 4 @ 2.0 and an AMD 5000 series card at great frames.
As a former Dell and Pentium owner, there is no way that could run upcoming console games. A faster 2.8GHz Pentium 4 couldn't even handle 360/PS3-era games smoothly.
It was just a quick exaggeration, sorry
-
Reply to GOM3RPLY3R
cowboy44mag
June 14, 2013 9:10:15 PM
8350rocks said:
GOM3RPLY3R said:
8350rocks said:
PS3 had 1080p HD (hence the HDMI output on the unit)...how on earth could you expect a console designed to support 4K resolution TVs would be in 720p?
Well, let's see. Take Battlefield 3 on the PS3: the console version is forced to run at 720p, since 720p is the highest resolution the console version supports! Yes, the system can output 1080p, but if BF3 ran at 1080p on a PS3 it would most likely get no more than 20 frames if it's lucky, hence why games don't always run at the best resolution. On the other end of the spectrum, (hypothetically) if we had PC games that only allowed a 1080p minimum, but you loved the game and you couldn't run it because your PC wasn't good enough and you didn't have the money for a better one, wouldn't you be just a little angry?
I'll also give you a PC example. My friend just showed me a new game that is just now hopping out of its Alpha stage into Beta, called Planetary Annihilation. By steams account, the absolute minimum requirements to run the game on low settings is a Dual Core CPU, and 4GB of total RAM. My old PC, which I happen to be using right now, has almost the minimum requirements (Core 2 Quad @ 2.5), along with an AMD 5570 that is overclocked to almost the max on stock voltage. I bet I'd be lucky to hold 30 frames on the lowest settings with that.
Into further speculation, the recommended is a Quad Core CPU at 3.0 Ghz and at least 8 GB of RAM. Computers that are being sold within about the past year have really started to hit this mark of power, and some are surpassing it. However to those people who wish to still keep their old computers from say more than a year ago, good luck running the game.
Keep in mind, this isn't even talking about resolutions.
This is all just to prove the point of, many consoles are breaking into a very high end level of power, HOWEVER, you always need to keep in mind, the intensity of the game itself, not just your machine. I think its great that PS4 has an 8 core AMD processor and Graphics that are about on par with a ~GTX 650. But all in all, those consoles are really for people who like to game, but aren't about that total gamer/nerd life. If you are a true gamer, you need to invest into a PC for the sake of performance and overall quality.
This is also why I have the Xbox. Most Xbox users are little kids who can't afford crap and think they are all high and mighty with their system when they don't even know what a hard drive is. By the way, the Xbox is the most money sucking thing I have ever seen. Why do you need to pay for their internet when you are already paying for your own. And with this Xbox One business, it's basically like a death wish.
How about trying this one on for size...
Your "good PC" with a 3570k can run 4 threads at once...if you went and bought the best compute GPU available to a hardcore gamer for gaming...the HD 7990...you would be able to run 8 threads.
You can run 8 threads...the CPU on the PS4 can run 8 threads by itself...now...that's not the key part though. The GPU on the PS4 can run 64 threads at once.
So lets do some math for a second so you understand what is being discussed about consoles:
PS4 = 72 threads at once (max, likely closer to 65-70 including middleware and OS)
Your "EXCELLENT GAMING PC" = 8 threads.
My question to you is: How do you propose to keep up with 60+ threads with your current hardware when maximum potential from the consoles is achieved?
The answer (whether you agree or not), is simple: You can't.
That's why consoles are generations ahead of current PC technology.
I have to say I agree with a lot of what 8350 said in this post. The timing is going to be the deciding factor in all of that, though. It all comes down to how far the software studios have to push the hardware to get the graphics and content they want. It took the better part of six years for the Xbox 360 and PS3 to be pushed to using all their resources by software studios, but it eventually happened. That is why, if you look at blogs for the newest games out, 360 and PS3 owners are complaining about freezes and crashes. The software is pushing the hardware past what it is capable of.
The PS4 is an impressive piece of technology, and if studios came out of the gate pushing the hardware for everything it was worth, running 60+ threads, a 3570K would be left in the dust. In fact most, if not all, current gaming PCs couldn't keep up. However, I don't think that the PS4 will be using even half its resources by the end of next year to run new games.
Next year is going to be very interesting in the world of console and computer gaming. The era of single-core execution power and games running on a max of six threads is over. It has to be, as the software studios are going to have to develop for the tech they have. Seeing that each individual core in the PS4 is clocked so low, we may very well see first-gen games running on 12 threads, maybe more. Even with a conservative estimate of 12 threads, would your gaming PC be able to keep up?
I've said it before, and I'll say it again: the future of gaming is heavily threaded games. Looking forward to games running 12, 20, 40, 60 threads, the PS4 is way ahead of current PC tech; however, I don't expect to see games running 60 threads for several years at least (although I fully expect to see them at some point). It is hard to guess at what pace the software studios will push the new technology. Just because the last console systems lasted so long doesn't mean that in 2 or 3 years the PS4 won't be tapped out by gaming studios pushing the envelope to produce bigger and better eye candy.
-
Reply to cowboy44mag
cowboy44mag said:
8350rocks said:
GOM3RPLY3R said:
8350rocks said:
PS3 had 1080p HD (hence the HDMI output on the unit)...how on earth could you expect a console designed to support 4K resolution TVs would be in 720p?
Well, let's see. Take Battlefield 3 on the PS3: the console version is forced to run at 720p, since 720p is the highest resolution the console version supports! Yes, the system can output 1080p, but if BF3 ran at 1080p on a PS3 it would most likely get no more than 20 frames if it's lucky, hence why games don't always run at the best resolution. On the other end of the spectrum, (hypothetically) if we had PC games that only allowed a 1080p minimum, but you loved the game and you couldn't run it because your PC wasn't good enough and you didn't have the money for a better one, wouldn't you be just a little angry?
I'll also give you a PC example. My friend just showed me a new game that is just now hopping out of its Alpha stage into Beta, called Planetary Annihilation. By steams account, the absolute minimum requirements to run the game on low settings is a Dual Core CPU, and 4GB of total RAM. My old PC, which I happen to be using right now, has almost the minimum requirements (Core 2 Quad @ 2.5), along with an AMD 5570 that is overclocked to almost the max on stock voltage. I bet I'd be lucky to hold 30 frames on the lowest settings with that.
Into further speculation, the recommended is a Quad Core CPU at 3.0 Ghz and at least 8 GB of RAM. Computers that are being sold within about the past year have really started to hit this mark of power, and some are surpassing it. However to those people who wish to still keep their old computers from say more than a year ago, good luck running the game.
Keep in mind, this isn't even talking about resolutions.
This is all just to prove the point of, many consoles are breaking into a very high end level of power, HOWEVER, you always need to keep in mind, the intensity of the game itself, not just your machine. I think its great that PS4 has an 8 core AMD processor and Graphics that are about on par with a ~GTX 650. But all in all, those consoles are really for people who like to game, but aren't about that total gamer/nerd life. If you are a true gamer, you need to invest into a PC for the sake of performance and overall quality.
This is also why I have the Xbox. Most Xbox users are little kids who can't afford crap and think they are all high and mighty with their system when they don't even know what a hard drive is. By the way, the Xbox is the most money sucking thing I have ever seen. Why do you need to pay for their internet when you are already paying for your own. And with this Xbox One business, it's basically like a death wish.
How about trying this one on for size...
Your "good PC" with a 3570k can run 4 threads at once...if you went and bought the best compute GPU available to a hardcore gamer for gaming...the HD 7990...you would be able to run 8 threads.
You can run 8 threads...the CPU on the PS4 can run 8 threads by itself...now...that's not the key part though. The GPU on the PS4 can run 64 threads at once.
So lets do some math for a second so you understand what is being discussed about consoles:
PS4 = 72 threads at once (max, likely closer to 65-70 including middleware and OS)
Your "EXCELLENT GAMING PC" = 8 threads.
My question to you is: How do you propose to keep up with 60+ threads with your current hardware when maximum potential from the consoles is achieved?
The answer (whether you agree or not), is simple: You can't.
That's why consoles are generations ahead of current PC technology.
I have to say I agree with a lot of what 8350 said in this post. The timing is going to be the deciding factor in all of that though. It all comes down to how far the software studios have to push the hardware to get the graphics and content they want. It took the better part of six years for the Xbox 360 and PS3 to be pushed to using all their resources by software studios, but it eventually happened. That is why if you look at blogs for the newest games out 360 and PS3 owners are complaining about freezes, and crashes. The software is pushing the hardware past what it is capable of.
The PS4 is an impressive piece of technology, and if studios came out of the gate pushing the hardware for everything it was worth running 60+ threads a 3570K would be left in the dust. In fact most, if not all, current gaming PCs couldn't keep up. However I don't think that the PS4 will be using even half its resources by the end of next year to run new games.
Next year is going to be very interesting in the world of console and computer gaming. The era of single core execution power and games running on a max of six threads is over. It has to be as the software studios are going to have to develop for the tech they have. Saying that each individual core in a PS4 is clocked so low we may very well see first gen games running on 12 threads, maybe more. Even with a conservative estimate of 12 threads would your gaming PC be able to keep up?
I've said it before, and I'll say it again: the future of gaming is heavily threaded games. Looking forward to games running 12, 20, 40, 60 threads the PS4 is way ahead of current PC tech, however I don't expect to see games running 60 threads for several years at least (although I fully expect to see them at some point). It is hard to guess at what pace the software studios will push the new technology. Just because the last console systems lasted so long doesn't mean that in 2 or 3 years the PS4 won't be tapped out by gaming studios pushing the envelope to produce bigger and better eye candy.
If AMD fanboys are this stupid I might as well switch to Intel and Nvidia to avoid any association.
-
Reply to montosaurous
cowboy44mag said:
I've said it before, and I'll say it again: the future of gaming is heavily threaded games. Looking forward to games running 12, 20, 40, 60 threads the PS4 is way ahead of current PC tech, however I don't expect to see games running 60 threads for several years at least (although I fully expect to see them at some point). It is hard to guess at what pace the software studios will push the new technology. Just because the last console systems lasted so long doesn't mean that in 2 or 3 years the PS4 won't be tapped out by gaming studios pushing the envelope to produce bigger and better eye candy.
Thread count doesn't mean much by itself; what matters is the thread load. I'm running 975 threads on an i5 and it's at 5% usage. Having 8 cores vs 4 cores doesn't mean it's going to multitask better if the 8 cores are running at half the speed of the 4 cores.
The reason the PS4 has Jaguar cores is purely the TDP. The 8 Jaguar cores run fairly slow compared to desktop PCs (1.6-2GHz). I wouldn't put it past a Phenom X4 965 (3.4GHz x 4) in relative CPU performance (4-year-old tech). They are pairing it with a $160-class video card, which helps a lot.
For $399 it's a great deal. You'd be hard pressed to build a gaming rig of equal power for the same price.
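Cazalan's cores-times-clock point can be made concrete with back-of-the-envelope arithmetic. The IPC factor below is a placeholder (real Jaguar and Phenom II IPC differ), so treat this as an illustration of the tradeoff, not a benchmark:

```python
# Idealized aggregate throughput: cores x clock x IPC, in "billions of
# instructions per second". IPC=1.0 is an assumed placeholder value.
def aggregate_ghz(cores, clock_ghz, ipc=1.0):
    return cores * clock_ghz * ipc

ps4_cpu    = aggregate_ghz(8, 1.6)   # 8 Jaguar cores at ~1.6 GHz -> 12.8
phenom_965 = aggregate_ghz(4, 3.4)   # Phenom II X4 965           -> 13.6
# Roughly comparable aggregates -- and the Phenom reaches its peak with
# 4-wide software, while the PS4 needs software that scales to 8 threads.
```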
-
Reply to Cazalan
GOM3RPLY3R said:
Food for thought. If I cut an FX-8350 in half (not literally, but on a core basis), ran only 4 cores, and underclocked it to 3.4 to compare it to an i5, how would it run? Well, I can tell you now, it would probably do worse than a Pentium 4, with less than half of its original performance. Yes, I think it would run much cooler, but it would be absolutely terrible. The main reason Intel costs more is really that the cores are stronger (mostly from the material, which happens to not be as heat resistant but does win on the power of each core).
Now what if we did the vice versa and made the i5 an 8-core processor overclocked to 4.0? I would say it's easily an Extreme series CPU and may be the best on the market for performance.
It all comes down to the price. The main argument is between the (no offence) less wealthy people and the more affluent people. If you have the money, Intel is the better choice (unless you're on a budget, or video editing, or doing things with more OpenCL, or just want to make your computer run very hot). Really, for gaming these days, if you have a Core 2 Quad Q8300 @ 2.5 (my old CPU), then as long as, for GPUs, you have something like a 680 or 7970 or better, you're fine. ^_^
You have no room to call anyone a fanboy. P4... seriously... put down the crack pipe. Let's look at a game that doesn't even prefer AMD CPUs.
8350 is at 58.4 fps
4300 (cut the 8350 in half, right?) is at 48.9
your Q8300 is down to 35.8
Core 2 Duo E7200 (considerably faster than a P4) is even further behind at 28.8 fps
Let's look at one that isn't quite so hateful.
8350: 80.8
4300: 56.7
C2Q Q8300: 31.2
C2D E7200: 26.3
and this was with a 7970...
"Really now for gaming these days, if you have a Core 2 Quad Q8300 @ 2.5 (my old CPU), as long as, for GPUs, you have something like a 680 or 7970 or more, then you're fine. ^_^"
How's that Q8300 looking now? Or better yet, "the 4300 is equal to a P4"... rofl. Then again, maybe that extra 400 MHz on the 4300 is the only thing keeping it over 2x as fast as a C2D E7200, which is a lot faster than a P4... Had to dig this one up.
-
Reply to noob2222
hcl123
June 15, 2013 3:44:01 AM
GOM3RPLY3R said:
Food for thought. If I cut an FX-8350 in half (not literally, but on a core basis), ran only 4 cores, and underclocked it to 3.4 to compare it to an i5, how would it run? Well, I can tell you now, it would probably do worse than a Pentium 4, with less than half of its original performance. Yes, I think it would run much cooler, but it would be absolutely terrible. The main reason Intel costs more is really that the cores are stronger (mostly from the material, which happens to not be as heat resistant but does win on the power of each core).
Food for thought: I think it's getting hopeless...
Software is much easier to change (in any direction)... so why not have good 8-thread client software? Of course, then that i5 would be utterly and completely obsolete, but who cares about low-thread-count obsolete hardware?
After so much long debate about benchmarks and crooked ways, I think what is absolutely proven is that "performance is in the software". That is, GCC 4.8, PostgreSQL pgbench, and many many other examples show that a small but good change in the software can produce improvements equivalent to many generations of hardware.
So why ask to cut hardware in half? Why not ask to double the thread count of software? In all honesty, isn't multithreading the future? If single-thread performance is of any importance, beyond a brief PTB promotion, then why build multiprocessor chips with more cores?
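"Doubling the thread count of software" is, at heart, splitting one workload across N workers. A minimal sketch with a hypothetical workload (in CPython, CPU-bound work would want ProcessPoolExecutor because of the GIL, but the decomposition pattern is identical):

```python
# Decompose a sum over [0, n) into `workers` contiguous chunks and
# fan them out across a thread pool. Hypothetical workload for
# illustration; not taken from any game or benchmark in this thread.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum(n, workers=8):
    step, chunks = n // workers, []
    for w in range(workers):
        # The last chunk absorbs any remainder from integer division.
        chunks.append((w * step, n if w == workers - 1 else (w + 1) * step))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

The point of the pattern is that the worker count is a parameter: the same code runs 4-wide on an i5 and 8-wide on an 8-core chip without rewriting the workload.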
-
Reply to hcl123
hcl123
June 15, 2013 4:01:47 AM
GOM3RPLY3R said:
Totally agree here. Consoles are made for the average consumer, or someone who has a great deal of interest in gaming. You can NOT think you're amazing because you have a console. Consoles are a great way to pass time and/or play with friends, but on a serious note, PC is the way to go.
As for the comment about the consoles, 8350, gamer is completely right. There will be no point where it needs that many threads. It would only really need them for someone who uses it for extensive testing or something, but other than that, it's a waste. And yes, in actuality the PS4 runs strictly on AMD parts, and the reasoning (don't deny these, they are true):...
I think that logic is flawed.
* Console game developers have been able to extract quite a bit more performance from consoles because they can write much closer to the metal.
* So in that light they don't need hardware as powerful as the PC world's.
* And in that order of thought, with 8-thread games, perhaps the PS4 is quite a bit faster than a stripped-down version of the game for PC running on a Titan... or even an HD7990 (depends).
Some stories (salt) claim up to an order of magnitude more drawing power on the console side... I think for some specific games with identical hardware on either side... but following that order of thought, the PS4 could be equivalent to a PC discrete graphics card with 10000 ALUs (SPs in AMD lingo) LOL
-
Reply to hcl123
juanrga
June 15, 2013 5:54:20 AM
Not to start a useless console vs PC battle here, but the PS4 will perform at least like a Windows gaming PC with an i7 (HT activated) and a GTX 780, for games using 3GB of VRAM or less. For games using more VRAM, that graphics card will be memory bottlenecked, losing performance.
Also, nobody (except Sony and some developers) really knows the real improvement provided by all the supercharged architecture included in the PS4 design: improved GCN units, double bus, volatile bit, hUMA... That is why I wrote "at least". If the rumour that the Elemental demo was running in AH with one third of the performance of the PS4 is correct, then the PS4 will be much faster than a PC with a Titan or an HD7990. Some devs are claiming that the PS4's performance is years ahead of gaming PCs. Time will tell.
-
Reply to juanrga
gamerk316 said:
8350rocks said:
How about trying this one on for size...
Your "good PC" with a 3570K can run 4 threads at once... if you went and bought the best compute GPU available to a hardcore gamer for gaming, the HD 7990, you would be able to run 8 threads.
You can run 8 threads... the CPU in the PS4 can run 8 threads by itself... now, that's not the key part though. The GPU in the PS4 can run 64 threads at once.
So let's do some math for a second so you understand what is being discussed about consoles:
PS4 = 72 threads at once (max; likely closer to 65-70 after middleware and the OS)
Your "EXCELLENT GAMING PC" = 8 threads.
My question to you is: how do you propose to keep up with 60+ threads on your current hardware once the consoles' maximum potential is achieved?
The answer (whether you agree or not) is simple: you can't.
That's why consoles are generations ahead of current PC technology.
Uh, no. For one, even though there are a couple thousand threads running at a time on a PC system, only a handful are in a "ready to run" state at any given moment. You don't NEED to run that many threads at once. Secondly, most tasks aren't time-sensitive; if your UI is delayed by 50 ns because you have to wait for the thread to get swapped in, guess what? You don't care.
Thirdly, most modern GPUs with some form of compute [everything since the 8000 series from NVIDIA] can offload work from the CPU in some fashion; the PS4 is hardly unique in that regard. Whether you see a performance increase is largely dependent on scale, though; I wouldn't bother to offload anything unless it scales to AT LEAST 32 GPU compute units, due to how relatively weak a single GPU compute resource is compared to a single CPU core.
So no, if you installed a full version of Win 7 64-bit on it, the PS4 would have roughly half the performance of a medium-grade gaming PC.
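The scaling point above can be sketched as a toy break-even model in Python. Every number here (the fixed transfer overhead, the per-item costs, the 32 compute units) is a made-up assumption for illustration, not a measurement; the only point is that a fixed offload cost plus slow-but-parallel units means small workloads stay on the CPU:

```python
import math

# Toy break-even model for offloading work from CPU to GPU.
# All costs are hypothetical time units, chosen only for illustration.

def cpu_time(n_items, per_item_cpu=1.0):
    """Serial CPU cost: one unit of work per item."""
    return n_items * per_item_cpu

def gpu_time(n_items, compute_units=32, per_item_gpu=4.0, transfer_overhead=500.0):
    """GPU cost: each item is slower on a single weak compute unit,
    but units run in parallel; add a fixed bus-transfer overhead."""
    return transfer_overhead + math.ceil(n_items / compute_units) * per_item_gpu

def worth_offloading(n_items):
    return gpu_time(n_items) < cpu_time(n_items)

for n in (100, 1000, 10000):
    print(f"{n:>6} items: {'offload' if worth_offloading(n) else 'keep on CPU'}")
```

With these made-up constants, 100 items stay on the CPU while 1,000 and up win from offloading, which is the shape of the argument: offload only what scales.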
While all GPUs with compute functions can offload "some"... the HD 7990 (the best compute GPU for gamers) can run 4 threads at once. The GPU in the PS4 has 64 compute pipelines. That's a more than 1,000% increase in parallel computing capability.
Also, it's a good thing Sony isn't running Windows, isn't it? Besides, Windows makes everything slower (even Intel CPUs).
So the HSA and hUMA capabilities on the PS4 are going to be a dramatic performance increase. So much so that I don't think anyone can sit back and say "based on the hardware...", because it's all speculation. Developers' hands are no longer tied by having to accommodate different memory systems and complicated architectures that require a ton of time to get oriented with.
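Whether 72 hardware threads beat 8 depends on how much of a game's work actually parallelizes, which neither side of this exchange has pinned down. A toy Amdahl's-law calculation (the serial fractions below are assumed, purely illustrative) shows why raw thread counts alone settle nothing:

```python
# Amdahl's law: speedup = 1 / (serial + (1 - serial) / n_threads).
# Raw thread counts say little; the serial fraction of the workload dominates.

def amdahl_speedup(serial_fraction, n_threads):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_threads)

for serial in (0.05, 0.25, 0.50):          # assumed serial fractions, illustrative only
    for n in (8, 72):                      # "gaming PC" vs claimed PS4 thread counts
        print(f"serial={serial:.2f}, threads={n}: {amdahl_speedup(serial, n):.1f}x")
```

Even with half the work serial, 72 threads can't quite reach a 2x speedup; the 9x thread advantage only pays off when almost everything parallelizes.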
-
Reply to 8350rocks
montosaurous said:
Yeah, I'd say it will take an FX-6100 and a Radeon 6870 to play upcoming console games with console quality and console frame rates. Last gen, though, is a different story. I'm not saying 8350rocks is completely wrong, however. He does have good points: future games will likely use more threads, which will benefit AMD and give the 8xxx line an edge over the i5. And currently, only heavily CPU-intensive games that are lightly threaded show a benefit for Intel over AMD. As a developer... the PS4 is excitingly ahead of its time. There is a lot of room for growth.
I am not saying the PS4 would be faster than a gaming PC when it comes to brute force. TFLOPS are TFLOPS, after all...
However, what I am saying is that, through finesse and parallel computing, the PS4 can actually do more at once than a modern gaming PC, even though it achieves that through different means.
A modern Intel i5 would kick the daylights out of the PS4 running SuperPi, for example... but you don't play SuperPi with a controller, do you? SuperPi doesn't bring you hours of enjoyment in front of a TV while you're trying to head-shot your buddies online, does it?
There are strengths in everything... the issue is that the "per-core power" crowd doesn't see that GPUs have orders of magnitude more throughput... so utilizing them to their maximum potential means the CPU is there only to run things the GPU is inefficient at (which in gaming is a far smaller share than in daily PC tasks).
That's why supercomputers use GPGPUs...because they have far more computing power.
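That "far more computing power" is really just peak-throughput arithmetic. A minimal sketch, with illustrative, unofficial spec numbers (a generic quad-core CPU with SIMD vs a generic mid-range GCN GPU), shows where the gap comes from:

```python
# Rough theoretical peak: FLOPS = cores * clock * FLOPs-per-core-per-cycle.
# Both spec lines below are illustrative approximations, not official numbers.

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

cpu = peak_gflops(cores=4, clock_ghz=3.4, flops_per_cycle=8)     # quad-core CPU, AVX-style SIMD
gpu = peak_gflops(cores=1152, clock_ghz=0.8, flops_per_cycle=2)  # mid-range GCN GPU (FMA = 2 FLOPs)
print(f"CPU ~{cpu:.0f} GFLOPS, GPU ~{gpu:.0f} GFLOPS, ratio ~{gpu / cpu:.0f}x")
```

The GPU's advantage is all in core count; per clock, per core, it is far weaker, which is exactly why it only wins on work that parallelizes.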
-
Reply to 8350rocks