AMD Ryzen AMA

Best Gaming Processor, Clock Speeds, and Development

[Image: AMD Radeon RX 480 8GB]

johnson151168: Which processor do you recommend strictly for gaming, priced less than $250? I don't really play anything besides League of Legends and World of Warcraft, but I would like to try out quite a few upcoming games, and I know my FX CPU is not up to the challenge. I am currently using a Radeon RX 480 8GB graphics card.

DON WOLIGROSKI: Strictly gaming? Well, assuming your “strictly gaming” goal doesn't include gaming while streaming to Twitch (which can really take advantage of the 12-threaded Ryzen 5 1600X and the Ryzen 7 processors), I'd steer you to the Ryzen 5 1500X. High clocks and the most XFR clock-rate headroom in the Ryzen stack so far (up to 200 MHz over the Precision Boost spec with capable cooling), plus four cores and eight threads, so it has enough resources to take advantage of games that value more cores. $189. Sweet little part, basically a Core i7 equivalent in a lot of ways, but for about half the price.
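To put numbers on that headroom: XFR stacks on top of the Precision Boost clock, so assuming the 1500X's published 3.7 GHz boost spec (a spec-sheet figure, not something stated in the answer above), the arithmetic works out to:

    f_max = f_PrecisionBoost + Δf_XFR = 3.7 GHz + 0.2 GHz = 3.9 GHz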

SinxarKnights: Did you have a party once the first Ryzen chips rolled off the line to celebrate?

DON WOLIGROSKI: I work from home in Canada, so I wasn't at the AMD campus when they had the Ryzen launch party.

jaymc: Do you expect to hit higher and higher clock speeds as AMD further refines and tweaks the Ryzen platform? Will Ryzen ever hit clock speeds equal to an overclocked Kaby Lake?

DON WOLIGROSKI: Until we have new silicon spins, anything I say is speculation. But we're all quite optimistic about how fast we got this first architecture/process to go in its first go-round, and bolstered by the fact that we have a lot of opportunity to crank up the clocks.

Martell1977: Could you tell us a little about Ryzen's development? As in, how long ago was it started? Was it before the Bulldozer release? Shortly after?

DON WOLIGROSKI: Off the top of my head, I believe it was 4 or 5 years ago now, around 2012. That's before my time at AMD; I started my tenure here at the beginning of 2015. The promise of the Zen architecture is one of the reasons I came to AMD in the first place.

Martell1977: The benchmarks I have seen for Ryzen 7 have made it difficult to know exactly which CPU to recommend. There are four SKUs out to challenge Intel's lineup. Is there a chart or list that shows exactly what your intended CPU-vs-CPU matchups are?

DON WOLIGROSKI: You can compare on price or on ability. The Ryzen 7 1700X ($399) actually beats the Core i7-6900K in a lot of multi-threaded benches, but from a price standpoint it's closer to the Core i7-6800K, which it dominates.

We usually pit the Ryzen 7 1700 ($330) against the i7-7700K because their prices are so close. From a productivity rendering/encoding/encrypting standpoint, the 1700 kicks the crap out of Kaby. The 7700K does have higher clocks and IPC, so there's a 1080p gaming advantage, but once you raise the resolution to 1440p the gaming advantage is very muted. At 4K and in VR, it barely shows up in benchmarks. So, if you're spending over $300 on a CPU, I think the Ryzen 7 1700 is an easy choice, because folks in this segment would be buying 1440p or higher-resolution monitors. You're not giving up any real-world game performance at 1440p and above, but you're getting colossal application advantages.
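A rough model shows why the CPU gap fades as resolution climbs: a frame ships no faster than the slower of the two processors working on it. The millisecond figures below are invented purely for illustration, not benchmark data:

    t_frame ≈ max(t_CPU, t_GPU)

    1080p (t_GPU ≈ 5 ms):   fast CPU: max(6, 5)  = 6 ms    slow CPU: max(8, 5)  = 8 ms    → 33% gap
    4K    (t_GPU ≈ 16 ms):  fast CPU: max(6, 16) = 16 ms   slow CPU: max(8, 16) = 16 ms   → gap vanishes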

The new Ryzen 5 1600X ($249) is 6 cores/12 threads and priced similarly to the Core i5-7600K. With literally 3x the threads on Ryzen, this is the easiest battle for us. Productivity is on a different level entirely, while some modern games really appreciate more than 4 threads, and the 7600K can suffer significantly compared to the 7700K. So, games trade blows at 1080p. No real argument to choose Kaby Lake here.

The Ryzen 5 1500X ($189) is priced opposite the Core i5-7500. With twice the threads of the Core i5, the Ryzen 5 1500X is a good gaming part for people who like the idea of Core i7-class productivity for half the price, should they ever want to exercise that power. And games are becoming more threaded all the time thanks to DirectX 12 and Vulkan.
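To make "more threaded" concrete, here is a minimal C++ sketch of the data-parallel pattern engines lean on under DirectX 12 and Vulkan: one frame's entity updates split across every available hardware thread. It is a generic illustration, not AMD or engine code, and the workload is invented:

    #include <algorithm>
    #include <iostream>
    #include <thread>
    #include <vector>

    // Toy per-frame workload: each worker advances its own slice of entity
    // positions, so the work spreads across however many hardware threads exist.
    static void update_entities(std::vector<float>& pos, std::size_t begin,
                                std::size_t end, float dt) {
        for (std::size_t i = begin; i < end; ++i)
            pos[i] += dt * 1.5f;  // stand-in for per-entity physics/AI work
    }

    int main() {
        const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
        std::vector<float> positions(1'000'000, 0.0f);
        const float dt = 0.016f;  // ~16 ms, one 60 fps frame

        // Split one frame's updates evenly; the last worker takes the remainder.
        std::vector<std::thread> pool;
        const std::size_t chunk = positions.size() / workers;
        for (unsigned w = 0; w < workers; ++w) {
            const std::size_t begin = w * chunk;
            const std::size_t end =
                (w == workers - 1) ? positions.size() : begin + chunk;
            pool.emplace_back(update_entities, std::ref(positions), begin, end, dt);
        }
        for (auto& t : pool) t.join();

        std::cout << "Updated " << positions.size() << " entities on "
                  << workers << " threads\n";
    }

On a 4-core/8-thread 1500X this loop fans out across eight workers; the identical code on a 12-thread 1600X simply gets four more, which is why thread-hungry titles scale up the Ryzen stack.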

Dragonsmint878: Any update on the availability of Wraith coolers for separate purchase? Will the Wraith Max be available to buy at the R5 launch? Will the stock RGB coolers be able to be bought separately?

DON WOLIGROSKI: We're very aware of the demand from AMD customers for standalone Wraith coolers. We haven't announced anything publicly. I personally think it would be very cool if they were offered as a standalone item.


MORE: Best CPUs
MORE: Intel & AMD Processor Hierarchy
MORE: All CPU Content

[Image: AMD Ryzen 7 1700X]
Tom's Hardware Community

The Tom's Hardware forum community is a powerful source of tech support and discussion on all the topics we cover, from 3D printers, single-board computers, SSDs, and GPUs to high-end gaming rigs. Articles credited to the Tom's Hardware Community are written either by the forum staff or by one of our moderators.

  • BugariaM
    Many people asked clear and technically interesting questions, hoping to get equally clear answers.

    And they were answered by someone far removed from the technical plane and from engineering questions.
    He is a manager, a salesman.
    His job is blah-blah for the sake of even more blah-blah.

    Thanks, of course, but alas, I found nothing interesting for myself here.
    Reply
  • genz
    I intensely disagree, BugariaM. All the info he could provide was provided, and he asked people actually close to the metal when he did not know. You will not get tech secrets or future insights from ANY AMD or Intel rep on Tom's Hardware; it's far too public, and every drop of information posted here is also handed to Intel, Nvidia, and any other competitor hoping to steal AMD's charge. What we did get is a positive outlook on AMD's products... and when you compare that to what we already had from Tom's and other publishers, who have spent years watching Intel lead and thus don't have faith (or simply got their jobs for their love of Intel), that was major.

    One thing I personally think he should have reminded us of is that the current crop of 8-core consoles will inevitably force AMD's core advantage to eat all the competition Intel currently has. In 5 years every single Ryzen 1 processor will terrorize the Intel processors it competed with... Ryzen 5s will have 50% performance gains over Kaby i7s, etc.

    Intel knew this was the future; that is why all Intel consumer processors have stuck to 4 cores, to try and keep the programming focus on their IPC lead. Now that that lead is only 6% and the competition has more cores, we will see the same shift toward 6+ cores that we saw when the Core 2 Duo arrived and made dual cores (the dual-core FX and Pentiums) viable mainstream gaming chips, and when the Core 2 Quad and Nehalem made quad cores viable gaming chips.

    As the owner of a 3930K, you can read my past posts and see I have always said this was going to happen. Now, a month after launch, you are already seeing the updates come out. Wait till there are 12-threaded games on the market (this year, I expect) and you will see just how much of the limit on the CPU industry's progress was actually created by Intel's refusal to go over 4 cores in the mainstream.

    For all the talk of the expense of creating 6 and 12 core processors, Intel could have had consumer 8-core, low-clock chips in the mainstream for prosumers and home-rendering types years ago, and they didn't. My theory is that they are scared of heavily threaded applications in the mainstream creating an opportunity for competitors to outmaneuver their new chips with slower, more numerous cores. It's not like a 2 GHz 6 or 8 core in the mainstream was never an option.
    Reply
  • Calculatron
    I remember being really excited for the AMD AMA, but I could not think of anything to ask that was different from what everyone else was already asking.

    In retrospect, because hindsight is always 20/20, I wish I had asked some questions about Excavator, since they still have some Bristol Ridge products coming out for the AM4 platform. Even though Zen is a new architecture, there were still some positive things carried over from the Bulldozer family, learned throughout its evolution.
    Reply
  • Ernst01
    As a long-time AMD fan, it is so cool that AMD has more in the future for us.
    Reply
  • TJ Hooker
    "TDP is not electrical watts (power draw), it's thermal watts."Argh, this kind of annoys me. "Electrical watts" and "thermal watts" are the same thing here, power draw = heat generated for a CPU. There are reasons why TDP is not necessarily an accurate measure of power draw, but this isn't one of them.
    Reply
  • alextheblue
    Thank you Don!
    Reply
  • Tech_TTT
    BugariaM said:
    Many people asked clear and technically interesting questions, hoping to get equally clear answers.

    And they were answered by someone far removed from the technical plane and from engineering questions.
    He is a manager, a salesman.
    His job is blah-blah for the sake of even more blah-blah.

    Thanks, of course, but alas, I found nothing interesting for myself here.

    I agree with you 100%. "Ask Me Anything" should include people from the R&D department, not only a salesperson. Or maybe a team of two, Sales and Research. Or, even better, the CEO him/herself included.

    Reply
  • Tech_TTT
    @Tomshardware: WE DEMAND AN APPLE AMA !!!
    Reply
  • genz
    TJ Hooker said:
    "TDP is not electrical watts (power draw), it's thermal watts."

    Argh, this kind of annoys me. "Electrical watts" and "thermal watts" are the same thing here; power draw = heat generated for a CPU. There are reasons why TDP is not necessarily an accurate measure of power draw, but this isn't one of them.

    That is simply not true.

    Here's an example: at 22nm and 18nm, the TDP is usually far higher than the actual draw, because the chip is so small that any cooling solution has a much smaller surface area to work with. Another example: when Intel brought the memory controller over from the northbridge to the CPU socket, the TDP of their chips went unchanged, because (thermally speaking) the controller was far enough away from the hot parts of the chip to never contribute to thermal limitations... despite the temperature of the chip rising much faster under OC because of the additional bits, and the chips themselves drawing more power due to more components. A final example: I have a 130W TDP chip that, without overvolting, simply cannot reach a watt over 90 even when running a power virus (which draws the maximum power the chip can draw, more than burn-in or SuperPi). The TDP rating is directly tied to the specific parts of the chip that run hot and how big they are, not to true power draw. This is why so many chips of the same binning carry the same TDP despite running at different clocks and voltages from one another.

    Further to that, TDP is rounded up to fixed numbers to make it easy to pick a fan. True power draw naturally depends on how well a chip is binned, and since super badly binned chips may still need extra volts to run, manufacturers usually add 10 to 20 watts of thermal headroom to make that possible.
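    The standard first-order model of switching power makes the binning point concrete (α is the activity factor, C the switched capacitance; this is the generic textbook approximation, not anything vendor-specific):

        P_dynamic ≈ α · C · V² · f

    Since draw scales with the square of the voltage and linearly with the clock, two dies sold under the same rounded TDP number can sit at noticeably different real power levels depending on the V/f point their bin allows.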
    Reply
  • TJ Hooker
    @genz I never said TDP is equal to power draw; in fact, I explicitly said there are reasons why it isn't. I simply said that "thermal watts" (heat being generated by the CPU) are equivalent to "electrical watts" (power being consumed by the CPU). At any given moment, the power being drawn by the CPU is equal to the heat being generated.
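    In energy-balance terms (a CPU does essentially no mechanical work, and the energy leaving on signal lines is negligible), at steady state:

        P_in (electrical) = P_out (heat)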

    I'll admit I'm sort of nitpicking a small part of the answer given in the AMA regarding TDP; I just felt the need to point it out because this is a misconception I see on a semi-regular basis.
    Reply