AMD Ryzen AMA

Project Scorpio, Versus Core i5, and ECC

Robert Pankiw: Microsoft engineers are said to have made significant improvements to both the architecture and design of the Scorpio Engine, the SoC (system on chip) jointly developed with AMD. The engineering team shrank the Jaguar-powered SoC to a 16nm process node. I realize that doing a process shrink isn't nearly as easy as shrinking a picture in MSPaint. What goes into shrinking an existing core design?

The team also reportedly made huge strides in parsing DirectX 12 commands, even claiming that their new designs reduced some API calls down from thousands of instructions to 11. Can AMD still benefit from that knowledge and implementation specifics?

DON WOLIGROSKI: I should qualify this by saying I can't comment directly on Project Scorpio. I'm not involved in that project and have no idea whether what you've heard about a die shrink is true, but I can comment on die shrinks in general.

Die shrinks are far more involved than people think, because the architecture is tied to the physical die in ways most of us don't appreciate. That's not firsthand knowledge; it's what our architects tell me when I ask. I can say it's a non-trivial, massive undertaking. But I'm no processor engineer, so I don't have the knowledge to answer you with any authority, sorry.

It might sound like I'm tooting our own horn, but I do believe that AMD invented the basis for all modern low-level graphics APIs. DirectX 12 owes a good portion of its existence to AMD's Mantle API, which laid out a template for Microsoft to follow; they do a lot of things similarly. And of course, Mantle lives on as the basis of the Vulkan API. So absolutely, we're very, very focused on forward-looking graphics APIs and on taking advantage of them as best we can.

g-unit1111: I'm interested in upgrading my 4th-generation Intel Haswell rig to a Ryzen-based system. What performance can we expect from the Ryzen 5 processors? How does the Ryzen 5 1600X compare to, say, an i5-7600K? Would my money be better spent on upgrading to a Ryzen 7 1700X?

Also, what's the issue with AM4 mounting brackets? I see that companies like Noctua are giving away AM4 mounting brackets but would older coolers be able to work on the new platform?

DON WOLIGROSKI: Ryzen 5 will murder the Core i5 when it comes to prosumer applications: rendering, encoding, encryption. Anything that takes advantage of more threads, the Ryzen dominates.

If you're a prosumer who wants even more productivity, Ryzen 7 will deliver even shorter processing times than Ryzen 5. If this is you, get the best you can afford. But know that the Ryzen 5 is worlds better than the Core i5. The Ryzen 5 1600X is essentially as fast as the Core i7-6850K when it comes to prosumer applications.

Now, if all you do is surf the web and game, one application at a time, maybe don't upgrade yet. If you game and stream at the same time, and like to run apps while gaming, then Ryzen is a sweet upgrade for gamers.

norune: Has AMD fine-tuned Ryzen chips so there is less overclocking headroom on the 1800X models in comparison to the 1700 models? Any date for the next revision of Ryzen, like late 2017 or early 2018?

DON WOLIGROSKI: AMD qualifies chips. We choose the best samples to be the 1800X, because it has to run at the highest clocks. Does that mean a 1700 or 1700X can't run at those same clocks? Not at all; they may well run at those clocks, but they might need a bit more voltage and a bit more cooling to do so.

I have no dates for the next Ryzen revision, sorry. All I can tell you is that Ryzen 3 is coming in the 2nd half of 2017.

sp1207: What is the story with ECC? I've read reports of it working with various motherboards, working only in Linux, working in Windows but not advertised as such. Is there any AMD push to coordinate with Microsoft and motherboard manufacturers to enable ECC as an option even if not officially supported?

DON WOLIGROSKI: Ryzen processors support ECC memory, but it's up to motherboard manufacturers to qualify their platforms. Since this isn't a typical consumer feature, you'll need to do some research and see what works, I'm afraid, unless a motherboard manufacturer specifically announces support for ECC RAM.

Robert Pankiw: Do AMD, Intel, and Nvidia work together pre-launch to prevent hardware-related bugs, especially bugs that only show up in certain configurations?

DON WOLIGROSKI: AMD does its best to ensure the best possible user experience when we partner with any other vendor.



[Image: AMD Ryzen 5 1600]
Tom's Hardware Community

The Tom's Hardware forum community is a powerful source of tech support and discussion on all the topics we cover from 3D printers, single-board computers, SSDs, and GPUs to high-end gaming rigs. Articles written by the Tom's Hardware Community are either written by the forum staff or one of our moderators.

  • BugariaM
    Many people asked clear, technically interesting questions, hoping for equally substantive answers.

    Instead they were answered by someone far removed from the technical and engineering side.
    He is a manager, a salesman.
    His job is talk, for the sake of still more talk.

    Thanks, of course, but alas, I found nothing of interest here.
    Reply
  • genz
    I intensely disagree, BugariaM. All the info he could provide was provided, and he asked people who are actually close to the metal when he didn't know. You will not get trade secrets or future insights from ANY AMD or Intel rep on Tom's Hardware; it's far too public, and every drop of information here is also handed to Intel, Nvidia, and any other competitor hoping to blunt AMD's charge. What we did get is a positive outlook on AMD's products, and compared to what we already had from Tom's and other publishers, who have spent years watching Intel lead and thus lack faith (or simply got their jobs through their love of Intel), that was major.

    One thing he didn't mention is that the current crop of 8-core consoles will inevitably force AMD's core-count advantage to eat away at all the competition Intel currently has. In five years, every single Ryzen 1 processor will terrorize the Intel processors it competed with; Ryzen 5s will have 50% performance gains over Kaby Lake i7s, and so on.

    Intel knew this was the future; that is why all Intel consumer processors have stuck to 4 cores, to try to keep the programming focus on their IPC lead. Now that that lead is only 6% and the competition has more cores, we will see the same shift toward 6+ cores that we saw when Core 2 Duo made dual-core Athlon FX and Pentium D chips viable mainstream gaming chips, and when Core 2 Quad and Nehalem made quad cores viable gaming chips.

    As the owner of a 3930K, you can read my past posts and see I have always said this was going to happen. Now, a month after launch, you are already seeing the updates come out. Wait till there are 12-threaded games on the market (this year, I expect) and you will see just how much of the limitation on the CPU industry's progress was actually created by Intel's refusal to go over 4 cores in the mainstream.

    For all the talk of the expense of creating 6- and 12-core processors, Intel could have had consumer 8-core, low-clock chips in the mainstream for prosumers and home-rendering types years ago, and they didn't. My theory is that they are scared of heavily threaded applications in the mainstream creating an opportunity for competitors to outmaneuver their new chips with slower, more numerous cores. It's not like a 2GHz 6- or 8-core in the mainstream was never an option.
    Reply
  • Calculatron
    I remember being really excited for the AMD AMA, but could not think of anything different from what everyone else was already asking.

    In retrospect, because hindsight is always 20/20, I wish I had asked some questions about Excavator, since there are still some Bristol Ridge products coming out for the AM4 platform. Even though Zen is a new architecture, some positive things carried over from the Bulldozer family that had been learned throughout its evolution.
    Reply
  • Ernst01
    As a long-time AMD fan, it is so cool that AMD has more in the future for us.
    Reply
  • TJ Hooker
    "TDP is not electrical watts (power draw), it's thermal watts."

    Argh, this kind of annoys me. "Electrical watts" and "thermal watts" are the same thing here; power draw = heat generated for a CPU. There are reasons why TDP is not necessarily an accurate measure of power draw, but this isn't one of them.
    Reply
  • alextheblue
    Thank you Don!
    Reply
  • Tech_TTT
    19562297 said:
    Many people asked clear, technically interesting questions, hoping for equally substantive answers.

    Instead they were answered by someone far removed from the technical and engineering side.
    He is a manager, a salesman.
    His job is talk, for the sake of still more talk.

    Thanks, of course, but alas, I found nothing of interest here.

    I agree with you 100%. An Ask Me Anything should include people from the R&D department, not only a salesperson. Or maybe a team of two, one from sales and one from research. Or even better, the CEO included.

    Reply
  • Tech_TTT
    @Tomshardware : WE DEMAND APPLE AMA !!!
    Reply
  • genz
    19566458 said:
    "TDP is not electrical watts (power draw), it's thermal watts."

    Argh, this kind of annoys me. "Electrical watts" and "thermal watts" are the same thing here; power draw = heat generated for a CPU. There are reasons why TDP is not necessarily an accurate measure of power draw, but this isn't one of them.

    That is simply not true.

    Here's an example: at 22nm and below, TDP is usually far higher than actual draw, because the die is so small that any cooling solution has a much smaller surface area to work with. Another example: when Intel moved the memory controller from the northbridge onto the CPU die, the TDP of their chips went unchanged, because (thermally speaking) the controller was far enough from the hot parts of the chip to never contribute to thermal limitations, despite the chip heating up faster under overclocking because of the additional logic, and drawing more power due to the extra components. A final example: I have a 130W-TDP chip that, without overvolting, simply cannot draw a watt over 90 even when running a power virus (which draws the maximum power the chip can draw, more than burn-in or SuperPi). The TDP rating is tied to the specific parts of the chip that run hot and how big they are, not to true power draw. This is why so many chips of the same binning share the same TDP despite running at different clocks and voltages from one another.

    Further to that, TDP is rounded up to fixed numbers to make it easy to pick a cooler. True power draw naturally depends on how well a chip is binned, and badly binned chips may still need extra voltage to run, so manufacturers usually add 10 to 20 watts of thermal headroom to allow for that.
    Reply
  • TJ Hooker
    @genz I never said TDP is equal to power draw; in fact, I explicitly said there are reasons why it isn't. I simply said that "thermal watts" (heat being generated by the CPU) are equivalent to "electrical watts" (power being consumed by the CPU). At any given moment, the power being drawn by the CPU is equal to the heat being generated.

    I'll admit I'm sort of nitpicking a small part of the answer given in the AMA regarding TDP; I just felt the need to point it out because this is a misconception I see on a semi-regular basis.
    Reply