AMD Ryzen AMA

Update Schedule, 1600X Microarchitecture, and Optane

lightofhonor: How has the process of developing/updating the BIOS been different than supporting previous AMD or Intel sockets? I've noticed a lot of updates since release on my Killer board. When will the BIOS stop being updated several times a month? When do you think the BIOS will be "done"?

DON WOLIGROSKI: On any new platform, there's going to be more development than usual. It happens with every major socket update, on both the AMD and Intel sides, for those who have been around long enough to remember a number of turnovers.

But we're making really good progress, and very quickly. My gut feeling is that the upcoming April update gets us to a place where people are generally satisfied, and then we'll hone that edge in the months to come.

Aris25: Is the 1600X manufactured as a 3+3 part, or is it an 1800X with one core turned off on each side, or an 1800X with one failed core on each side that was then turned off, or something else altogether?

DON WOLIGROSKI: The 1600X is essentially an 1800X with one core disabled per CCX (a 3+3 configuration). All 16MB of L3 cache is still enabled, BTW.

Evilwumpus: Can we expect a significant performance difference with Vega or the Polaris refresh when used in conjunction with a Ryzen 7 vs. an Intel Core i7-7700K?

DON WOLIGROSKI: From a CPU perspective, we try to be graphics-agnostic so everyone can enjoy Ryzen regardless of their choice of GPU.

Nope 1151: Will you ever go back to the green AMD logo?

DON WOLIGROSKI: Your answer lies within (your username).

aeriolwinters: Will the Athlon brand still be active? With the R7 for enthusiasts, the R5s for high-end mainstream computing, and the R3s for mainstream computing, how do you see the Athlon fitting in, with the APUs still not in tow?

DON WOLIGROSKI: Athlon will be used for CPUs that sit below the Ryzen 3 brand, just as it sits below the current FX brand. It will live next to A-series APUs in the same segment. Bristol-Ridge-based APUs and Athlons will be available for Socket AM4 motherboards at an undisclosed date. Stay tuned!

valeman2012: Any plans for Intel Optane support?

DON WOLIGROSKI: The short answer is no. The long answer is:

1. Optane is Intel-proprietary technology, and the Optane M.2 slot is exclusive to some Intel motherboards.

2. Intel partnered with Micron to create the 3D XPoint memory technology that Optane is based on. I don't know if Micron's 3D XPoint-based memory will ever be available as an agnostic solution. I would assume that Intel has an exclusivity clause, but I don't know how long it'd last.

3. In its current form for the consumer desktop, Optane is basically an SSD cache drive with a maximum (pitiful) 32GB of storage. Intel doesn't even recommend pairing it with an SSD, because you wouldn't notice a performance difference; the company suggests you pair it with a mechanical hard drive. Lots of hype and little substance.

On the consumer desktop, you're better served with an SSD that actually has a decent amount of storage space.

anironbutterfly: I've been reading about the new Ryzen CPUs in hopes that they're a good successor to the FX series (I'm currently using an FX-8350 on an original Sabertooth 990FX motherboard with 32GB of DDR RAM). I'm not a gamer, but a hobbyist graphic artist who uses Poser and DAZ|Studio. My system is starting to show its age, and I'm looking at options to upgrade. The Ryzen series is the first line of new straight CPUs I've seen come out of AMD in several years.

I'm curious how this new series of chips might perform for 3D graphics rendering with the Nvidia Iray render engine (and the alternative 3Delight renderer) in comparison to Intel's i5 and i7 CPUs, and about compatibility with Nvidia GeForce video cards. (I'm currently using an EVGA Nvidia GeForce GTX 970 4GB that will carry over into my new build.)

DON WOLIGROSKI: I'm not familiar with how the Nvidia Iray engine works; because it's Nvidia, I'll assume it's CUDA-based, in which case it may not be CPU-dependent.

For any CPU-dependent renderer, though, Ryzen will give you colossal - and I mean COLOSSAL - performance increases over FX. And in general, it's just a lot faster and enables much more responsive multitasking. Even the sub-$200 Ryzen 5s will give you a tremendous upgrade over the FX. But I encourage you to read the launch-day reviews on April 11th.



[Image: AMD Ryzen 7 1700]
Tom's Hardware Community

The Tom's Hardware forum community is a powerful source of tech support and discussion on all the topics we cover, from 3D printers, single-board computers, SSDs, and GPUs to high-end gaming rigs. Articles attributed to the Tom's Hardware Community are written either by the forum staff or by one of our moderators.

  • BugariaM
    Many people ask clear, technically interesting questions hoping to get equally substantive answers.

    And they're answered by someone far removed from the technical and engineering side.
    He's a manager; he's a salesman.
    His job is mostly blah-blah, for the sake of even more blah-blah.

    Thanks, of course, but alas, I found nothing interesting for myself here.
    Reply
  • genz
    I intensely disagree, BugariaM. All the info he could provide was provided, and he asked people actually close to the metal when he didn't know. You will not get trade secrets or future insights from ANY AMD or Intel rep on Tom's Hardware; it's far too public, and every drop of information here is also given to Intel, Nvidia, and any other competitor hoping to blunt AMD's charge. What we did get is a positive outlook on AMD's products... and compared to what we already had from Tom's and other publishers, who have spent years watching Intel lead and thus lack faith (or simply got their jobs for their love of Intel), that was major.

    The one thing I think he should have reminded us of is that the current crop of 8-core consoles will inevitably let AMD's core-count advantage eat up all the competition Intel currently has. In five years, every single first-generation Ryzen processor will terrorize the Intel processors it competed with... Ryzen 5s will show 50% performance gains over Kaby Lake i7s, etc.

    Intel knew this was the future; that is why all Intel consumer processors have stuck to 4 cores, to try to keep the programming focus on their IPC lead. Now that that lead is only 6% and the competition has more cores, we will see the same shift toward 6+ cores that we saw when Core 2 Duo arrived and made dual-core FX and Pentium chips viable mainstream gaming chips, and when Core 2 Quad and Nehalem made quad cores viable gaming chips.

    As the owner of a 3930K, you can read my past posts and see I have always said this was going to happen. Now, a month after launch, you are already seeing the updates come out. Wait till there are 12-threaded games on the market (this year, I expect) and you will see just how much the limit on the CPU industry's progress was actually created by Intel's refusal to go over 4 cores in the mainstream.

    For all the talk of the expense of creating 6- and 12-core processors, Intel could have had low-clocked 8-core consumer chips in the mainstream for prosumers and home-rendering types years ago, and it didn't. My theory is that Intel is scared of heavily threaded applications in the mainstream creating an opportunity for competitors to outmaneuver its new chips with slower, more numerous cores. It's not like a 2GHz 6- or 8-core in the mainstream was never an option.
    Reply
  • Calculatron
    I remember being really excited for the AMD AMA, but could not think of anything different from what everyone else was already asking.

    In retrospect, because hindsight is always 20/20, I wish I had asked some questions about Excavator, since they still have some Bristol Ridge products coming out for the AM4 platform. Even though Zen is a new architecture, some positive things carried over from the Bulldozer family, learned throughout its process of evolution.
    Reply
  • Ernst01
    As a long-time AMD fan, it's so cool that AMD has more in the future for us.
    Reply
  • TJ Hooker
    "TDP is not electrical watts (power draw), it's thermal watts."Argh, this kind of annoys me. "Electrical watts" and "thermal watts" are the same thing here, power draw = heat generated for a CPU. There are reasons why TDP is not necessarily an accurate measure of power draw, but this isn't one of them.
    Reply
  • alextheblue
    Thank you Don!
    Reply
  • Tech_TTT
    19562297 said:
    Many people ask clear, technically interesting questions hoping to get equally substantive answers.

    And they're answered by someone far removed from the technical and engineering side.
    He's a manager; he's a salesman.
    His job is mostly blah-blah, for the sake of even more blah-blah.

    Thanks, of course, but alas, I found nothing interesting for myself here.

    I agree with you 100%... An "Ask Me Anything" should include people from the R&D department, not only a salesperson. Or maybe a team of two, sales and research. Or, even better, the CEO him/herself included.

    Reply
  • Tech_TTT
    @Tomshardware : WE DEMAND APPLE AMA !!!
    Reply
  • genz
    19566458 said:
    "TDP is not electrical watts (power draw), it's thermal watts."Argh, this kind of annoys me. "Electrical watts" and "thermal watts" are the same thing here, power draw = heat generated for a CPU. There are reasons why TDP is not necessarily an accurate measure of power draw, but this isn't one of them.

    That is simply not true.

    Here's an example: at 22nm and below, TDP is usually far higher than actual draw, because the chip is so small that any cooling solution has a much smaller surface area to work with. Another example: when Intel moved the memory controller from the northbridge onto the CPU, the TDP of its chips went unchanged, because (thermally speaking) the controller was far enough away from the hot parts of the die to never contribute to thermal limitations - despite the chip's temperature rising much faster under overclocking because of the additional logic, and the chips themselves drawing more power due to the extra components.

    A final example: I have a 130W-TDP chip that, without overvolting, simply cannot reach a watt over 90 even when running a power virus (which draws the maximum power the chip can draw - more than a burn-in test or SuperPi). The TDP rating is directly connected to which specific parts of the chip run hot and how big they are, not to true power draw. This is why so many chips of the same binning carry the same TDP despite running at different clocks and voltages.

    Further to that, TDP is rounded up to fixed numbers to make it easy to pick a fan. True power draw naturally depends on how well a chip is binned, and very badly binned chips may still need extra voltage to run, so manufacturers usually add 10 to 20 watts of thermal headroom to make that possible.
    Reply
  • TJ Hooker
    @genz I never said TDP is equal to power draw, in fact I explicitly said there are reasons why it isn't. I simply said that "thermal watts" (heat being generated by the CPU) are equivalent to "electrical watts" (power being consumed by the CPU). At any given moment, the power being drawn by the CPU is equal to the heat being generated.

    I'll admit I'm sort of nitpicking a small part of the answer given in the AMA regarding TDP; I just felt the need to point it out because this is a misconception I see on a semi-regular basis.
    Reply
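The physics behind TJ Hooker's point can be illustrated with a quick sketch: essentially all of the electrical power a CPU draws is dissipated as heat, so "electrical watts" and "thermal watts" describe the same quantity at any given moment. The sketch below uses the standard dynamic-power approximation P ≈ C·V²·f; the capacitance, voltage, and frequency figures are hypothetical round numbers, not specifications of any real chip.

```python
def dynamic_power(c_eff_farads: float, voltage: float, freq_hz: float) -> float:
    """Approximate dynamic switching power in watts: P = C * V^2 * f."""
    return c_eff_farads * voltage ** 2 * freq_hz

# Hypothetical chip: 10 nF effective switched capacitance, 1.2 V core, 4 GHz.
p_draw = dynamic_power(10e-9, 1.2, 4.0e9)  # electrical watts drawn from the VRM
p_heat = p_draw                            # ...all of which leaves the die as heat

print(f"Power drawn: {p_draw:.1f} W, heat dissipated: {p_heat:.1f} W")
```

Note that TDP is a separate, cooler-sizing number: it bounds the sustained heat a cooling solution must handle, which is why it need not match the instantaneous draw computed above.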