
AMD Ryzen AMA

Tweaks & Support, Power Plan, and Game Benchmarks

Eric_010: Is AMD working with Battlefield 1 developer DICE to improve performance on Ryzen CPUs?

DON WOLIGROSKI: We're engaging with every major developer we can to make sure the Ryzen gaming experience only gets better. For now, I'd experiment with Battlefield 1's DirectX and detail settings to optimize the game. Offhand, I don't recall hearing about these issues with BF1 on Ryzen, so this might be a problem specific to your system. That said, there's a lot of data being tracked, and I apologize if it's a known issue that I can't recall at the moment.


fourseven: I live in Indonesia. Do you know when AMD plans to launch the Ryzen 5 in South East Asia, specifically in Indonesia?

DON WOLIGROSKI: Official on-shelf launch day is April 11th worldwide. I'm not sure if your specific country has any challenges that would prevent that, but that's the day we expect Ryzen 5 to be available on shelf.

cstephenson: When do you expect the Ryzen chips to be competitive with Intel in terms of gaming? The potential is certainly there!

DON WOLIGROSKI: I'd argue that Ryzen is already competitive. A lot of reviewers have been talking about outliers where we don't do as well and bringing a lot of focus to them. Those outliers are where we've focused our first round of developer engagements. Games like Ashes of the Singularity, Total War: Warhammer, and Dota 2 have already seen improvements.

With the new faster memory support, the average game-performance delta between Ryzen and Kaby Lake is a lot closer than you'd think over a wide swath of games at 1080p. And Ryzen can hit over 60 FPS in pretty much every game I've seen at 1080p, and often over 80 or even 120 FPS. It's never slow; it's just not the fastest. At 1440p, 4K, and in VR, the delta between Ryzen and Kaby Lake becomes insignificant.

Based on that, I think it's fair to say we're already quite competitive; we're just not the fastest at 1080p gaming. Saying Ryzen isn't a competitive gaming CPU because Kaby Lake is a bit faster is like saying the Ferrari 488 isn't a competitive sports car because the Bugatti Veyron is faster. It's a gross oversimplification.

Ditt44: Having just read that AMD has a new power plan available, is this something that we will see integrated with Ryzen after "Date X" or will users have to manually download and update?

DON WOLIGROSKI: For now it’s a manual download. Our long-term goal is to get it automatically updated in Windows, but I don't have a target date on that yet, sorry.

Tech_TTT: Are we expecting an AMD APU with onboard HBM2 memory, shared by both the system and the GPU with no DIMM slots, any time soon? What are your plans for very-low-voltage CPUs? Ryzen managed a good 65W TDP for 8 cores. Can we expect a 15W 4-core Ryzen APU to compete with Intel's low-voltage CPUs?

DON WOLIGROSKI: We're definitely considering different HBM implementations, but we haven't announced anything I can talk about. In a lot of ways, the Zen architecture gets more impressive as you give it less power. I can't comment on unannounced laptop parts, but there are great things coming!

Tech_TTT: Why did you choose to go dual channel memory and not quad or eight channels for the Ryzen? Why doesn’t AMD manufacture their own motherboards?

DON WOLIGROSKI: We decided to focus on what’s best for the market. Our goal is to have a platform that competes with low-end Intel boards all the way up to high-end Intel Extreme. After analyzing the benefits, the real-world advantage of quad-channel RAM doesn't outweigh the extra costs or trade-offs. The vast majority of users will never see the difference. Heck, the dual-channel 1800X can still beat the tar out of the quad-channel 6900K in many benchmarks. I think it was a good compromise for the vast majority of users. From an enthusiast perspective, it's always nice to have more, though, so I get it.
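The dual- versus quad-channel trade-off above comes down to theoretical peak bandwidth. Here's a minimal sketch, assuming DDR4 with a 64-bit (8-byte) bus per channel; the specific speed grade and platform names in the comments are illustrative, not AMD specifications:

```python
def peak_bandwidth_gbs(speed_mts, channels, bytes_per_channel=8):
    """Theoretical peak memory bandwidth in GB/s.

    speed_mts:         transfer rate in MT/s (e.g. 3200 for DDR4-3200)
    channels:          number of populated memory channels
    bytes_per_channel: bus width per channel (64-bit = 8 bytes for DDR4)
    """
    return speed_mts * 1e6 * channels * bytes_per_channel / 1e9

# Dual-channel DDR4-3200, as on a Ryzen desktop board
dual = peak_bandwidth_gbs(3200, channels=2)   # 51.2 GB/s
# Quad-channel DDR4-3200, as on an HEDT platform
quad = peak_bandwidth_gbs(3200, channels=4)   # 102.4 GB/s
print(dual, quad)
```

Quad channel doubles the theoretical ceiling, but as the answer above notes, few desktop workloads come close to saturating even the dual-channel figure, which is why the extra traces, pins, and board cost rarely pay off for mainstream users.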

AMD does not manufacture its own motherboards because, frankly, our partners do a better job and offer more differentiation and flavor than AMD would want to. We're happy to concentrate on the processors and leave the boards to the specialists.

Thanks again to everyone who participated! If you haven't yet, now is your final chance to enter our giveaway for the ASRock X370 Taichi AM4 Motherboard. New to the Tom's Hardware Community? Head to the forums and sign up to become a member of the largest enthusiast community on the planet.


MORE: Best CPUs


MORE: Intel & AMD Processor Hierarchy


MORE: All CPU Content

  • BugariaM
    Many people asked clear and technically interesting questions, hoping to get equally clear answers.

    Instead, they were answered by a person far removed from the technical and engineering side. He is a manager, a salesman. His job is talk, for the sake of even more talk.

    Thanks, of course, but alas, I found nothing interesting for myself here.
    Reply
  • genz
    I strongly disagree, BugariaM. All the info he could provide was provided, and he asked people actually close to the metal when he didn't know. You will not get trade secrets or future insights from ANY AMD or Intel rep on Tom's Hardware; it's far too public, and every drop of information here is also handed to Intel, Nvidia, and any other competitor hoping to blunt AMD's charge. What we did get is a positive outlook on AMD's products. Compared to what we already had from Tom's and other publishers, who have spent years watching Intel lead and thus lack faith (or simply got their jobs for their love of Intel), that was major.

    Personally, I'm surprised he didn't remind us that the current crop of 8-core consoles will inevitably force AMD's core-count advantage to eat all the competition Intel currently has. In five years, every single first-generation Ryzen processor will terrorize the Intel processors it competed with; Ryzen 5s will show 50% performance gains over Kaby Lake i7s, and so on.

    Intel knew this was the future; that's why all Intel consumer processors have stuck to 4 cores, to try to keep the programming focus on their IPC lead. Now that that lead is only 6% and the competition has more cores, we will see the same shift toward 6+ cores that we saw when Core 2 Duo made dual-core FX and Pentium chips viable mainstream gaming parts, and when Core 2 Quad and Nehalem made quad cores viable gaming chips.

    As the owner of a 3930K, you can read my past posts and see I have always said this was going to happen. Now, a month after launch, you're already seeing the updates come out. Wait till there are 12-threaded games on the market (this year, I expect) and you'll see just how much of the CPU industry's stalled progress was actually created by Intel's refusal to go over 4 cores in the mainstream.

    For all the talk of the expense of creating 6- and 12-core processors, Intel could have had consumer 8-core low-clock chips in the mainstream for prosumers and home-rendering types years ago, and they didn't. My theory is that they're scared of heavily threaded applications in the mainstream creating an opportunity for competitors to outmaneuver their new chips with slower, more numerous cores. It's not like a 2GHz 6- or 8-core in the mainstream was never an option.
    Reply
  • Calculatron
    I remember being really excited for the AMD AMA, but could not think of anything different from what everyone else was already asking.

    In retrospect, because hindsight is always 20/20, I wish I would have asked some questions about Excavator, since they still have some Bristol Ridge products coming out for the AM4 platform. Even though Zen is a new architecture, there were still some positive things that carried over from the Bulldozer family that had been learned through-out its process of evolution.
    Reply
  • Ernst01
    As a long time AMD Fan it is so cool AMD has more in the future for us.
    Reply
  • TJ Hooker
    "TDP is not electrical watts (power draw), it's thermal watts."

    Argh, this kind of annoys me. "Electrical watts" and "thermal watts" are the same thing here: power draw = heat generated for a CPU. There are reasons why TDP is not necessarily an accurate measure of power draw, but this isn't one of them.
    Reply
  • alextheblue
    Thank you Don!
    Reply
  • Tech_TTT
    19562297 said:
    Many people asked clear and technically interesting questions, hoping to get equally clear answers.

    Instead, they were answered by a person far removed from the technical and engineering side. He is a manager, a salesman. His job is talk, for the sake of even more talk.

    Thanks, of course, but alas, I found nothing interesting for myself here.

    I agree with you 100%. "Ask me anything" should include people from the R&D department, not only a salesperson. Or maybe a team of two, sales and research. Or, even better, the CEO included.

    Reply
  • Tech_TTT
    @Tomshardware : WE DEMAND APPLE AMA !!!
    Reply
  • genz
    19566458 said:
    "TDP is not electrical watts (power draw), it's thermal watts."Argh, this kind of annoys me. "Electrical watts" and "thermal watts" are the same thing here: power draw = heat generated for a CPU. There are reasons why TDP is not necessarily an accurate measure of power draw, but this isn't one of them.

    That is simply not true.

    Here's an example: 22nm and 18nm TDP is usually far higher than actual draw, because the die is so small that any cooling solution has a much smaller surface area to work with. Another example: when Intel moved the memory controller from the northbridge onto the CPU, the TDP of their chips went unchanged because, thermally speaking, the controller was far enough from the hot parts of the die to never contribute to thermal limitations, even though the chip's temperature rose much faster under overclocking because of the additional logic, and the chips themselves drew more power due to the extra components. A final example: I have a 130W-TDP chip that, without overvolting, simply cannot draw a watt over 90, even when running a power virus (which draws the maximum power the chip can draw, more than burn-in or SuperPi). The TDP rating is tied to the specific parts of the chip that run hot and how big they are, not to true power draw. That's why so many chips of the same binning share the same TDP despite running at different clocks and voltages.

    Further to that, TDP is rounded up to fixed numbers to make it easy to pick a cooler. True power draw naturally depends on how well a chip is binned, and poorly binned chips may still need extra voltage to run, so vendors usually add 10 to 20 watts of thermal headroom to make that possible.
    Reply
  • TJ Hooker
    @genz I never said TDP is equal to power draw, in fact I explicitly said there are reasons why it isn't. I simply said that "thermal watts" (heat being generated by the CPU) are equivalent to "electrical watts" (power being consumed by the CPU). At any given moment, the power being drawn by the CPU is equal to the heat being generated.

    I'll admit I'm sort of nitpicking a small part of the answer given in the AMA regarding TDP; I just felt the need to point it out because this is a misconception I see on a semi-regular basis.
    Reply
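TJ Hooker's point, that at steady state a CPU's electrical power draw equals its heat output while TDP is a separate cooler-sizing target, can be sketched in a few lines. The wattage figures below echo the hypothetical numbers in the thread; they are illustrative, not measurements:

```python
# Conservation of energy: a CPU does no mechanical work, so essentially all
# the electrical power it draws leaves the package as heat. "Electrical
# watts" and "thermal watts" are therefore the same quantity at steady state.

def heat_output_watts(power_draw_watts):
    # Heat generated equals power drawn (to a very good approximation).
    return power_draw_watts

tdp_watts = 130.0        # vendor's rated TDP: a cooler design target
power_draw_watts = 90.0  # hypothetical measured package power under load

heat = heat_output_watts(power_draw_watts)
print(f"heat output: {heat} W")  # matches the 90 W drawn, not the 130 W TDP
```

The distinction the thread settles on is that TDP can diverge from draw (rounding, binning headroom, hotspot geometry), but heat out can never diverge from power in.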