Nvidia does not have plans to bring its ray tracing-enabled GPU architectures to smartphones or other ultra-mobile devices right now, CEO Jensen Huang told journalists at a Computex meeting this week. The statements come just days after AMD confirmed that upcoming Samsung smartphones using AMD RDNA2 GPU architecture will support ray tracing.
According to Huang, the time for ray tracing in mobile gadgets hasn't arrived yet.
"Ray tracing games are quite large, to be honest," Huang said, according to ZDNet. "The data set is quite large, and there will be a time for it. When the time is right we might consider it."
AMD, meanwhile, has licensed its ray tracing-capable RDNA2 architecture to Samsung for use in the upcoming Exynos 2200 SoC, which is expected to power Samsung's flagship mobile devices and laptops. AMD CEO Dr. Lisa Su confirmed this week that the SoC will indeed support ray tracing.
"The next place you’ll find RDNA2 will be the high-performance mobile phone market," Su said, as reported by AnandTech. "AMD has partnered with industry leader Samsung to accelerate graphics innovation in the mobile market, and we’re happy to announce we will bring custom graphics IP to Samsung’s next flagship SoC, with ray tracing and variable rate shading capabilities. We’re really looking forward to Samsung providing more details later this year."
Currently, Samsung's Exynos-powered smartphones use Arm's Mali GPUs, whereas Qualcomm Snapdragon-based handsets use Adreno GPUs.
Nvidia is in the process of acquiring Arm, which develops general-purpose Cortex CPU cores as well as Mali GPUs for various systems-on-chip (SoCs). Nvidia has long tried to license its GeForce technologies to designers of mobile SoCs and devices, without any tangible success. If regulators approve Nvidia's acquisition of Arm, the company will be able to offer its latest GeForce architectures to Arm licensees. Yet it appears Nvidia has no immediate plans to bring GeForce RTX to smartphones.
Nvidia's Ampere and Turing architectures seem to be too bulky for smartphone SoCs (and even for entry-level PC graphics) anyway. For now, the company will have to use its GeForce Now game streaming service to address demanding gamers on smartphones and tablets.
"That's how we would like to reach Android devices, Chrome devices, iOS devices, MacOS devices, Linux devices -- all kinds of devices, whether it's on TV, or mobile device or PC," said Huang. "I think that for us, right now, that is the best strategy."
Yet ray tracing is nothing new on mobile. Imagination Technologies architectures have supported ray tracing since the PowerVR GR6500, introduced in 2014, so it's up to hardware designers to implement the capability and game developers to leverage it. Imagination's PowerVR ray tracing implementation is currently supported by Unreal Engine 4 and Unity 5, though it's unclear whether it's primarily used for eye candy, performance gains, or power savings.
Maybe something good will come out of the cooperation with Samsung and they can deliver a chip that really supports at least a little ray tracing on mobile devices, not just a marketing bullet on a spec sheet. Nevertheless, the power envelope on mobile devices is quite tight... let's wait and see...
Besides, Imagination's PowerVR beat both AMD and Nvidia to the punch years ago.
AMD isn't saying phones are ready for ray tracing; they're saying the hardware Samsung is licensing is capable of it. That's different.
Also, AMD has been massively limited by Nvidia. Short memories... Years ago (around 15 years back) Nvidia cheated in benchmarking software: the drivers would detect the benchmark and overclock the GPU, ignoring thermal throttling (to a point), just so they would always win. That boosted Nvidia's sales, which hurt ATI, which in turn limited ATI's R&D budget.
Then they got caught, and not long after we saw the birth of GameWorks: code so closed-source that Nvidia didn't even let game devs see it, which meant they couldn't optimize for it. Meanwhile AMD was releasing effects and open-sourcing them so anyone could improve them, which is why AMD's hair solution looks better than Nvidia's HairWorks (by a significant margin) and runs 12-20% better on average.
AMD's driver team would also rewrite GameWorks shader code: they would intercept something like HBAO+ and replace it with their own identical but better-running implementation. That's where the "fine wine" effect came from. It usually took a few weeks for the updated drivers, but the games would get a good performance leap and often ran better than they did with GameWorks on an Nvidia GPU of similar power.
AMD still has a fraction of the R&D budget of both Intel and Nvidia, but despite that they are absolutely crushing it in the CPU space and catching up to Nvidia in the GPU space VERY quickly. I wouldn't be shocked to see them on par or better when RDNA3 is released.
That's why Nvidia decided to gimp the Ampere generation with piss-poor VRAM capacities, so you have to upgrade by the time RT is actually something you can run! THE NERVE!
Also, Jensen is salty that he can't put an Nvidia GPU in a phone and that their only real client is Nintendo. But hey, I'm sure as soon as Nintendo wants a bit more eye candy in the Switch, Jensen will be all over that cake, singing the praises of RT on mobile.
And for the record, I don't disagree with the premise: RT on mobile devices, unless they're using low-resolution sampling, is kind of overkill for the hardware's capabilities. I'd love for AMD to prove me wrong and make Jensen eat his words, though.