Chris: Exciting, okay cool. Obviously you have a direct line to these hardware vendors. What do you want to see in the next generation of GPUs that’ll make your job easier?
Johan: That's a fun question. We have a pretty long list of things we typically go through and talk to them about, but one very concrete thing we'd like to see, and Intel has actually already done this on their hardware, is what they call PixelSync, which is their method of synchronizing the graphics pipeline very efficiently on a per-pixel basis. You can do a lot of cool techniques with it, such as order-independent transparency for hair rendering or foliage rendering. And they can do programmable blending, where you have full control over the blending instead of using the fixed-function units in the GPU. There are a lot of cool techniques that can be enabled by a programmability primitive like that, and I would like to see AMD and Nvidia implement something similar as well. It's also very power-efficient and efficient overall on Intel's hardware, so I guess the challenge for Nvidia and AMD would be whether they can do it efficiently, because they have quite different architectures. So that's one thing. What other items do we have? Usually when the architects are over we have these meetings of just sitting and talking for 14, 15 hours, or an entire day, about everything.
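The order-independent transparency Johan mentions can be sketched on the CPU: with PixelSync-style per-pixel synchronization, fragments arriving in any order can be collected per pixel, sorted by depth, and blended back to front under program control. A minimal illustration; all fragment values here are hypothetical:

```python
# CPU sketch of order-independent transparency: collect all transparent
# fragments covering a pixel, sort by depth, blend back-to-front.
# On the GPU, PixelSync-style per-pixel ordering is what makes the
# read-modify-write of such a per-pixel list race-free.

def resolve_pixel(fragments, background=(0.0, 0.0, 0.0)):
    """fragments: iterable of (depth, (r, g, b), alpha) in arbitrary order."""
    color = background
    # Farthest fragment first, so each nearer layer covers what is behind it.
    for depth, rgb, a in sorted(fragments, key=lambda f: f[0], reverse=True):
        color = tuple(a * c + (1.0 - a) * bg for c, bg in zip(rgb, color))
    return color

# Two transparent layers submitted out of order: near red glass, far blue glass.
frags = [(1.0, (1.0, 0.0, 0.0), 0.5),   # near layer
         (5.0, (0.0, 0.0, 1.0), 0.5)]   # far layer
print(resolve_pixel(frags))                    # (0.5, 0.0, 0.25)
print(resolve_pixel(list(reversed(frags))))    # same result, order-independent
```

Submitting the fragments in either order produces the same color, which is exactly the property the fixed-function blend path cannot give you without pre-sorting geometry on the CPU.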

Chris: That'd be a fun conversation to sit in on.
Johan: Yeah, it's really fun. One thing we want to enable, and I mentioned this a bit during my talk about Mantle last week, is letting the GPU execute in a bit more of a heterogeneous fashion: being able to run multiple compute shaders in parallel with your graphics work, and ideally having more collaboration between the CPU and GPU. We can do things like that on the consoles because they're integrated machines, with the CPU and GPU on the same die. On the PC you are seeing it more and more with the APUs and with Intel's Ultrabooks that also have integrated graphics.
I want to see more of this type of collaboration between CPU and GPU to drive many more advanced rendering techniques. For example, once we've rendered the Z-buffer for a scene we know the depth of every single pixel in our frustum, and based on that information we can do things like shadow maps that are adapted to cover only the area they actually need to. Typically you don't have that knowledge: on the CPU you prepare data that the GPU will render a few frames later, so you have to brute-force a lot of things. You have to send out a lot of work and you can't really be reactive. With many of the things we can do with Mantle, and I think going forward with closer CPU and GPU interaction in general, we can use a lot more clever techniques and fewer brute-force techniques. That's a pretty frequent topic when we talk with the architects.
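The depth-driven shadow map idea Johan describes can be sketched as a tiny compute-style pass: once the Z-buffer exists, find the depth range actually visible and fit the shadow map only to that slice, instead of brute-forcing a guessed range. A minimal sketch with hypothetical depth values:

```python
# Sketch of fitting a shadow map to the visible depth range: scan the
# rendered Z-buffer for the nearest and farthest shaded pixels, skipping
# sky pixels at the far plane, and shadow only that slice of the frustum.
# Depth values below are hypothetical view-space depths.

def fit_shadow_range(depth_buffer, far_plane):
    """Return (near, far) bounds of geometry that actually needs shadowing."""
    visible = [d for row in depth_buffer for d in row if d < far_plane]
    if not visible:
        return None  # nothing but sky: no shadow map needed
    return min(visible), max(visible)

zbuf = [[3.2, 4.1, 100.0],
        [3.0, 5.5, 100.0],
        [3.8, 4.9, 6.2]]
print(fit_shadow_range(zbuf, far_plane=100.0))  # (3.0, 6.2)
```

The point of doing this on the GPU, as Johan notes, is reacting within the same frame: a CPU preparing shadow frusta a few frames ahead has to cover the worst case instead of the measured (3.0, 6.2) slice.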

Chris: Sure, so I also want to know about features that'll make the biggest difference to realism, but in my previous question I was talking about features that'd make your job easier. So as a follow-up to that one, are there different features you want to see that'll improve the experience an end-user has when they play your games from the perspective of realism?
Johan: Yeah, so realism. I think there are a few things; well, I guess this goes in both categories. Another thing I haven't mentioned yet is that Nvidia has been doing a lot of good work with nested data parallelism, or dynamic parallelism as I think they call it, in their big Kepler cores, where you can run compute work that is nested and can interact in very interesting ways. That enables a lot of other programmability mechanisms, with nice performance.
For realism specifically, we have some challenges in general going forward, because there are so many rendering techniques that we implement through just standard rasterization and post-processes. These things will start to break down more and more as scenes get more complex and we want more transparent surfaces in them. Doing standard rasterization and then trying to do depth of field and motion blur correctly on top of it as post-processes is very, very limited. Let's say you have transparency: two or three layers of windows together with some particle effects. Then you want to do depth of field on that scene afterwards, but you only have your depth buffer. It doesn't know about these transparent surfaces, or if it knows about them it doesn't know what's behind them.
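The limitation Johan describes can be shown in a few lines: post-process depth of field derives its blur radius from the depth buffer, but a transparent window typically doesn't write depth, so the pass computes blur for whatever opaque surface lies behind the glass. A minimal sketch with hypothetical values:

```python
# Why post-process depth of field breaks on transparency: the blur radius
# (circle of confusion) is derived from the depth buffer, but a transparent
# window usually doesn't write depth, so the pass sees only the opaque
# surface behind it. All distances are hypothetical.

def circle_of_confusion(depth, focus_depth, aperture=0.1):
    """Blur radius grows with distance from the focal plane."""
    return aperture * abs(depth - focus_depth) / max(depth, 1e-6)

focus = 10.0          # camera focused 10 units away
window_depth = 2.0    # transparent glass close to the camera
wall_depth = 10.0     # opaque, in-focus wall behind the glass

stored_depth = wall_depth  # only the wall made it into the depth buffer

coc_needed = circle_of_confusion(window_depth, focus)   # blur the glass needs
coc_applied = circle_of_confusion(stored_depth, focus)  # blur the pass applies

print(coc_needed)   # 0.4  (glass is far out of focus)
print(coc_applied)  # 0.0  (pixel is left sharp, so the glass looks wrong)
```

The out-of-focus glass should be heavily blurred, but the post-process leaves the pixel sharp because the only depth it can read is the in-focus wall's.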
I think a challenge there is figuring out what a good, efficient future graphics pipeline looks like, both for mobile, which may have different constraints, and for desktop, because the rasterization pipeline is really quite efficient but has its limitations. There are various alternatives, such as micro-triangle or micro-polygon rasterizers, or stochastic rasterization, where depth of field and motion blur can be more of an integral part of your rendering. This of course has a lot of other potential drawbacks and difficulties in how these things interact, but you get to a point where more of these techniques can interact freely. I think that can really bring on a lot more extra realism. At least I'm talking purely about what the GPU vendors and we can do together on that.
There's a lot of stuff we can do just within our engine as well, and that we are doing going forward. Things like more physically based rendering and shading, where we try to use real-life measurements of light sources and materials and represent those accurately within the game. Typically, in previous games and engines, you look at the reference and then try to recreate it, but you don't really measure, you don't know the ranges; there's nothing to truly compare it with. With the type of games we're doing now, with very big, complex content and levels of gameplay, it gets more important to have that reference: a frame of reference of something that's real that we try to recreate.
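The measured-reference approach can be illustrated with a toy photometric calculation: specify the light in real units (candela) and the material by a measured albedo, so the computed luminance can be checked against the real scene. A hedged sketch using hypothetical measurements and a textbook Lambertian model, not Frostbite's actual shading code:

```python
# Toy "measured reference" shading: light in candela, material as measured
# albedo, simple Lambertian surface. Because the inputs are real units, the
# output luminance (cd/m^2) can be compared against the actual scene.
# The bulb intensity and albedo are hypothetical measurements.
import math

def lambert_luminance(intensity_cd, distance_m, albedo, cos_theta):
    """Illuminance E = I * cos(theta) / d^2 (lux);
    Lambertian luminance L = albedo * E / pi (cd/m^2)."""
    illuminance = intensity_cd * max(cos_theta, 0.0) / (distance_m ** 2)
    return albedo * illuminance / math.pi

# A ~100 cd bulb 2 m from matte plaster (albedo ~0.6), light hitting head-on.
print(round(lambert_luminance(100.0, 2.0, 0.6, 1.0), 3))  # 4.775
```

With ad-hoc tuned values there is no such check; with measured units, a luminance meter pointed at the real wall gives a number the renderer can be validated against.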
- Chris Angelini And Johan Andersson Talk Battlefield 4 And Frostbite 3
Come on AMD! Bring it already!!!! Steam Machines are around the corner!
-Your game is good in this?
-Yes, our game is good in this because [...].
Just the phrasing and the form changes.
At least the users on tomshardware can still offer some solid information. And then people don't understand why 80% of the readers automatically jump to the user comments before reading the full article.
Good lord! The PS4 can't even run Battlefield 4, a launch title, at 1080p? Where will these consoles be in 5 years time?!?!
Maybe spending $500-600 on a CPU plus a reliable Asus Sabertooth X79 with a 5-year warranty isn't a bad investment. lol
It's still early in the console cycle and devs will need time to fully unlock them. With the semi-heterogeneous architectures of the new consoles it's going to be a steep learning curve to figure out the best ways to utilize GPU compute power. They talk about having the CPU at 95% utilization, but just because it's busy doesn't mean it's efficient. Busy is easy to achieve; efficiency is not. There's a lot of room to grow yet.
That plus I don't think most exclusive console gamers are really worried about 1080p (other than a random number in a vacuum that seems bigger than other random numbers). If they were truly worried about resolution they'd be on a PC. The games are still prettier than a 360/PS3 so they'll be happy in the end.
Frostbite is probably the most demanding engine out there... check if your PC can run BF4 at a minimum of 60 FPS on high details @ 1080p. There are many that can't.
Next-gen consoles are specced like a mediocre gaming PC of today. What do you expect?
In the near future programmers will be able to squeeze a bit more juice out of them because of the more unified and exposed hardware, but that's all.
Performance and quality are on the PC.
As for the PS4 not doing 1080p, so what? How much do you have to spend on a PC to run BF4 at 1080p on ultra? $1500+ compared to $400.
Go get a PS3/Xbox 360 emulator and try to run it on your PC; watch your system be brought to its knees.
The console is a processing monster capable of scaling its hardware to the point that it gives guaranteed, fixed, smooth performance. I have a 2400 and a GTX 670 DCUII, and BF4 has CPU spikes as high as 110 ms despite rendering around 60 FPS on the ultra preset. The end result is stutter and lag, which you will never see on a console, even on the biggest home entertainment systems today.
A guaranteed 30 FPS with no latency on any TV setup is far better than my friend's IB-E and 780 Ti micro-stuttering system with serious latency spikes.
Mate, it is actually far less than $1500... try $499; BUT there's a caveat: only possible once Mantle is out AND available on Linux AND the game has a Linux/SteamMachine port; otherwise a Windows reinstall on the SteamMachine, with BF4 running on Mantle.
But it is still all theory, no proof! Here's to hoping!
http://rootgamer.com/8383/tool/steamos/steam-machine-amd-hardware-cost-499
$499 for a hypothetical machine that doesn't say anywhere that it will run BF4 maxed out, or that BF4 would even be ported to it. This article is about BF4, the comments were about BF4, and you post a Steam Machine page.