High resolution: Am I missing something here? What's the point of high res, low quality visuals?

wedouglas

Honorable
Oct 7, 2012
18
0
10,510
I'm sitting here watching Monsters University in 720p, and the visuals are about, oh I don't know, infinitely better than any video game ever made in 8K.

What is the point of using performance to increase resolution when the visuals themselves aren't even good enough to warrant it? Why are the pixels themselves more important than what the pixels are showing?

Doesn't make any sense to me. Am I missing something here, or is high resolution nothing more than a waste of performance?
 

revolution2718

Honorable
Apr 8, 2012
272
0
10,860
Resolution makes everything sharper. Have you tried watching standard-def TV on an HD monitor? It looks like garbage. And you can't compare a pre-rendered movie to games. Games are rendered in real time on the console or computer; that movie was rendered on computers long before you ever see it, so they can use the highest-quality visuals available.
 
Sep 22, 2013
482
0
10,810


There are really two things we're talking about here.

There is the resolution of the source and the resolution of the display. If the source (like your movie) is in low resolution, it will still look bad (or the same) on a high-res display.

Ideally, the resolution (which is really pixel density) from the source would match the display. So, you're right, the "what the pixels are showing" is as important as the pixels themselves. In other words, a high-res display won't make a low-res movie, game, etc. look any better. Changing the original video/image to a higher resolution won't solve this either; it's entirely dependent on how the source image was created in the first place.

Higher performance allows displays and graphics cards to handle higher-resolution sources, higher refresh rates (which are critical to smooth playback/animation) and wider color ranges. In terms of 3D animation, it also allows for a higher polygon count, which makes for more realistic surfaces and smoother edges on objects (fewer jagged lines).
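Roughly speaking, the raw pixel work scales with width x height x refresh rate. A quick back-of-the-envelope sketch (the resolutions and refresh rates below are just example numbers, not anything from this thread):

# Rough pixel-throughput comparison; illustrative numbers only.
# Pixels per second a display pipeline has to fill = width * height * refresh rate.
modes = {
    "1080p @ 60 Hz": (1920, 1080, 60),
    "1440p @ 60 Hz": (2560, 1440, 60),
    "4K @ 60 Hz": (3840, 2160, 60),
    "4K @ 120 Hz": (3840, 2160, 120),
}

baseline = 1920 * 1080 * 60  # 1080p @ 60 Hz as the reference point

for name, (w, h, hz) in modes.items():
    pps = w * h * hz
    print(f"{name}: {pps / 1e6:.0f} Mpix/s ({pps / baseline:.1f}x the 1080p/60 baseline)")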

So, in a nutshell, higher performance is necessary to keep up with the depth and speed of imagery as our ability to create better imagery, whether through animation, photography or film, increases. Our current capability in 3D animation and related technologies, as well as in film and photography, FAR outpaces the average user's ability to properly display the content we can create.
 

wedouglas

Honorable
Oct 7, 2012
18
0
10,510
revolution2718 said: "Resolution makes everything sharper. Have you tried watching standard-def TV on an HD monitor? It looks like garbage. And you can't compare a pre-rendered movie to games. Games are rendered in real time on the console or computer; that movie was rendered on computers long before you ever see it, so they can use the highest-quality visuals available."

What is better looking, Battlefield 3 at 720p or Battlefield 2 at 8K?

Why bother making things sharper and sharper if they don't look good yet in the first place? The comparison is valid because it's just a matter of processing power. I'm not saying modern graphics hardware could turn out anything close to a Pixar film, but rather that the amount of pixels is irrelevant to the quality of the visual, and that a lot of performance seems wasted on boosting pixel quantity rather than pixel quality.

1080p and even 720p look very good on 4K+ displays. Like most things, there are diminishing returns. Standard def surely looks bad on a 1080p display, but even 720p looks quite good on a massive, high-res display.

720p Avengers on my 1080p, 100-inch projection screen proves that 720p graphics can be a million times better than 4K Crysis 3, so why not use the extra performance of graphics cards to bring 720p Crysis up in visual quality before increasing it to a ridiculous resolution?

Seems dumb.
 
Sep 22, 2013
482
0
10,810


Again, you're missing the point. Your comparison is BF3 @ 720p vs. BF2 @ 8K. I don't know what 8K is supposed to be, since 4K has only just come out, but I'm assuming you mean it as a very high resolution.

This is irrelevant because the source image (BF2) wasn't even 720p when it came out. Boosting resolutions in displays is to support higher-res sources, that's it.

The point isn't to make lower-res stuff look better; that's not how it works.

You can hardly say there are diminishing returns when there is very little 4K content even available to make a comparison. Of course 1080p and 720p look fine on that display; the same way a DVD looks fine on my 1080p TV... but it doesn't look as good as something shot in *real* HD.

In regard to processing power, again, an irrelevant comparison. This is a chicken/egg scenario, except in this case, we now know which has to come first: the hardware. You can't create better, more detailed graphics on hardware that can't display those graphics; not even close. And you certainly can't play a game rendered at that level unless the hardware exists to support this.

And again, it comes back to the *source*. If I play a blu-ray with a movie that was shot in 1975 on my 1080p TV, it will probably look better than a VHS tape, but it won't look much better, if at all, than a DVD because it wasn't *SHOT* at 1080p resolution.

It's the same reason you can't take an 8x10 photo and blow it up to poster size and retain the resolution - because it simply isn't there.

Regarding Crysis example: the game has extremely large textures to support both higher resolutions and deeper detail in textures. Crysis has the texture quality available to have extreme detail but the hardware has to support displaying that texture level *at the specific resolution* - these things are all interdependent. In addition, without a certain high resolution and pixel depth, you wouldn't be able to tell the quality difference at a certain point anyway because your display wouldn't be able to properly display every pixel it's being fed.

To that same point, a graphics card can't make a game look better if the textures don't exist in the game already. So your suggestion to "focus on bringing 720p Crysis" up to better quality can't be done by the graphics card anyway, unless the developer has created the necessary textures to support whatever resolution you're viewing.

Last, the fact that your comparison is made with a projection screen and not a monitor rules out this entire argument.
 

Pete_the_Puma

Honorable
Mar 4, 2012
168
0
10,710
Go back to 480i (640x480) and it will take you about 3 seconds to understand...

All else being equal, higher resolution always looks better. I bought a 1440p screen and my 1080p is sitting in a corner gathering dust.
 


You are definitely missing something.

No PC could generate movie-quality graphics at a playable frame rate at any resolution. These movies are made on supercomputers with tens and hundreds of thousands of CPUs and GPUs working for months to generate those images. PC games are just not capable of those graphics.

Now, let's get down to reality here. Both image quality and resolution have diminishing returns. Going from high to very high settings makes very little difference to image quality in a lot of games. The more advanced games get, the smaller the improvements we see, while needing increasingly more power from the system.

We are at a point where, in many cases, you see a bigger improvement in image quality from higher pixel density than from a higher setting. They both improve what you see.

And if you have a more powerful system, you can use high settings and high resolutions. There is nothing that is preventing you from doing both.
 
You're comparing film (Avengers) and rendered cinematic (Pixar) films against games and wondering why they look so much better, even at lower resolutions.
You realize they are completely different, right? Film doesn't need rendering (other than encoding and such), so visual quality isn't determined by computer hardware but by whatever is actually in front of the camera. Pixar animations require more rendering power than a real-time game could ever demand (at least within the next 20 years). Think warehouses full of servers whose whole task is to render these films.
http://www.slashfilm.com/cool-stuff-a-look-at-pixar-and-lucasfilms-renderfarms/
Also, I'd bet Pixar films are rendered at an insane resolution, then downscaled for consumer use.

You're confusing visual quality (resolution and higher PPI) with aesthetics. Whether something looks good and its resolution are completely different things, and they needn't have any impact on each other. Pixel art looks good despite a deliberately low PPI, but BF2 won't look good even if blown up to 8K, because resolution is not what makes something look good (see also the earlier point about source vs. displayed resolution); it makes things look smoother, which matters if you're going for a realistic aesthetic.

EDIT:
You're also assuming that everyone is chasing ever-higher resolutions and thinks that's all that matters. Gamers have been balancing resolution, visual settings and FPS since Quake; it's not like the PC community at large thinks more pixels is better at all costs.
 

wedouglas

Honorable
Oct 7, 2012
18
0
10,510
Hmm, I feel like something is getting lost here. Let me try to clarify. This came into my head when I saw people complaining about the Xbox One running CoD at 720p while the PS4 would be 1080p, so I think it relates more to consoles due to hardware consistency, but anyway...

Say you have a graphics card capable of the following:

BF4 @ 2K @ 300 fps
BF4 @ 4K @ 100 fps
BF4 @ 8K @ 30 fps

All things being equal, people tend to play at the highest resolution they can in terms of acceptable FPS. FPS is based on everything from resolution, to lighting, to effects, to physics, etc. Pretty much everything going on in the game has some effect on FPS.
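For rough numbers behind that list (assuming "2K" means 2560x1440 here, since the term is fuzzy), each resolution step multiplies the pixel count:

# Pixel counts for the hypothetical BF4 example above.
# "2K" is assumed to be 2560x1440; the label is ambiguous.
resolutions = [
    ("2K (2560x1440)", 2560, 1440),
    ("4K (3840x2160)", 3840, 2160),
    ("8K (7680x4320)", 7680, 4320),
]

prev = None
for name, w, h in resolutions:
    pixels = w * h
    note = f" ({pixels / prev:.2f}x the previous step)" if prev else ""
    print(f"{name}: {pixels / 1e6:.1f} million pixels{note}")
    prev = pixels
# 2K -> 4K is 2.25x the pixels and 4K -> 8K is 4x, which is a big part of
# why the hypothetical frame rates above fall so sharply.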

My question stands: Why do we want to push the boundaries of resolution rather than the boundaries of everything else? Why aren't we targeting max settings on games for 30 fps @ 1080p and then working our way down from there? We know that 1080p offers more than enough resolution to have amazing looking images, so why aren't we trying to exploit 1080p first? Why are we creating games where high-end systems are only taxed by ridiculous resolutions?

Heck, if we know 720p is capable of photorealism, why aren't we trying to deliver 30-60 FPS @ 720p simply with better game assets and effects?

I understand the need for a certain resolution given our viewing habits -- big screens at 10 feet away require a decent PPI, and I agree that we need something like 720p or 1080p for those viewing conditions. I just don't understand the push for frame rates at ever-higher resolutions when we can't even fully exploit the lower ones. Seems pointless.
 
Because we can already achieve 30 FPS at 1080p on max settings. There is not much more room to improve at 1080p; when mid-range cards can all run games at high-to-max settings, there needs to be something higher than that to drive the industry. If AMD/Nvidia kept pumping out cards that are capped by the 1080p resolution, who would buy a new one?
I understand your idea of "make visual effects better instead of more pixels," but that's already happening; games get more intensive as time goes on regardless of the resolution they are played at. But it's diminishing returns: more complex graphics only matter so much when you don't have the pixels to express them.



Because it's more difficult to create a new Crysis than it was before. That game just couldn't be maxed out at 1080p (which was, I think, just becoming the PC gaming standard at the time) on the hardware available then, and that simply hasn't happened since.

[attached screenshot: yfyxx.jpg]


Why would they make it far more demanding for little benefit?
 

wedouglas

Honorable
Oct 7, 2012
18
0
10,510


How can you say that? Isn't this screenshot proof enough that 720p can accommodate far more impressive visuals than our latest games at even 4x the resolution?

1 million pixel CG visuals @ 24fps >> 8 million pixel game visuals @ 60fps

If we don't have enough performance to obtain amazing visuals at 720p, why on Earth are we trying to up the resolution? Why are we using processing power to add resolution instead of all the things that actually make the visuals good in the first place, like more complex lighting, particles, animation, physics, etc.?

[attached screenshot: 49hq.png]
 
Again, that render farm point comes into play. The hardware necessary to make something like that just isn't possible on a desktop. No amount of optimization and work will compensate for the fact that the hardware used for that far exceeds what's available on a desktop.

Take the movie Avatar for example, a single frame that included CGI could take several hours to render using four and a half thousand server-grade machines.
http://www.geek.com/chips/the-computing-power-that-created-avatar-1031232/

What you are proposing just isn't possible.
 

wedouglas

Honorable
Oct 7, 2012
18
0
10,510
I'm not actually proposing CG quality visuals. I'm proposing improving the visuals rather than boosting the resolution of current visuals. Resolution says nothing about the visual quality of the image.

If we can't display realistic visuals at 720p, why are we trying to display them at even higher resolutions? It's a waste of resources. It's like saying I want to see more blades of grass that look fake rather than fewer blades of grass that look photorealistic.

Obviously the trade off isn't so extreme, but it must exist, so what is the real trade off in outputting higher resolution vs high quality visuals?
 


Do you realize that game developers don't have to create games for a specific resolution? It is just a setting, and the game is rendered to fit whatever resolution you choose. They do not do anything special to accommodate different resolutions, other than adjusting to different aspect ratios.

What resolution you play at is entirely up to you and the vast majority choose 1080p these days.

Some are excited about 4K, as it does give you crisp text and pictures. This is especially nice on the desktop, but that doesn't mean there are a bunch of people using that resolution for games, and it doesn't mean devs change anything other than allowing the resolution to be rendered.

Gamers are the ones who have to strike the balance between resolution and settings. You may also be surprised at how little visual difference there is between high and very high settings, and dropping from very high to high is often all it takes to get the same FPS at 1440p as at 1080p.

These 4K monitors' biggest improvement is on the desktop. At the desktop, there is zero loss in quality; nothing has to be turned down for the same performance.

 


I don't think you realize how devs work. They already work hard to give the best visuals, and they don't care what the resolution is. Devs make no trade-off for resolution; they don't work in resolutions.

The PC that any given gamer is on is what determines the resolution and how many pixels are used. The devs don't care; it doesn't affect them one bit. They don't gain anything from limiting everyone to 720p; the visuals would be the same. Resolution only determines how much power the PC needs to achieve good FPS, and that varies from PC to PC. The users make the trade-offs now.

If you want to know what it would be like if every game was designed for 720p, simply set the resolution to 720p now.
 

wedouglas

Honorable
Oct 7, 2012
18
0
10,510
Devs must have target resolutions otherwise they could end up with a game that is unplayable for many people. Imagine if BF4's visuals were such that a top of the line card could only get a playable fps on low settings at 640x480. It would be a disaster. They create games based on what kind of hardware is out there.

I am getting the impression though that a lot of visual quality is being left on the table. I say this because I see a high-end AMD card pushing 40fps @ ~4K @ max quality.

Maybe I'm wrong, but I have to believe the visuals and experience would be significantly better if they were such that a high-end card could only get 40 fps @ high quality @ 720p, and those high-quality settings went well beyond what we think of as ultra today.

If you know there is processing headroom, why leave it just for increasing resolution?
 
Sep 22, 2013
482
0
10,810


Okay, I'm just going to say this flat-out because I don't think it's getting through to you: processing power is NOT just for higher resolutions.

It goes to the combination of a given resolution, the texture depth and the polygon count, with polygon count and texture depth being the most demanding.

Games are configured to display at a certain resolution so that the given objects on screen are proportional to the display they're being viewed on. This has NOTHING to do with the texture quality or polygon count, but it does affect the size of the polygons individually as well as the size of the textures (not QUALITY, size - as in proportion).

Developers most CERTAINLY must know the target resolution to develop the rules for the game engine to scale the textures and objects to the screen size, and this is very much so an aspect of the actual game development. Source: I'm a PM for a software development company.

The target resolution is neither a constraint nor a guideline for the polygon count or texture depth, but it is certainly a relevant factor and has an effect on the demand placed on a GPU, because it dictates how large the objects on screen are.

Overall resolution also determines maximum pixel depth. This allows for (or limits) higher-depth textures and a higher polygon count (a higher polygon count is needed, for the reasons in the picture example above, to achieve a realistic texture) and thus the overall quality of the picture.

To say that you could simply raise polygon count and texture depth at 720p and achieve the same quality as at 1080p is simply wrong for one specific reason: pixel count. If you have a texture that is 1,000 pixels wide but the screen only has 720, you can't fully display the texture; it will be compressed or distorted (this is the layman's version of how this works, but it's right, in a nutshell).
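As a toy illustration of that (not how a real renderer samples textures; GPUs filter and mipmap rather than drop texels outright), squeezing a 1,000-texel row onto 720 screen pixels means a lot of the texture's detail simply never reaches the screen:

# Toy example: mapping a 1000-texel-wide row onto 720 screen pixels
# with nearest-neighbour sampling. Real renderers filter instead,
# but the lost detail is the point being made above.
texture_width = 1000
screen_width = 720

sampled = {round(x * texture_width / screen_width) for x in range(screen_width)}

print(f"Texels that reach the screen: {len(sampled)} of {texture_width}")
print(f"Texels never shown: {texture_width - len(sampled)} (~{1 - len(sampled) / texture_width:.0%})")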

The idea that you can fully separate image quality and resolution ignores how displays and rendering work in the first place.
 


If there is a target resolution, it is 1080p. That is what most users have in their systems. There are games today that will push a 290X to its limits at 720p: Crysis 3, for example, Metro 2033 with PhysX and Advanced DoF on, Far Cry 3 maxed out, Metro LL, and so on.

But you can buy 2 of them in SLI/CF, and push higher resolutions. The problem is, most people do not want this. I'd have no issue, but most people do not like it. They whine and whine that their decent PC can't play at maxed settings, so the average dev does not push the limits of a PC.

But who cares, you are NOT getting Pixar quality out of a gaming PC. The difference in visuals you can push at 720p but not at 1080p is small. Much smaller than you realize. It is already up to each person to balance their resolution and visual quality settings and frankly, 1080p seems to be where most people are at.

You also may be shocked at how bad 720p can look when you are viewing it from 2 feet away or less. Particularly at the desktop.

You also may be shocked to know that many people do not consider 40 FPS to be a good experience. I personally get sick within 15 mins playing at that FPS in any 1st person view game. 60 FPS will allow me to play 30 mins before I start to feel a bit nauseated. I need 80+ FPS before I can play comfortably.

PC gaming is designed around the idea that the end user will optimize their own experience.

Console games already do what you want. They already render at 720p and run at 30 FPS most of the time. Are they giving you the spectacular experience you think they should? Probably not, because what you want is not possible on a PC or a console.
 
Sep 22, 2013
482
0
10,810


You're not wrong, but you're also not quite right about the point I made.

Quite specifically, when software is developed, the supported resolutions must be *specified* in order for the game engine to adapt the objects and textures to the given resolution with the result being that everything is properly proportioned.

In other words, the game must specifically support 1920x1080, 1920x1200, 3840x2160, etc.

The point is, the textures and how they appear is literally linked to the resolution because it HAS to be; the game engine needs to know precisely how to scale the polygon size and texture size in order to display the image at the selected resolution.

The point is the OP's suggestion that resolution doesn't matter ignores the fact that resolution dictates what can and can't be displayed and that raw processing power must support ALL of these: texture depth, resolution and polygon count in order to display a rendered image.

You can't have one without the other, period. You can't just scale to ridiculously detailed textures and massive polygon counts at 720p and see an equivalent image to 1080p, or for that matter, a BETTER image.

The result would be a scaled-down version of the texture, even if polygon count was the same.

Does that mean some existing titles couldn't benefit under these conditions when viewed at 720p? No, they certainly could.

But this change can't be made at the HARDWARE level - it has to exist in the SOFTWARE in the first place. Hence upgraded texture packs for games.

 


But don't most devs just create a table of textures for each supported resolution, or upscale and downscale those textures as needed? Every game I've ever played allowed just about any common resolution, except some of the rare ones, like 2560x1080.
 
Sep 22, 2013
482
0
10,810


Yes and no. Some textures are scalable to an extent, like the textures you'd see at 1920x1080 vs. 1920x1200. It's really a preference of the developer and how they'd like to address different resolutions. So some textures will be reused across multiple resolutions, but the developer still needs to identify which textures are appropriate for which resolutions.

So if I have a texture called "wood", I might have "wood_small", "wood_med" and "wood_large" (hah hah). But which one I use is going to depend on the specific conditions I've set. If the user selects 1920x1080 but sets texture detail to "medium", I might use "wood_small" or "wood_med", and only use "wood_large" on high and ultra settings. Either way, I have to tell the engine when to use which texture.

They may also allow custom resolutions, in which case a *range* of resolutions and low/med/high/ultra combinations would be mapped to textures. Essentially an if/then scenario: "if resolution >= 1920x1080 and resolution < 2048x1536 and texture_detail < high, then use wood_med".
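In rough Python-style pseudocode, using the hypothetical "wood" texture names and thresholds from this example (not any real engine's API), the lookup amounts to something like:

# Hypothetical texture selection by resolution and detail setting,
# following the "wood" example above; no real engine is being quoted here.
def pick_wood_texture(width, height, texture_detail):
    pixels = width * height
    if texture_detail in ("high", "ultra") and pixels >= 1920 * 1080:
        return "wood_large"
    if pixels >= 1920 * 1080:   # 1080p-class resolutions at medium detail or below
        return "wood_med"
    return "wood_small"         # anything smaller than 1080p

print(pick_wood_texture(1920, 1080, "medium"))  # wood_med
print(pick_wood_texture(3840, 2160, "ultra"))   # wood_large
print(pick_wood_texture(1280, 720, "high"))     # wood_small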

Hopefully that makes sense!
 


I thought that was what I said, without going into details.
 

wedouglas

Honorable
Oct 7, 2012
18
0
10,510


Thanks for the more in-depth response. Appreciate it.