Do PC part makers in general always take prices and the market into account when designing a new part?

Rafael Mestdag

For example, who's to say that Windows doesn't consume a lot more RAM than it normally should or could, if it weren't for a possible lobby with the RAM makers?

And another example comes from the PC gaming industry. It's natural that, over time, games become more and more advanced and hence demand more from the PC and its parts, but who's to say that some game makers don't exaggerate (Mafia 3?) and overdevelop their titles in order to establish a market lobby with the video card makers, for example (not to mention RAM and CPU makers)?

All this, if you think about it, makes for more top-shelf PC parts being sold, and the market keeps growing.
 
Solution

user11464

Of course. That is how business works. The PC/gaming industry isn't immune to it. If a company manages to undercut the industry and provide a product that crushes the competition... the competition will buy them out. lol
 
A GAME DEVELOPER normally looks at the current computer hardware to determine how they should make the game to maximize PROFIT.

Game issues happen, and some games are just too demanding even on low settings, but that's a game-developer issue and not common.

Most of these improvements are just incremental. NVidia comes out with a faster GPU, so the high end shifts, and so does the low end (on average) of the performance people actually have.

NVidia (and AMD) design to maximize profit, so that's what they aim at. If you can sell a LOT of cheaper cards, you can make more profit than from far fewer expensive ones, but then there are BRAGGING rights for having the best GPU and so on, so it's not simple.

But...
Overdeveloping a game just to make a DEAL with hardware manufacturers? I don't really see that. The developers would probably lose profit (aside from "Crysis"-like games that people buy on hype even if they can't run them well).

The hardware manufacturer is going to develop its hardware no matter what. And what "deal" would there be? That suggests a transfer of money. Does NVidia pay a game developer to make a game that ONLY its top cards can handle?

It happens that we have games like this, but I'd assign that mostly to internal problems with the game team (or higher up the chain).

CONSOLE:
It's a shame that VISUALS are still the focus and not a smooth gaming experience. Heck, we're looking at 4K (upscaled) with constant drops below 30FPS. WTF?

In this case the best console (for the budget) gets made, then game developers decide what gives them the best profit. Unfortunately it may be the CONSUMER that's a big part of the problem: we look at flashy game trailers and say "that looks AWESOME!", and so the cycle continues.

I'd really like to see 1080p/60FPS as the minimum. It's perfectly feasible right now if you drop the graphics quality. I think we're one console generation away from that: the PS5, or whatever it's called, which I expect to have a 6C/12T or 8C/16T Ryzen processor and a 2x 6-TFLOP GPU setup (the PS4 PRO effectively has two GPUs already).
 
There is always a need for more performance and better products in every industry, not just computers. Companies have to keep up to compete, or in theory try to outperform their rivals so they can take market share. The best products coming out of their R&D will sit at the top of the lineup, and then only the price has to be figured out against current prices and the competition.

The key to success in business is to cater to the majority/larger audience if you want to make money. The part where you say this leads to more top-shelf parts being sold doesn't really make sense: if they make software that needs top-of-the-line hardware, people simply won't buy it, since they can't use it.

With games, there are always more advanced options that require more performance than current hardware is capable of. Developers want to incorporate more of these, which goes back to trying to outperform the competition. You just get the option to turn them off, so the game can still cater to the majority.

Software is designed around current hardware capabilities. Windows could probably use less RAM, but most PCs have plenty, so why waste time and resources fixing something that isn't broken? We also see things like Microsoft splitting svchost processes in the next Windows 10 release to make things easier for end users, even though they themselves mention that running individual processes uses more resources. If you take a look at Windows, there's a lot of unused legacy code in there, and that causes extra resource use and inefficiencies.
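As an aside on that svchost change: the splitting is reportedly gated on how much RAM the machine has. Here's a minimal Python sketch (Windows-only) that reads the commonly documented SvcHostSplitThresholdInKB registry value; the path and value name are as reported for Windows 10 1703+, so verify them on your own build before relying on this.

```
# Windows-only sketch: read the RAM threshold Windows 10 reportedly uses to
# decide whether each service gets its own svchost.exe process. The path and
# value name (SvcHostSplitThresholdInKB) are as commonly documented; verify
# them on your own build.
import winreg

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                    r"SYSTEM\CurrentControlSet\Control") as key:
    threshold_kb, _ = winreg.QueryValueEx(key, "SvcHostSplitThresholdInKB")

# Systems with more RAM than this threshold run services in separate processes
# (easier to inspect, somewhat higher memory use); below it, services are grouped.
print(f"svchost split threshold: {threshold_kb / (1024 * 1024):.1f} GB of RAM")
```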
 

Rafael Mestdag

Take two extremes of the same market (PC gaming), for example: Mafia 3 and Max Payne 3. The first was clearly overdeveloped at launch; patches then started coming out to make it better for the consumer, although Mafia 3 remains extremely demanding on the system.

On the other end of the scale there's Max Payne 3, with DirectX 11 graphics that are extremely well made even by today's standards, yet it is so well optimized that even an old dual core like a 2008 AMD X2 can run it reasonably well, with satisfying graphics (with a reasonably cheap card, at least).

Max Payne 3 and Mafia 3, despite the age difference between them, are on completely opposite ends of the same scale. Why is that? Why didn't Mafia 3's developers make it as well optimized as Mafia 2, or at least closer to it, so more people could enjoy it?
 


I would suggest development problems. Sometimes the team is told to push the game out by a specific date. I'm frankly baffled by the number of poorly optimized games.

If the MINIMUM SPEC machine for a game can't play the game enjoyably then something is wrong.

I think the software has also gotten more complex and difficult to manage (including video drivers). With DX12 and Vulkan (slowly) filtering in, I think we'll start to see games get increasingly reliable.

In fact, I think gaming stability being baked into the game engine will keep getting so good that game developers can focus on game CREATION rather than stability. Like, when I draw a picture I don't want to have to make my own pencil, then worry about whether the graphics will blow off the picture if the light from China hits it while someone is eating toothpaste.

For example, the game could automatically adjust variables (resolution, physics, etc.) to meet a 60FPS target without us tweaking anything. CPU bottlenecks will diminish as game code gets better threaded. The XBOX ONE is essentially a W10 PC, so cross-platform titles should get easier to do.
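To make that idea concrete, here's a minimal Python sketch of a frame-time feedback loop that nudges a resolution scale toward a 60FPS budget. The render_frame() function and the scale variable are hypothetical stand-ins, not any real engine's API; dynamic-resolution systems in shipping games are far more sophisticated than this.

```
# Rough sketch (hypothetical names, no real engine API): a frame-time feedback
# loop that adjusts a resolution scale to chase a 60 FPS budget.
import time

TARGET_FRAME_TIME = 1.0 / 60.0   # ~16.7 ms budget per frame
scale = 1.0                      # 1.0 = native resolution, 0.5 = half resolution

def render_frame(resolution_scale):
    """Stand-in for the engine's render call: pretend cost scales with resolution."""
    time.sleep(0.010 + 0.010 * resolution_scale)  # fake 10-20 ms of GPU work

for frame in range(300):                           # simulate ~5 seconds of frames
    start = time.perf_counter()
    render_frame(scale)
    frame_time = time.perf_counter() - start

    if frame_time > TARGET_FRAME_TIME * 1.05:      # over budget: render fewer pixels
        scale = max(0.5, scale - 0.05)
    elif frame_time < TARGET_FRAME_TIME * 0.90:    # headroom: claw quality back
        scale = min(1.0, scale + 0.05)

print(f"Settled at resolution scale {scale:.2f}")
```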
 
Solution
What he said. This has nothing to do with the hardware industry; it's an issue of game development. The higher-ups push things and ruin games, and it screws everyone but themselves as long as they profit. Gaming is like any other industry: it's all business.