Waiting for Broadwell/Skylake before upgrading is pointless

Extreme6800

Based on what Intel has already confirmed for Broadwell and the variants announced for Skylake, it seems pointless to wait before upgrading a desktop system, because:

1. It seems Intel will keep the same core counts for Skylake (2 cores for the Core i3, 4 cores for everything else, and 6/8 cores for the Extreme Editions). This is extremely disappointing. Even if Intel realized how silly it is to still be selling dual-core CPUs for $180 in 2016 and increased the core counts on the i3/i5 models, it would have to play around with cutting/adding meaningless features like Hyper-Threading, SIMD instructions and Quick Sync to keep the lines apart; performance between two different lines of CPUs would end up too close, and Intel would risk Haswell beating Skylake in optimized benchmarks.

2. It seems that they will reuse Broadwell's integrated GPU (the HD 5000/6000 series) for Skylake to avoid risking compatibility and cost on the same 14nm process. It is already known that these are only a very minor improvement over the existing Haswell parts (just 2/4/8 extra EUs, which is very little). This means performance will still be unplayable in any game released in the past 2 years, even at absolute minimum settings, and it won't beat the ill-fated AMD APUs in any price segment.

3. Manufacturing of Haswell and Broadwell CPUs will stop long before Broadwell and Skylake reach the market, to keep the current prices of outdated parts from falling (Intel has done this before, e.g. Sandy Bridge to Ivy Bridge). Waiting for prices to drop when new parts launch is a thing of the past, ever since Sandy Bridge shortened the LGA socket life cycles; now you get essentially the same CPU every 2 years with a 100 MHz clock boost and 3% more performance per clock, at the same price as 5 years ago, sold on the argument that it uses less power and is "next-gen".
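
Taking those numbers at face value, here's a back-of-envelope sketch of what that adds up to (the ~3.5 GHz base clock is just an assumption for illustration):

```python
# Generational uplift implied by "+100 MHz and +3% per clock" on a 3.5 GHz part.
base_ghz, bump_ghz, ipc_gain = 3.5, 0.1, 0.03
uplift = (base_ghz + bump_ghz) / base_ghz * (1 + ipc_gain) - 1
print(f"~{uplift:.1%} per 2-year generation")  # ~5.9%
```

Roughly 6% every two years, which is exactly why there's no point waiting.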

4. DDR4 memory will not bring tangible performance improvements. There is a reason Intel limits many of its mainstream CPUs to DDR3-1600: that is already more than enough bandwidth for most programs, and Intel's architecture is less sensitive to the memory subsystem. Combine that with the fact that many of the first DDR4 modules will effectively be DDR3 chips with increased latency, and the performance improvements are negligible.
At least there is a chance that DDR3 prices will fall; that is the only good news so far. DDR4 modules were designed to scale to very large sizes, so a jump in module capacities may come, but for most people who don't run memory-intensive applications this is again meaningless.
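
For a rough sense of the bandwidth numbers (a back-of-envelope sketch, assuming standard dual-channel setups with 64-bit channels, and DDR4-2133 as the likely launch speed):

```python
# Theoretical peak bandwidth: transfers/s x bus width (bytes) x channels.
def peak_bandwidth_gb_s(mega_transfers: int, channels: int = 2) -> float:
    return mega_transfers * 1e6 * 8 * channels / 1e9  # 64-bit channel = 8 bytes

print(peak_bandwidth_gb_s(1600))  # DDR3-1600: 25.6 GB/s
print(peak_bandwidth_gb_s(2133))  # DDR4-2133: ~34.1 GB/s, ~33% more on paper
```

And that is peak, on-paper bandwidth; most desktop workloads don't come close to saturating even the DDR3 figure.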

5. The new chipsets bring more PCIe 3.0 lanes, and PCIe 4.0 on overpriced motherboards. Meaningless: extra PCIe bandwidth only makes a small difference in rare, extreme situations where the GPU's onboard memory is not enough, and budget motherboards will not feature it. We will also get extra USB ports to drive motherboard costs up, as if the excessive number of ports on current boards wasn't enough.
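
To put the existing headroom in numbers (a sketch using the per-lane rates and encodings from the PCIe 2.0/3.0 specs):

```python
# Per-direction bandwidth of a PCIe x16 link.
# Gen 2: 5 GT/s with 8b/10b encoding; Gen 3: 8 GT/s with 128b/130b encoding.
def x16_bandwidth_gb_s(gt_per_s: float, encoding_efficiency: float) -> float:
    return gt_per_s * encoding_efficiency * 16 / 8  # 16 lanes, 8 bits per byte

print(x16_bandwidth_gb_s(5.0, 8 / 10))     # Gen 2 x16: 8.0 GB/s
print(x16_bandwidth_gb_s(8.0, 128 / 130))  # Gen 3 x16: ~15.75 GB/s
```

Gen 3 x16 already roughly doubles Gen 2, and benchmarks rarely show GPUs gaining much even from that jump.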

6. There is a chance of prices rising compared to the corresponding existing models, since it seems that AMD is out of the desktop market (maybe even gone for good?) and there is no competition. This could be very harmful for the entry-level budget segment, and it reinforces the fact that Intel does not need to increase core counts on the i3 and lower segments: it can keep "laser-cutting" the caches of perfectly good working CPUs and changing microcode to turn i3s into Celerons and Pentiums, and still profit from it. Also, there will be no Pentium-K this time, and the Celeron's 20th anniversary will have to wait until 2018.

Some of these points might not be 100% accurate, but it is clear that the CPU (and GPU) market is settling into a sad reality where power consumption is valued above everything else and performance/features barely improve across several generations, not to mention that soon we will run out of brand choices and pricing can become arbitrary.
(Sorry for the long post.)

Discuss.


 
There have been rumors of CPUs moving to some technology other than silicon for their foundation. Most likely true, though when is unknown. I have a feeling they're both up against a wall (AMD and Intel). Aside from games, there's really nothing pushing the CPU market. It doesn't even take a dual core for most 'general' use like web surfing and watching YouTube vids. These days people are upgrading just to upgrade. I'd be willing to bet a large majority of PC users/owners don't need the processing power they currently have. For a while the industry successfully duped everyone into a knee-jerk upgrade cycle, but since people are seeing how capable processors from even 4 years ago still are, I think it's becoming clearer.

Even if they do manage to get more power into CPUs, why? For badly programmed games? It's already been shown that graphically intense games can be coded for, and successfully played on, older hardware. Not that everything PC-related is gaming-centric; gaming just happens to be one of the most widely used things people do with their PC that actually requires all this horsepower.

Just like other tech, it's becoming bloated for the sake of being bloated. A DVD holds 4.7 GB of data; a dual-layer disc holds 8.5 GB. Either I'm blind and can't tell the difference (doubtful) or others see a huge difference between DVD and Blu-ray because they want to. I've watched DVD and BD movies back to back on a full HD TV. Not so different, and not enough to warrant a BD being 25 GB to 50 GB. Bloat for the sake of bloat. Maybe 4K is different, but honestly a screen can only get so sharp, and show so many distinguishable colors, before it surpasses human perception. Then people become the bottleneck and anything past that is pointless. Same with computing speed.
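
For what it's worth, most of that capacity difference goes into video bitrate. A rough back-of-envelope, assuming a hypothetical 2-hour film filling most of the disc:

```python
# Average bitrate implied by disc capacity over a runtime (decimal GB).
def avg_bitrate_mbps(capacity_gb: float, runtime_hours: float = 2.0) -> float:
    return capacity_gb * 8 * 1000 / (runtime_hours * 3600)  # GB -> megabits/s

print(avg_bitrate_mbps(4.7))   # single-layer DVD: ~5.2 Mbps
print(avg_bitrate_mbps(25.0))  # single-layer Blu-ray: ~27.8 Mbps
```

Whether that roughly 5x bitrate jump is actually visible from the couch is the question.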

I have a feeling that's why the focus is on energy consumption: when you run out of possibilities for improvement, you turn elsewhere. It does need to be managed, though; for a while GPUs were going the way of CPUs, just piling on more power. I'd hate to see a GPU with 4 or 5 PCIe power cables attached to it. So energy consumption does need to be kept in check or it would get out of control.

They definitely aren't making breakthroughs in performance like they were with new tech 10 years ago. Every time something gets better, its successor has to be exponentially better to provide a perceivable difference, or else we end up where we are: 10% performance increases.

Things have been riding the coattails of the previous generation of hardware for quite some time now, and I'm sure a breakthrough will happen eventually. Look how far the microchip has come since its beginnings. RAM went through several phases until it moved from EDO to DDR; rather than sticking with DDR 1, 2, 3, 4, something new will need to take its place. Same thing with CPUs and GPUs.