People "probably Won't" Need Discrete Graphics Cards Anymore - Intel
Shanghai (China) - The days of discrete graphics cards are coming to an end, according to an Intel representative we talked to at the Shanghai Intel Developer Forum. Ron Fosner, an Intel Graphics and Gaming Technologist and former video game programmer, told TG Daily that multi-core CPUs will put an end to multi-GPU madness and that people "probably won’t need" discrete cards in the future.
Video interview with Intel’s Ron Fosner
Fosner made these comments while demonstrating Intel's 'Smoke' demo at the Technology Showcase at IDF. We posted a short version of the demo a few days ago. Today we went back to get some more detailed answers.
The demo is actually incomplete; it was supposed to simulate firefighters putting out a raging inferno. "We didn't put in the fire fighters yet," said Fosner. So instead of triumphant firefighters, Intel chose to rain fiery destruction onto a house composed of thousands of individual pieces, all simulated on a quad-core Nehalem processor running two threads per core.
Nothing is pre-baked in this demo, and the meteors don't land in the same spot on each run. Luckily for us, Fosner managed to smash a meteor through the front porch of the shack. He explained that the demo then procedurally simulates fire as a particle emitter system with bounding boxes: when particles hit a tree branch's bounding box, the branch itself becomes an emitter.
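To make that emitter-and-bounding-box idea concrete, here is a minimal sketch of how such fire spread could work, assuming simple axis-aligned bounding boxes and placeholder types (Vec3, Particle, Branch are illustrative names, not Intel's actual Smoke code):

```cpp
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct AABB {                       // axis-aligned bounding box around a scene object
    Vec3 min, max;
    bool contains(const Vec3& p) const {
        return p.x >= min.x && p.x <= max.x &&
               p.y >= min.y && p.y <= max.y &&
               p.z >= min.z && p.z <= max.z;
    }
};

struct Particle { Vec3 pos, vel; }; // one fire particle

struct Branch {
    AABB box;
    bool burning = false;           // once hit, the branch becomes an emitter itself
};

// Advance particles and ignite any branch whose bounding box a particle enters.
void step(std::vector<Particle>& particles, std::vector<Branch>& branches, float dt) {
    for (auto& p : particles) {
        p.pos.x += p.vel.x * dt;
        p.pos.y += p.vel.y * dt;
        p.pos.z += p.vel.z * dt;
        for (auto& b : branches) {
            if (!b.burning && b.box.contains(p.pos))
                b.burning = true;   // this branch will spawn its own particles
        }
    }
    // Burning branches emit new particles, so the fire spreads procedurally.
    for (const auto& b : branches) {
        if (b.burning)
            particles.push_back({ { (b.box.min.x + b.box.max.x) * 0.5f,
                                    b.box.max.y,
                                    (b.box.min.z + b.box.max.z) * 0.5f },
                                  { 0.0f, 1.0f, 0.0f } });
    }
}

int main() {
    std::vector<Particle> fire = { { {0, 0, 0}, {0, 2, 0} } };
    std::vector<Branch> tree = { { { {-1, 1, -1}, {1, 3, 1} }, false } };
    for (int frame = 0; frame < 5; ++frame) step(fire, tree, 0.5f);
    std::printf("branch burning: %s, particles: %zu\n",
                tree[0].burning ? "yes" : "no", fire.size());
}
```

Because every ignited object becomes a new emitter, the spread pattern is different on every run, which is why the demo never plays out the same way twice.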
The demo also simulates animals such as deer and rabbits running around on the ground, plus dozens of birds flying overhead. Threads are devoted to creature AI, and the animals try to run away from the flames, sometimes unsuccessfully. Multi-core systems could lead to more realistic crowd animations and better weather and environment effects, according to Fosner.
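For readers wondering what "threads devoted to creature AI" looks like in practice, here is a minimal sketch of splitting one frame's subsystems across threads, assuming a simple one-thread-per-subsystem layout (the function names are placeholders, not the demo's real code):

```cpp
#include <cstdio>
#include <thread>
#include <vector>

// Placeholder workloads; the real demo's subsystems are far more involved.
void updateCreatureAI() { /* pick flee paths away from the flames */ }
void updateFire()       { /* advance the particle emitters */ }
void updatePhysics()    { /* collide debris from the shattered porch */ }
void updateBirds()      { /* flocking for the dozens of birds */ }

int main() {
    // One frame of simulation: each subsystem runs on its own thread,
    // so a quad-core CPU with two threads per core can work on them concurrently.
    std::vector<std::thread> workers;
    workers.emplace_back(updateCreatureAI);
    workers.emplace_back(updateFire);
    workers.emplace_back(updatePhysics);
    workers.emplace_back(updateBirds);
    for (auto& t : workers) t.join();   // wait for the frame's work to finish
    std::printf("frame complete\n");
}
```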
Fosner told us that multi-core CPUs are more than capable of rendering complex scenes that used to be reserved for top-end graphics cards. He argued that Intel processors offered "more bang for the buck" and that it was more economical to move from a single-core to a multi-core processor than to pop multiple graphics cards into a machine. "The fact of the matter is that you're going to have one graphics card, you may have a dual graphics card, but you're not going to have a four graphics card or eight graphics card system," said Fosner.
Another advantage of CPU graphics and physics programming is that people won't need to continually keep up with the latest programming techniques for the newest cards, meaning that futzing around with shader models and DirectX programming will be a thing of the past. Fosner said that "everybody" knows how to program for a CPU and that this new way of programming will "get rid of" a path of graphics obsolescence.
When asked if discrete graphics cards will be needed in the future, Fosner answered, "Probably not". He explained that computers didn't have discrete graphics in the '80s and that CPUs are becoming powerful enough to take over that role.