Intel says: I am King

April 3, 2008 5:27:19 PM

According to Intel, we won't be needing GPUs for the most part in the future. Maybe not for Windows, but for gaming they are dead wrong. Their point of view seems to be two-fold: SLI/CrossFire is not as efficient as extra CPU cores, and CPUs will/can do everything GPUs can. One super CPU will never be as good as one super GPU. End of. </rant>

http://www.tgdaily.com/content/view/36758/135/

Short extract:

"Fosner told us that multi-core CPUs are more than capable of rendering complex scenes that used to be reserved for top-end graphics cards. He argued that Intel processors offered “more bang for the buck” and that it was more economical to go from single to multiple core processors versus popping multiple graphics cards into a machine. “The fact of the matter is that you’re going to have one graphics card, you may have a dual graphics card, but you’re not going to have a four graphics card or eight graphics card system,” said Fosner.

Another advantage to CPU graphics and physics programming is that people won’t need to continually keep up with the latest programming techniques of all the newest cards – this means futzing around with shader models and DirectX programming will be a thing of the past. Fosner said that “everybody” knows how to program for a CPU and that this new way of programming will “get rid of” a path of graphics obsolescence.

When asked if discrete graphics cards will be needed in the future, Fosner answered, “Probably not”. He explained that computers didn’t have discrete graphics in the 80s and that CPUs are becoming powerful enough to take over that role."


April 3, 2008 5:56:12 PM

HAIL to the KING, AMD/ATI...

:) ... JK..
April 3, 2008 6:01:41 PM

This, from the same company that brought us the Graphics Media Decelerator?



There's ALWAYS a drone...
April 3, 2008 6:16:14 PM

Might be true; I don't know how long it will take, though. I'm interested to see future games using ray tracing. Although, I'm sure Nvidia will try to make a discrete card that is capable of doing ray-trace calculations if need be.
April 3, 2008 6:16:26 PM

Onus said:
This, from the same company that brought us the Graphics Media Decelerator?



There's ALWAYS a drone...


lol, you said decelerator :bounce: 
April 3, 2008 6:20:24 PM

spoonboy said:
According to Intel, we won't be needing GPUs for the most part in the future. Maybe not for Windows, but for gaming they are dead wrong. Their point of view seems to be two-fold: SLI/CrossFire is not as efficient as extra CPU cores, and CPUs will/can do everything GPUs can. One super CPU will never be as good as one super GPU. End of. </rant>


Intel is so full of themselves. If they believe this ****, I want to see some in-game benchies, and I want to see them now. :pt1cable: 

Diddy mow! :lol:  :lol: 
April 3, 2008 6:23:19 PM

Dude, those graphics suck balls. What is that...Oregon Trail?
April 3, 2008 6:31:52 PM

If 4 cores with 2 threads each only get a max of 18 fps with some fire, yeah... I think video cards will be around for a while.

I can see running all the physics off the CPUs, but not both physics and graphics...

But the ray-trace developers are hoping CPUs will replace GPUs in the future as well.
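
To picture what "running all the physics off the CPUs" would actually look like, here's a minimal sketch of the idea in C++, not anything from Intel's demo: one integration pass over a pile of particles, split evenly across however many hardware threads the CPU reports. The particle count, gravity value, and timestep are made up for illustration.

#include <functional>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Each worker thread owns a disjoint slice of the particle array, so there is
// no locking; it just applies gravity and advances positions for one frame.
void integrate(std::vector<Particle>& p, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;        // gravity
        p[i].x  += p[i].vx * dt;
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}

int main() {
    std::vector<Particle> particles(100000);   // say, 100k debris pieces
    const float dt = 1.0f / 60.0f;             // one 60 Hz frame
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;

    std::vector<std::thread> workers;
    size_t chunk = particles.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        size_t begin = t * chunk;
        size_t end   = (t == n - 1) ? particles.size() : begin + chunk;
        workers.emplace_back(integrate, std::ref(particles), begin, end, dt);
    }
    for (auto& w : workers) w.join();          // physics for this frame is done
}

Point being: that part scales nicely with cores. Shading every pixel on top of it is the part that doesn't come for free.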
April 3, 2008 6:38:16 PM

nukemaster said:
If 4 cores with 2 threads each only get a max of 18 fps with some fire, yeah... I think video cards will be around for a while.

I can see running all the physics off the CPUs, but not both physics and graphics...

But the ray-trace developers are hoping CPUs will replace GPUs in the future as well.



All I see is a physics demonstration. Oh yeah, and the thousands of pieces that flew off the house? I saw about 20; where did the other 980 pieces go? Oooo, look at the fluffy bunnies... :ouch: 
April 3, 2008 6:42:44 PM

They are talking about the CPU doing EVERYthing in the future... it's still far off.
April 3, 2008 6:54:04 PM

San Pedro said:
Might be true; I don't know how long it will take, though. I'm interested to see future games using ray tracing. Although, I'm sure Nvidia will try to make a discrete card that is capable of doing ray-trace calculations if need be.

This sounds kick-ass and like a very real possibility.
However, I see this happening sometime after DX11 at the earliest (since DX11 is mostly finalized now).
April 3, 2008 6:59:44 PM

When Intel shows me a benchmark beating the high-end ATI and Nvidia cards, I will believe them. Until then they can keep barfing all over the news.
April 3, 2008 7:01:40 PM

nukemaster said:
They are talking about the CPU doing EVERYthing in the future... it's still far off.

The technology to have a CPU morph into a GPU and back again has been around for 20 years already. It's called a gate array.
But for ray tracing, the power needed is still far off. However, I expect some specialized ray-tracing core at least 3 years away.

Also, having the CPU do everything, like what I think you're saying, comes from Intel's planned graphics part using the same x86 instructions as the CPU. So you can run the same code on a CPU or GPU, even though the two are very different things.
So future GPUs will look more like CPUs, and CPUs will perform like GPUs.
Anyway, something like that.
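
A rough illustration of that "same code on a CPU or GPU" point, assuming an x86-based graphics part like Larrabee. The kernel() function is a hypothetical stand-in and this is only a sketch of the concept, not how any real driver would dispatch work:

#include <thread>
#include <vector>

// Stand-in for whatever per-item work you'd do: shading a pixel, stepping a
// particle, etc. The point is that it's ordinary compiled x86 code.
void kernel(int item) { (void)item; }

int main() {
    const int items = 1024;

    // "CPU mode": one core walks the work list serially.
    for (int i = 0; i < items; ++i) kernel(i);

    // "Larrabee mode" (conceptually): the identical compiled function, fanned
    // out across many x86 threads instead of rewritten in a shader language.
    std::vector<std::thread> pool;
    for (int i = 0; i < items; i += 128)
        pool.emplace_back([i] { for (int j = i; j < i + 128; ++j) kernel(j); });
    for (auto& t : pool) t.join();
}

No shader models, no DirectX path; just more or fewer threads running the same binary. Whether that's actually fast enough is the open question.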
April 3, 2008 7:03:19 PM

jay2tall said:
When Intel shows me a benchmark beating the high-end ATI and Nvidia cards, I will believe them. Until then they can keep barfing all over the news.

I agree. However, I also think Intel could knock ATI and Nvidia silly if they wanted to.
April 3, 2008 7:14:13 PM

They do want to. I don't think they will do it with a physics demonstration and some fluffy bunnies running around some flaming trees at 18 FPS, though.
April 3, 2008 7:34:18 PM

I agree he's most probably wrong... but I know why he's saying this.

For Intel, the future gaming platform will be ray-traced. At that, a fast multi-core CPU will always outperform a fast VPU. I can't explain why, but I know that's the way it is. The upcoming Larrabee will be sold on that premise.

Where it doesn't work is that the whole gaming industry would need to completely redesign its game engines and even its way of working... just like when Intel released the Itanium's new 64-bit instruction set. The idea looked great, but it was doomed from the beginning.

I'm not sure it's the same today, but to work properly the first generation will need to run present-generation game engines very well, on top of being very popular, so game developers will be willing to work on it. Is this possible???? That's my question!

IF, and only IF, they do manage to build a great VPU for a reasonable price, they will have the necessary leverage to get game designers to build new engines for it. The advantages of ray tracing are great, the first one being the possible incorporation of the physics engine right into the visual one, with no need to have different engines "talking" to each other. But will game developers be willing to start anew??????
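
For what it's worth, here is a bare-bones sketch of why ray tracing is the workload Intel keeps pointing at: every pixel's primary ray is independent, so a frame splits into per-thread bands with essentially no shared state, and performance scales close to linearly with cores. The trace() function below is a made-up placeholder, not any real engine's code.

#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

// Placeholder "tracer": a real one would intersect the ray with the scene and
// shade the hit point; here it just returns a gradient so the sketch compiles.
static uint32_t trace(int x, int y) {
    return (uint32_t(x & 255) << 16) | (uint32_t(y & 255) << 8) | 0xFFu;
}

int main() {
    const int W = 1280, H = 1024;
    std::vector<uint32_t> framebuffer(size_t(W) * H);

    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 8;                     // e.g. 4 cores x 2 threads

    std::vector<std::thread> workers;
    const int rows = (H + int(cores) - 1) / int(cores);
    for (unsigned t = 0; t < cores; ++t) {
        const int y0 = int(t) * rows;
        const int y1 = std::min(H, y0 + rows);
        workers.emplace_back([&, y0, y1] {
            for (int y = y0; y < y1; ++y)          // this thread's band of scanlines
                for (int x = 0; x < W; ++x)
                    framebuffer[size_t(y) * W + x] = trace(x, y);
        });
    }
    for (auto& w : workers) w.join();              // one traced frame
}

That's the easy part, though; the hard part is making trace() fast enough on a real scene, which is exactly where the 18 fps demo falls down today.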
April 3, 2008 7:38:18 PM

SpinachEater said:
Dude, those graphics suck balls. What is that...Oregon Trail?


Brilliant game.
April 3, 2008 7:44:54 PM

Nightlysputnik hit the nail on the head. Even if ray tracing becomes possible, support for old rasterized games must be addressed or no one will switch platforms.
April 3, 2008 7:50:42 PM

It's funny how you dis the graphics power of a CPU...

A CPU with 16 cores...
and 8 of them could be graphics cores...

AMD, if they wanted to, could build a multi-core CPU with a GPU on board...
again, you would not need a video card...

Intel isn't that far off its rocker...
the only thing Intel has to do is build a good GPU and pack it into its CPU cores.

April 3, 2008 8:00:38 PM

Intel is kind of on the right track; after all, current GPUs are basically many processors working together.

I doubt that CPUs will be able to replace dedicated GPUs anytime soon though, if ever.
April 3, 2008 8:14:03 PM

enewmen said:
I agree. However, I also think Intel could knock ATI and Nvidia silly if they wanted to.

Agreed.
April 3, 2008 8:26:16 PM

^ Agreed. Intel has sheer amounts of money that they can spend on R&D.
April 3, 2008 8:51:53 PM

All they are saying is that they will be stealing from AMD once again. :kaola: 
April 3, 2008 10:24:44 PM

In fact, Intel was already around when the idea of a discrete GPU first popped up. And it's no surprise that the CPU will gradually take over the job from all the accessories. This division of work between components was done to simplify things for the CPU, and now that the CPU is smart enough to handle it all compactly, why not??
April 3, 2008 11:43:48 PM

In the 1980s I had an ATI EGA Wonder in my 8 MHz Turbo XT. Can that be considered discrete graphics?

Anyway, I like the idea of not needing shader models and DirectX for gaming. Any game that can use pure CPU power to ray trace or render graphics that match or beat DX10 quality and run at 60+ FPS is good for this one reason: LINUX GAMING!

If games no longer need DX9/10 to run, then there's not much reason to have Windows anymore. I wouldn't miss MS Office all that much, really. I could just load up Ubuntu x64 Gamer Edition and be on my way. I think this is one of two big factors holding Linux back, the other being distros that don't allow cross-distro software compatibility.

GPUs offer features that are not found on CPUs though. That includes dedicated UVD hardware and other video acceleration capabilities. I suppose if a CPU has enough cores this won't be a problem. From a software standpoint though, working with discrete graphics is a lot easier. Until programming techniques are highly optimized for multi-core/multi-threaded hardware, CPU-only graphics and gaming probably won't take off in the near future.
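
On that "optimized for multi-core/multi-threaded hardware" point, this is the kind of structure people usually mean; just a sketch with made-up sizes and a hypothetical shade_tile(), not a real renderer. The frame gets cut into tiles and worker threads pull tiles off a shared counter, which is basically how a software renderer keeps every core busy.

#include <algorithm>
#include <atomic>
#include <cstdint>
#include <thread>
#include <vector>

const int W = 1920, H = 1080, TILE = 64;
const int TX = (W + TILE - 1) / TILE, TY = (H + TILE - 1) / TILE;

std::vector<uint32_t> framebuffer(size_t(W) * H);
std::atomic<int> next_tile{0};

// Pretend shading: fills one tile with a flat colour. A real software renderer
// would rasterize and shade the triangles that overlap this tile.
void shade_tile(int tx, int ty) {
    for (int y = ty * TILE; y < std::min(H, (ty + 1) * TILE); ++y)
        for (int x = tx * TILE; x < std::min(W, (tx + 1) * TILE); ++x)
            framebuffer[size_t(y) * W + x] = 0xFF336699u;
}

// Each worker keeps grabbing the next unclaimed tile, so no core sits idle
// while another finishes its share of the frame.
void worker() {
    for (;;) {
        int t = next_tile.fetch_add(1);
        if (t >= TX * TY) return;
        shade_tile(t % TX, t / TX);
    }
}

int main() {
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < n; ++i) pool.emplace_back(worker);
    for (auto& t : pool) t.join();
}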

Intel is on to something with this though. I hope this allows my dream of pure Linux gaming to come true.
April 4, 2008 12:02:32 AM

The only thing I'm looking forward to now is AMD/ATI Fusion!
April 4, 2008 12:27:35 AM

I place my bets on the CPU absorbing the GPU. What we have right now is x amount of pixels, and they have NOT been increasing at a very fast pace. In fact, I haven't changed resolutions on my computer for almost 10 years; I've been "stuck" at 1280x1024 for a long while. With 1080p coming out, that isn't a LOT more (1280x1024 is about 1.3 million pixels, 1920x1080 is about 2.1 million, roughly 60% more). It's a good chunk, but truth be told it's pretty pathetic compared to those 30" LCDs that have been in studios for a while. With all that being said, CPU technology has bounded ahead considerably, making speed improvements almost exponentially, and then add the fact that they are multiplying the NUMBER of these CPU cores. What you have is a product that, if programmed correctly, could update a 60 Hz display with great graphics!
The underlying programming is all that's different. When you look at the CPU vs. the GPU, they are slowly turning into each other, and what do you all think will happen... did ATI buy AMD? No. So will Nvidia buy Intel... think about it. The technologies will merge, as they are doing, and you will be able to do everything with a "CPU", but it won't resemble a classic CPU. They will be hybrid designs, like they are doing now, and the best part about this is the internal bus speed advantage. Right now there is a lot of data that needs to be passed between 3 major chips on an Intel board: the CPU, GPU, and northbridge are trying to manage 2 processing units that each have their own memory. That's like trying to get the rest of the civilized world to help with Iraq. It might work, but it would already be done if America WAS the whole world. One processing unit with one set of memory to work with could be cheaper, faster, and better on power consumption.
April 4, 2008 8:55:49 AM

MadHacker said:
It's funny how you dis the graphics power of a CPU...

A CPU with 16 cores...
and 8 of them could be graphics cores...

AMD, if they wanted to, could build a multi-core CPU with a GPU on board...
again, you would not need a video card...

Intel isn't that far off its rocker...
the only thing Intel has to do is build a good GPU and pack it into its CPU cores.


But the problem with that is you would end up buying a high-end CPU to play games, while for office work and general tasks a much lower-ability CPU would suffice, so you might as well just buy an add-in video card. Not to mention, by the time Intel makes this über CPU that can 'run' today's games, ATI & Nvidia will be on the refresh of the (upcoming) next-generation cards, or will even have released the generation after that, running tomorrow's games. All this 'the CPU is the only way to go' stuff is total hokum and doesn't stand up to any scrutiny; ray tracing squeezing rasterization out of the marketplace is at least 5 years away, and a 45nm 8-core, 16-thread Intel chip, even with the right instruction set, will lose HEAVILY to a 65nm G92, let alone whatever GPUs will be on 45nm by the end of the year.