Intel shows no love for anyone! Not even ATI!

romulus47plus1

Distinguished
Feb 25, 2008
http://news.softpedia.com/news/Intel-Multi-Core-CPU-039-s-Will-Get-Your-Graphics-Card-Killed-82565.shtml
"According to Fosner, multi-core processors could handle life-like animations, such as weather or effects better than dedicated GPUs. For instance, multi-core processors can handle the graphics tasks in a better manner than a high-end graphics board could ever do."

Imagine CPU technology running games well. Yes, Intel's going to add something new to the GPU war, thanks to its CPU experience.
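To be fair, the kind of workload Fosner describes really is embarrassingly parallel. Here's a minimal sketch (my own toy example, not anything from the article; the `shade` function and its ripple formula are invented for illustration) of a per-pixel "weather" effect split across however many cores the CPU reports:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

constexpr int kWidth = 1920, kHeight = 1080;

// A cheap "weather" effect: animated ripples, evaluated per pixel.
float shade(int x, int y, float t) {
    float dx = x - kWidth / 2.0f, dy = y - kHeight / 2.0f;
    return 0.5f + 0.5f * std::sin(std::sqrt(dx * dx + dy * dy) * 0.05f - t);
}

int main() {
    std::vector<float> frame(kWidth * kHeight);
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;

    // Each thread shades its own interleaved set of rows; no pixel is
    // shared, so no locking is needed, which is why this scales with cores.
    for (unsigned c = 0; c < cores; ++c) {
        pool.emplace_back([&frame, c, cores] {
            for (int y = static_cast<int>(c); y < kHeight;
                 y += static_cast<int>(cores))
                for (int x = 0; x < kWidth; ++x)
                    frame[y * kWidth + x] = shade(x, y, /*t=*/1.0f);
        });
    }
    for (auto& t : pool) t.join();

    std::printf("shaded %d pixels on %u threads\n", kWidth * kHeight, cores);
}
```

Whether that ever beats dedicated hardware is another question, but the scaling argument itself is sound.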

http://news.softpedia.com/news/Intel-039-s-Larrabee-Gets-Detailed-at-the-Pre-IDF-Briefing-81081.shtml

16 Graphics cores!
nVidia and ATI are sweating buckets!
 

smalltime0

Distinguished
Apr 29, 2008
lol, isn't this the route AMD is going with Fusion?
CPUs have done graphics in the past.

Now I wonder how much one of these would cost?
 

DarthPiggie

Distinguished
Apr 13, 2008
I might like it if CPUs and GPUs fused. It would make things easier and cheaper for us consumers. Idk, though; water cooling would have to become standard. I'm looking forward to all this, let's just hope they speed things up a notch. Ray tracing vs. rasterization.
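On that last point, the reason ray tracing keeps getting paired with many-core chips like Larrabee is that every pixel's ray is independent work. A throwaway sketch (entirely my own, just to illustrate the idea) that prints a sphere silhouette by tracing one ray per character cell:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// True if a ray from `o` along `d` hits the unit sphere at the origin.
// Solves |o + t*d|^2 = 1 for t and checks only the discriminant.
bool hitSphere(Vec3 o, Vec3 d) {
    float a = d.x * d.x + d.y * d.y + d.z * d.z;
    float b = 2.0f * (o.x * d.x + o.y * d.y + o.z * d.z);
    float c = o.x * o.x + o.y * o.y + o.z * o.z - 1.0f;
    return b * b - 4.0f * a * c >= 0.0f;
}

int main() {
    const int w = 64, h = 32;
    // Every pixel is independent: a thread pool (or OpenMP pragma) could
    // split these loops across CPU cores with no locking whatsoever.
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            Vec3 origin{0.0f, 0.0f, -3.0f};
            Vec3 dir{(x - w / 2) / float(h), (y - h / 2) / float(h), 1.0f};
            std::putchar(hitSphere(origin, dir) ? '#' : '.');
        }
        std::putchar('\n');
    }
}
```

Rasterization, by contrast, walks triangles, which is exactly what GPUs have fixed-function hardware for; that's the whole fight in a nutshell.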
 

Zenthar

Distinguished
I think integrating components with two different purposes isn't the best of ideas. What I would have liked is a multi-socket motherboard that allowed heterogeneous kinds of processors. For example, on a 4-socket board you could try different setups like:
■1 CPU + 1 GPU + 2 free (entry level)
■3 CPU + 1 GPU (parallel processing)
■1 CPU + 3 GPU (graphics workstation)
■2 CPU + 2 GPU (balanced gaming rig)

Given that each of these processors could be multi-core, I think it would make systems much more customizable.
 

dagger

Splendid
Mar 23, 2008

Lol, then you can get the best bang for your buck and not have any power wasted? Fat chance. They're not gonna let that happen. :na:
 

DarthPiggie

Distinguished
Apr 13, 2008
What a crappy blog. Couldn't you just link us to the full article directly, rather than make us suffer through sampled Daft Punk?

Honestly, people need to take that song-on-front-page-plays-automatically BS back to myspace where it belongs--in equally bad company.
Let the kid do what he wants, sheesh; no one forced you to read this post.
 
All of this "Intel killing the graphics card" talk is getting a little out of hand. What you have to realize is that (1) writing a new engine is hard and (2) games follow the money. (1) Look how difficult the switch to DX10 (or 10.1) has been, and that is only a minor change. Intel's crazy technology would be a huge change. Would new games only play on one or the other? (2) Look at the ATI/NVidia battle. NVidia has more money and thus can "encourage" programmers to optimize for its hardware. Intel has money, but NVidia is far from dead. I'd fear that if Intel went forward with this, it could split the industry. I imagine the only way for this to work, FAR in the future, is for Intel to buy NVidia to kill them, and then go ahead with their plan.

I think Intel is just bluffing. They are awesome at CPUs, but they know nothing about graphics (as we see from their IGPs). It would be hard even for Intel to get the entire industry to make a change that drastic.
 
Not only that, but this is an expensive venture in a tight market with low margins. Intel needs not only to deliver a GPU that's competitive, but also to write drivers on a consistent basis. We're talking overhead here. What I believe is going to happen, if anything actually does at all, is that both ATI and Intel will pretty much put an end to IGPs, with both selling their ideas of a "fusion"-type system for lower power and decent graphics. The more Intel stirs this up (with nVidia unfortunately falling for it), the better off they are. Marketing hype, done the Intel way.
 

romulus47plus1

Distinguished
Feb 25, 2008
Hey, I was interested enough in what he had to say to follow that link. What greeted me? Kanye West BLARING his "music" over whatever I had playing on xmms at the time... Pavarotti or something. It was pretty irritating.

1997 has long since passed us by; people should know NOT to put noisy, distracting rubbish on their main pages by now.

How old are you? Do you have a job?
Childish. Seriously, get a life...
 

spoonboy

Distinguished
Oct 31, 2007


You're damned on the money, I reckon. They (Intel) have, depending on how you look at it, either no experience in (real) PC graphics and are looking to go from zero to hero and magically muscle into the mid-range graphics segment by the end of 2009 (a hugely lofty goal; Larrabee won't be a wonder machine in performance terms when it arrives, and only the slow of brain would believe that, so they'll be lucky to have anything as powerful, let alone as backwards compatible, as today's top cards), OR you say they've been producing graphics chips for something like 15 years and each one has been a stinker (when asked to do more than run Windows and play DVDs, and even then some are out of their depth). Not to mention the fact that they've only just released their FIRST DX10 driver. Drivers are arguably as difficult to get right as the hardware is.

Either way, it's a lot of talk from Intel, and people - all starry-eyed from what the Core architecture did to K8 - are taking it seriously. If Intel really takes making a decent desktop GPU seriously, and Larrabee lasts beyond the current Intel CEO (I say that not because I think he's doing anything badly, but he won't be in the job forever, and the next guy (or gal) in the big chair might decide the company should stick to its core business and let the GPU-on-a-CPU side of things drop), then it won't likely be a compelling product until its second or third generation. End of rant.