NVIDIA and Intel War To Start In 2010
The Intel-NVIDIA war has drawn a warning from CRT Capital Group, which says NVIDIA may be providing motivation to the chip giant. CRT Capital Group analyst Ashok Kumar warned that the recent tirade by NVIDIA CEO Jen-Hsun Huang may have adverse effects, angering a “huge, rich, motivated design powerhouse.”
Kumar believes NVIDIA’s current products are better than Intel’s offerings, but notes that Huang overlooks the fact that Intel’s lack of interest in the high-end 3D-gaming market is the very reason NVIDIA has been so successful.
Many saw Huang’s outburst as NVIDIA’s attempt to gain the high ground before an upcoming epic battle between NVIDIA and Intel. That battle centers on Larrabee, a multicore x86 chip that Intel plans to ship in 2010 as a discrete graphics card. Along with support for OpenGL and DirectX, Intel has announced that the chip design will include SSE-like extensions known as Advanced Vector Extensions.
Early reports on Larrabee showed chip designs incorporating 16 cores, with each core capable of operating at over 2 GHz. Intel claims Larrabee is capable of scaling to several thousand cores.
Larrabee’s main attraction so far has been its potential as a ray-tracing chip. However, according to a recent blog post by Tom Forsyth, a developer on the Larrabee project, Intel’s primary design focus has been rasterization, since that is the only way to render the large existing library of DirectX and OpenGL games on the market.

NVIDIA’s ongoing “war” with Intel started with Huang ad-libbing during a financial analyst meeting, stating NVIDIA was about to “open a can of whoop ass” on Intel.
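As a rough illustration of the rasterization-versus-ray-tracing distinction above: rasterization projects triangles onto the screen, while a ray tracer fires a ray through each pixel and tests it against scene geometry. A minimal Python toy of the per-pixel ray-casting idea (nothing Larrabee-specific; the sphere scene, camera setup, and tiny 4x4 "resolution" are invented for the demo):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for the smallest t >= 0.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None

# One ray per pixel: render a 4x4 ASCII "image" of a sphere ahead of the camera.
WIDTH = HEIGHT = 4
image = []
for y in range(HEIGHT):
    row = []
    for x in range(WIDTH):
        # Map the pixel to a point on a view plane at z = -1.
        u = (x + 0.5) / WIDTH * 2 - 1
        v = (y + 0.5) / HEIGHT * 2 - 1
        hit = ray_sphere_hit((0, 0, 0), (u, v, -1), (0, 0, -3), 1.0)
        row.append('#' if hit is not None else '.')
    image.append(''.join(row))

print('\n'.join(image))
```

The cost grows with rays times geometry tests, which is why ray tracing is often pitched as a fit for a many-core chip: every pixel's ray is independent and can run in parallel.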
Huang further belittled Intel’s graphics solutions as a “joke” that is abysmal in the visual computing world. NVIDIA’s VP of content relations added fuel to the fire by declaring that the CPU is dead.
ROFFAL.
Stats: nVidia: 4,985 employees, 2007 sales: $4,097,900,000
Intel: 86,300 employees, 2007 sales: $38,334,000,000
... even these stats notwithstanding, Intel has the advantage because... bigger company, talented crew, more money for R&D, more production capacity. And if you ask the average joe on the street, he will have no clue who nVidia is, but Intel... that stupid song will play in the guy's head... you know, from the Pentium commercials... DIN DONG DING DONG
I like nVidia, but "open a can of whoop ass"? Dude..... what if you don't win?
Truthfully, after reading an article in CPU magazine, I can see AMD/ATI/NVidia ganging up to build a gaming OS while Intel sticks with Microsoft. The resulting gaming OS could potentially win back a lot of ground for the GPU, depending on the price of said OS. Basically, they would just need to knock out Intel's CPU advantage and put most everything onto the GPU, like the PS3 has done with the Cell processor. Imagination running wild: Sony helps NVidia/AMD/ATI take on Intel, haha. That would be awesome for PC gaming.
IMO, this was headed towards a showdown long before the mega-egos in both camps started flappin' their gums in a public arena. 2010 is going to be a very interesting year. I hope the outcome is beneficial rather than disastrous for PC gaming.
Can Nvidia keep up, though? With 80 cores at 2 GHz?
Like lopopo said, "DIN DONG DING DONG" for the average joes. Imagine Intel pushing hard on advertising and spoiling the game. AMD is nearly out, and guess what, Nvidia is about to take AMD's place, rofl.
Nvidia looks like a big shot lately, but they have no competition. Now some competition is coming up, and they can't even be respectful to the competitor? Sounds to me like they have an awfully good thing going and are angry to see an upcoming challenge.
Also looks to me like they started it, by moving into the chipset market, and then starting the hype about replacing CPUs with GPUs.
But... Nvidia is small beans compared to Intel. Look at it this way: what does Nvidia have to do to get Intel out of the game? It's surreal. Intel is HUGE and has their hands in a million different pots, and we aren't even talking about political clout.
What does Intel have to do to knock Nvidia out of the game? Just one thing: make a CPU that can handle graphics.
Personally, I think that in a few years we'll see why Nvidia lost face and blew their lid, overreacting like they did. The surest sign is the fear on their faces.
Ultimately it comes down to developer support. Intel is betting that ray tracing is going to be the next big way to render 3D graphics, but everything I've seen from folks like Carmack says that ray tracing is still a long way off.