
Plausible: Nvidia Working on x86 CPU

February 9, 2009 5:44:22 PM

I remember reading this on the Inquirer a few days ago. It may be interesting to see Intel come down on them.
Score
1
February 9, 2009 5:46:20 PM

Competition is a great thing for the market. I hope it's true.
Score
5
February 9, 2009 5:46:44 PM

I could be wrong, but isn't x86 32-bit? I would think that if you're going to invest a crapload of money into developing a new chip, you should probably do a 64-bit one. Unless you think 32-bit is going to be around forever.
Score
-8
February 9, 2009 5:53:14 PM

My two cents (I'm no expert and I don't work in IT) is that Nvidia is working on a hardware decoder to turn x86 commands into something its graphics cards can handle. That way they could throw 2, 3, or 4 cards in a computer and not need a processor. It also means they wouldn't have to ask the community to recompile their programs for a new architecture.
Score
7
February 9, 2009 6:06:25 PM

saljr: "What about merging with AMD/ATI?"



Do you want a monopoly in the graphics card sector? Because if they merge, that's what you'll have. Remember 2006, when AMD/ATI didn't have a GPU that could compete with the nVidia 8 series? The result was $400 8800 GTS 640MB graphics cards for almost a year. nVidia didn't release a single GPU noticeably better than the 8800 GTX for two years, because they didn't have to.

nVidia merging with AMD/ATI would be just as bad as AMD going under and Intel being the only CPU maker.
Score
7
Anonymous
February 9, 2009 6:06:42 PM

Well, x86 now generally refers to the instruction set supported by CPUs used in PCs (and now Macs too). On the other hand, if NVIDIA is building an x86 CPU, that would be top secret, and you have to wonder how likely it is that an NVIDIA hater like Charlie at the Inquirer would have access to it. NVIDIA may be hiring engineers with CPU experience, but given AMD's layoffs and mandatory pay cut for everyone, isn't that normal for any company looking to hire at this time? If you're looking for a hardware engineer with good experience, you're going to have many candidates who are current or former AMD engineers.
Score
1
February 9, 2009 6:07:12 PM

dragabain: "My two cents (I'm no expert and I don't work in IT) is that Nvidia is working on a hardware decoder to turn x86 commands into something its graphics cards can handle. That way they can throw 2, 3 or 4 cards in a computer and not need a processor. Also this makes it so they don't have to ask the community to recompile their programs for their arch."

I'd guess something similar. If they can use the x86 CPU as a co-processor of some sort, then translate its work into GPGPU commands, the GPU could run any program available. Who knows...
Score
0
February 9, 2009 6:08:07 PM

dragabain: "My two cents (I'm no expert and I don't work in IT) is that Nvidia is working on a hardware decoder to turn x86 commands into something its graphics cards can handle. That way they can throw 2, 3 or 4 cards in a computer and not need a processor. Also this makes it so they don't have to ask the community to recompile their programs for their arch."


Now THAT would be interesting to see, even if it would require a new motherboard design.
Score
-1
February 9, 2009 6:36:28 PM

I smell an Apple-motivated move for NVIDIA, seeing that Apple uses NVIDIA chipsets exclusively for everything they do.

deltatux
Score
1
February 9, 2009 6:42:28 PM

Isn't this what Transmeta tried to do?
Score
0
February 9, 2009 6:46:02 PM

What legal means does Intel use to restrict x86 licenses? Any patent would have expired in the '90s. It's certainly not a trade secret; it has to be the most documented architecture out there. I doubt copyright applies to a computer architecture. Therefore, couldn't NVIDIA create an x86-compatible chip, simply avoid any trademarked names, and be fine, if they are indeed creating an x86-compatible chip to begin with?
Score
0
February 9, 2009 7:10:18 PM

I would imagine that Nvidia would do something with Via, since Nvidia doesn't have an x86 license. The recently announced ION platform only requires a slow Atom processor; they could easily substitute a Via C7 processor instead. Another thing to note is that the patents on the 486 processor (remember those?) are set to expire this year. Nvidia could probably make a 486-like processor clocked at 1.6 GHz easily.
Score
2
February 9, 2009 7:36:22 PM

hellwig: "What legal means does Intel use to restrict x86 licenses? Any patent would have expired in the 90's. It's certainly not a trade secret, it has to be the most documented architecture out there. I doubt copyright applies to a computer architecture. Therefore, couldn't NVIDIA create an x86-compatible chip, and simply avoid any trademarked names and be fine, if they are indeed creating a x86 compatible chip to begin with."


Not entirely true. You can't just take the x86 architecture and market it in your own products, which is why AMD has a licensing agreement with Intel for x86. Nvidia is not about to resort to piracy. However, acquiring an x86 license must be an issue of importance to Nvidia. Without it, they may just be at a dead end, or, as mentioned in the article, have to deal with loopholes and legal settlements.
Score
1
Anonymous
February 9, 2009 7:58:41 PM

I suspect it's no more than a negotiating move, to get Intel to renew their bus license agreement which, IIRC, expired recently. That said, Nvidia is getting squeezed between ATI/AMD (which ended Nvidia's plump margins with the 48xx introductions, with more to come) and Intel with the soon-to-arrive Larrabee (graphics on a processor chip). Tough. Couldn't happen to a better bunch of people. First they prematurely shortened the life of 8xxx-series chips with the "bump" circuit flaws (I'm two 8600 GTs down on that count alone, days after warranty expired), and then they had the nerve to rename those lemons twice (to 9xxx and now GTX 2xx). I hope they (corporately) disappear forever.
Score
-2
Anonymous
February 9, 2009 7:58:43 PM

nVidia will be VIA's high-end CPU collaboration.

The 64-bit extension to x86, sometimes called IA-64 or AMD64 or EM64T or "x86-64", is still part of x86. I'd assume that if nVidia is working on an x86 chip, it will have the 64-bit extensions and SSE/2/3, and possibly 4, compatibility. Or maybe they will offload the SSE (SIMD) functions to their GPUs.
Score
0
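To make concrete what "offloading the SIMD functions" would involve, here is a minimal sketch, in plain Python rather than hardware, of the packed arithmetic SSE provides: one operation applied lane-by-lane across a fixed-width register. The function name and lane count here are illustrative only.

```python
# Plain-Python illustration of the packed (SIMD) arithmetic that SSE
# provides in hardware: a single operation applied lane-by-lane across a
# fixed-width register. Names are invented for illustration.

LANES = 4  # an XMM register can hold four 32-bit integers

def packed_add(a, b):
    """Lane-wise 32-bit addition with wraparound, in the spirit of PADDD."""
    assert len(a) == len(b) == LANES
    return [(x + y) & 0xFFFFFFFF for x, y in zip(a, b)]

result = packed_add([1, 2, 3, 0xFFFFFFFF], [10, 20, 30, 1])
# The last lane wraps around to 0, as 32-bit integer addition does.
```

Whether such work runs on SSE units or on a GPU, the point is the same: one instruction drives several independent lanes at once.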
February 9, 2009 7:59:02 PM

ph3412b07: "Not entirely true, you can't just steal x86 architecture and market it in your own products, which is why AMD has an agreement with Intel for x86 licensing. Nvidia is not about to resort to piracy. However acquiring a x86 license must be an issue of importance to Nvidia. Without it, they just may be at a dead end, or as mentioned in the article have to deal with loopholes/legal settlements."

But you didn't answer my question: what exactly would NVIDIA be stealing? "x86 architecture" isn't something that can be stolen; it's just a phrase. Did Intel patent their architecture (in which case it has already expired)? Did they copyright the architecture, and in that case, what exactly did they copyright? Remember, you patent ideas and copyright works.

I'm not saying NVIDIA should circumvent the law; I just don't understand what "law" is at work here.
Score
0
February 9, 2009 8:10:40 PM

hellwig: "But you didn't answer my question, what exactly would NVIDIA be stealing? 'x86 architecture' isn't something that can be stolen, it's just a phrase. Did Intel patent their architecture (in which case it already expired)? Did they copyright the architecture, and in that case, what exactly did they copyright? Remember, you patent ideas, and copyright works. I'm not saying NVIDIA should circumvent the law, I just don't understand what 'law' is at work here."


Intel owns x86. It is theirs. The instruction set was designed and patented by Intel. They can license it as they choose, and they are choosing not to license it to Nvidia.
Score
1
February 9, 2009 8:55:24 PM

cpu@cpucom: "nVidia will be VIA's high-end CPU collaboration. The 64 bit extension to x86, sometimes IA-64 or AMD64 or EM64T or 'x86-64', is still part of x86. I'd assume if nVidia is working on an x86 chip then it will have 64bit extensions and SSE/2/3 and possible 4 compatibility. Or maybe they will offload the SSE (SIMD) functions to their GPUs."


The problem is that the SSE2/3/4 and 3DNow! instructions are mixed up between Intel and AMD, so it would be impossible for Nvidia to get a license to produce processors with those high-end instructions. I doubt that Nvidia is going to make a standalone high-end processor that would compete with the likes of Intel/AMD. I would imagine that they would cross-license a low-end x86 processor, or create their own low-end processor to skirt any patents. They would combine it with their Ion platform on a single die, making it like AMD's Fusion processors, for netbooks and set-top boxes at a cheap price.
Score
0
February 9, 2009 9:08:13 PM

hellwig: "But you didn't answer my question, what exactly would NVIDIA be stealing? 'x86 architecture' isn't something that can be stolen, it's just a phrase. Did Intel patent their architecture (in which case it already expired)? Did they copyright the architecture, and in that case, what exactly did they copyright? Remember, you patent ideas, and copyright works. I'm not saying NVIDIA should circumvent the law, I just don't understand what 'law' is at work here."


While you're right that intellectual property patents (in this case, industrial property) expire after 20 years, many new patents are filed with each uarch. nVidia may be able to use instructions from the original 8086/8088 line, but each jump in technology has brought new patented CPU instructions and technology. Here is a list of a few of the patents, covering DSP, multiscalar designs, multiprocessor arrangements, bus commands, etc.:

http://www.wipo.int/tools/en/gsearch.html?cx=000395567151317721298%3Aaqrs59qtjb0&cof=FORID%3A11&q=x86#1290
Score
0
February 9, 2009 9:21:37 PM

tayb: "Intel owns x86. It is theirs. The instruction set was designed and patented by Intel. They can license it away as they choose and they are choosing not to license it away to Nvidia."


Not true. Patents only have a 20-year shelf life. The patent on the 486 processor will expire this year, so Nvidia could develop a processor based on the 486 as their own, as long as it doesn't violate any newer patents from AMD and Intel.
Score
1
Anonymous
February 9, 2009 9:52:35 PM

Sadly, it's more politics than technology and engineering, which is really too bad for the consumer. I can't believe the x86 architecture can still be encumbered after this long. I mean, this is a long, long time. Plus, who is to say that x86 is the best? If we were all using Linux, x86 would not be such an issue, and then nVidia would be free to JUST CREATE SOME SUPER AWESOME HARDWARE and not worry about old x86 compatibility.
Score
-1
February 9, 2009 10:34:13 PM

Yeah, it's going to be tough. They can make an x86 CPU, but the additional instruction sets like MMX and SSE are more likely to be legally protected. I wish companies could just make their own CPUs so we could finally get away from x86 and move onto something else, because the time is finally coming when we reach the limits of x86.
Score
-2
February 9, 2009 10:34:50 PM

pug_s: "Not true. Patents only have a 20 year shelf life. The patent to the 486 processor will expire this year. So Nvidia can develop a processor based on the 486 processor as their own, as long as it doesn't violate any patents from AMD and Intel."

AFAIK the patent is for the microcode, not the architecture.

I think they could clean-room it: have one team deconstruct the code and draw up a list of requirements, then forward that list to a second team, who has never seen the code, to build an emulator of sorts. It would never be as efficient at x86 instructions as a true x86 design...

And @Valkin, every program in use to date uses x86 code, and if we all used Linux it would still be an issue, as it only runs on x86-compatible hardware.

And it's not a case of making a second port of each program for the new hardware; you'd have to start from the base up.
Score
0
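The "emulator of sorts" mentioned above would interpret x86-style instructions in software, updating register state one instruction at a time. A toy sketch of the idea, using a tiny invented instruction subset rather than real x86 semantics:

```python
# Toy interpreter for a few x86-flavored instructions, in the spirit of
# the "emulator of sorts" idea above. The instruction subset and its
# semantics are invented for illustration; real x86 emulation is vastly
# more involved (flags, memory, addressing modes, exceptions, ...).

def value(regs, token):
    """A source token is either a register name or an immediate integer."""
    return regs[token] if token in regs else int(token)

def run(program):
    regs = {"eax": 0, "ebx": 0, "ecx": 0}
    for line in program:
        op, *args = line.replace(",", " ").split()
        if op == "mov":
            regs[args[0]] = value(regs, args[1])
        elif op == "add":
            regs[args[0]] += value(regs, args[1])
        elif op == "inc":
            regs[args[0]] += 1
    return regs

state = run(["mov eax, 5", "mov ebx, 7", "add eax, ebx", "inc eax"])
# state["eax"] is now 13
```

The efficiency point above is visible even here: every emulated instruction costs several host operations, which is why a software emulator can never match a native design.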
Anonymous
February 9, 2009 11:17:24 PM

Linux runs fine on platforms other than x86 or x64. For example, there are Linux distros for PowerPC/Cell, Sparc, IA64 and even MIPS IIRC. Because Linux is open-source, if you can find a C compiler you can have Linux on your architecture.
Score
1
February 10, 2009 2:38:12 AM

dragabain: "My two cents (I'm no expert and I don't work in IT) is that Nvidia is working on a hardware decoder to turn x86 commands into something its graphics cards can handle. That way they can throw 2, 3 or 4 cards in a computer and not need a processor. Also this makes it so they don't have to ask the community to recompile their programs for their arch."


I suppose that may be possible on future hardware, but GPUs really aren't built for the kinds of instructions you'd run on an x86 CPU. Even if they did it, it would be rather inefficient. Still, I suppose the overhead could be small on a GPGPU optimized for such a task. At the least they could build the Ion platform without Intel at all ^_^.
Score
0
Anonymous
February 10, 2009 4:59:59 AM

What I expect is that the GPU and CPU will merge into a single entity sometime in the future. Instead of putting multiple expensive cards in a PC, you could put in multiple of these futuristic CPUs.
Score
0
February 10, 2009 5:38:10 AM

I think this is a smart move for Nvidia. Intel has already stated that Larrabee is going to be a multi-core x86-based chip. Think of a chip with 20-80 Pentium-class cores running at 2 GHz that can all execute x86 instructions. If Larrabee takes off and developers start writing x86-based video drivers for it, that would leave out AMD/ATI and NVidia, as their chips could not execute those instructions. This is a ploy by Intel to steal NVidia's and ATI/AMD's thunder in the GPGPU arena, as both AMD/ATI and NVidia have their own proprietary APIs for the GPGPU functionality of their respective chips. It would make CUDA and Badaboom obsolete and force NVidia to redesign all their chips to execute x86 code so that Larrabee-optimized code runs on Nvidia's future chips, and Nvidia has a very long way to go to catch up with Intel's experience in optimizing compilers for x86. AMD/ATI already have experience developing and optimizing for x86 architectures.
Score
1
February 10, 2009 6:00:02 AM

If the rumor is true, Nvidia is going to face many issues getting into the CPU industry. One of the high entry barriers is that consumers want a CPU company to have a proven track record before purchasing its products. Both AMD and Intel have this already; a newcomer to the CPU market like Nvidia starts at a big disadvantage.
Score
-1
February 10, 2009 6:01:17 AM

LOL, will it be a dust buster? Or will the packaging fail and cost HP millions to fix?
Score
0
Anonymous
February 10, 2009 6:43:20 AM

At what point did the Inquirer become a credible news source worthy of being quoted? There is so much misinformation and so many completely wrong stories on that site that I'm stunned Tom's would consider this worth linking.

Recall this is the site that gave us "Reverse Hyperthreading", a claim that AMD was "dancing in the aisles" back in April '07 with a 3.0+ GHz K10 chip, a flawed Intel 45nm process that led to them developing 2 separate 45nm processes... and I can go on and on. So now you pick up an article based on conjecture, written by someone with a personal axe to grind against Nvidia, and call it news?
Score
2
February 10, 2009 7:18:24 AM

Most probably nVIDIA is doing its own "Larrabug", though I don't think they can succeed: they couldn't even manage true HT-conforming chipsets (so they had to "raise hands" before X58).

It would not be a great surprise if AMD is also creating its own "Larrahopper", but there the chances are much greater.
Score
-2
February 10, 2009 10:25:42 AM

you just never know what Intel can pull off....
Score
0
February 10, 2009 11:36:45 AM

I keep hearing these types of rumors about NVIDIA, though I have yet to actually see any hard evidence. When/if a product comes to market, then I'll check it out; until then it doesn't much matter.
Score
0
February 10, 2009 12:41:48 PM

Competing with Intel in the mainstream CPU area could be deadly... but those mobile parts may be possible. They already have very good mobile GPU chips, so a CPU for handheld devices is possible. You don't need all those instruction set extensions there, but it is not easy...
Score
0
February 10, 2009 1:20:45 PM

Let's assume the rumors are true: nVidia is making an x86-compatible CPU. Don't you think nVidia would research and plan, plan, plan ahead of time (even longer than the stated two-year development time) before attempting a design that is a fairly big undertaking?

Let's assume that the x86 patent has expired. Let's also assume that the "architecture" could be changed to become more like a GPU core while still able to run x86 code. Now imagine hundreds of tiny x86 cores that handle the basic functionality of Windows. I know there is the issue of MMX and SSE, but what if you could replace all of that with CUDA or OpenCL? You would completely avoid paying all of those licensing fees. Very advantageous.

Have a look at this article:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=307...

AMD has created a set of SSE5 instructions. The article talks about the weak points of the x86 architecture and how SSE instructions over the years have fixed these problems by creating more robust instruction sets. This leads me to believe that these instructions can be created by anyone who has an x86 license.

Let's assume that you CAN create your own instruction sets to extend the x86 architecture and thus interface with modern operating systems. After this assumption, it is pretty clear that thousand(s) of tiny x86 processors on a chip (scale may be off) could be a very, very powerful force on the CPU landscape. Now combine that chip with the raw number-crunching power of the GPU and you have a computing monster on your hands.

Disclaimer: Don't flame me for experimental thought. These are only ideas based on articles I read here. I am not a programmer. I am not an engineer. I am just a fan of technology.
Score
1
February 10, 2009 2:46:19 PM

Also, Nvidia may be banking on Intel stepping into the GPU arena and stepping on their patents. If Intel steps on their patents enough, they can just start making x86 parts; then if Intel sues, they counter-sue. Mutually assured destruction, kind of like Palm and Apple at the moment.

Also, with AMD's falling market share, they may have some "monopoly" leverage. Back in the day, you could only use AT&T/Ma Bell leased phones, but regulators forced them to allow aftermarket phones. Microsoft was forced to open up its APIs a bit as well. So who knows. It's all rumors (Charlie... the Inquirer...) at this point anyway.
Score
0
February 10, 2009 3:53:24 PM

exit2dos: "While you're right that Intellectual Property (in this case, it falls under Industrial Property) patents expire after 20 years, many new patents are filed with each uarch. nVidia may be able to use instructions from the original 8086/8088 line - but each jump in technology has granted new patented CPU instructions and technology. Here is a list of a few of the patents such as DSP, Multiscalar, Multiprocessor arrangements, Bus commands, etc: http://www.wipo.int/tools/en/gsear [...]"

Ah ha. OK, so the comment about needing an x86 license would be because they need a license for the extensions: SSE, MMX, 3DNow! (AMD), etc. I mean, I knew there was no way the original x86 architecture (or microcode or whatnot) was still covered by any patent. Depending on what they need it for, I don't see why NVIDIA couldn't create a 386 clone (which had 32-bit extensions) and simply update the architecture on their own to get whatever performance they need from it. Besides, their own GPUs are far better at multimedia, so why would NVIDIA need any of those extensions? Anyway, it's all rumor at this point, so I guess the "why" is the big question here.
Score
0
February 10, 2009 5:58:52 PM

is this a joke: "At what point did the Inquirer become a credible news source that is worthy of being quoted - there is so much misinformation and completely wrong stories on that site, that I'm stunned Tom's would consider this worth linking. Recall this is the site that gave us 'Reverse Hyperthreading', a claim that AMD was 'dancing in the aisles' back in Apr 07 with a 3.0+GHz K10 chip, a flawed Intel 45nm process that led to them developing 2 separate 45nm processes... and I can go on and on. So now you pick up an article based on conjecture, written by someone who has a personal axe to grind with Nvidia and call it news?"


Tom's didn't link it... A USER DID!
Score
0
February 10, 2009 6:20:42 PM

What is the point? It seems like reinventing the wheel. Intel and AMD already have very advanced designs based on the x86 architecture, and Via also makes x86 chips. How many CPUs of a given species can the market absorb, especially in such a seriously depressed economy? It would have to be some special chip, I'd say.
Score
0
February 10, 2009 7:12:08 PM

Intel is going to have 32nm parts out this year... NVidia is at least a year or two behind Intel technology-wise (they're at what... 45nm?), and that's a HUGE disadvantage even if they have a competitive processor design.

I don't think NVidia has a chance of entering the general-purpose CPU market competitively... they'd have to invest billions in fabs and chip development just to catch up, all at a time when most companies are seriously tightening their belts.

As someone said earlier, maybe a mobile part or a niche market, but the i7 has nothing to fear from anything NVidia is going to come up with in a reasonable time frame.
Score
0
February 10, 2009 8:52:26 PM

"...And @Valkin, every program in use to date uses the x86 code,"

Um... what? You mean everything written for Microsoft systems from MS-DOS up, excluding Windows CE and Mobile? There are still some PowerPC-based Apple machines out there that are perfectly usable and still run OS X 10.x, though software support for them in certain applications (like iLife) is waning.

"...and if we all used linux it would still be an issue, as it only runs on x86 compatible hardware."
You couldn't be more wrong. Linux has been ported to run on x86, MIPS, ARM, PowerPC, SPARC, 68k, and too many embedded processors to name... You can put it on both of the latest PlayStations, both Xboxes, the GameCube, Wii, DS, PSP, freakin' iPods...
Score
0
February 11, 2009 12:48:39 AM

How about that, put the CPU on the GPU!

I'm all for it...bring it!
Score
0
February 11, 2009 11:43:46 AM

You may see that sort of convergence someday... but in all likelihood it'll be Intel doing it (maybe working with someone else, maybe with a new GPU chipset of their own design).

According to Tom's, Intel is dropping $7 billion in the US this year getting their 32nm fabs up. NVidia would be at a serious disadvantage going up against that kind of firepower. I know they seem to be taking pot shots at the big guy, but IMO they'd be better served by protecting their home turf from the likes of ATI and keeping their eyes out for disruptive technologies.
Score
0
February 11, 2009 12:13:41 PM

cpu@cpucom: "nVidia will be VIA's high-end CPU collaboration. The 64 bit extension to x86, sometimes IA-64 or AMD64 or EM64T or 'x86-64', is still part of x86."




IA-64 isn't an extension of x86; it's a completely different architecture.
Score
0
February 11, 2009 9:01:25 PM

Anyone ever heard of CUDA?
If NVIDIA makes an x86 arch chip, it will most likely be awesome.
I can picture it now: quad 240-core chips doing AI.
Score
0
February 11, 2009 9:50:10 PM

CUDA is not a general-purpose tool... it's a specific library that plays to GPU strengths to accelerate highly parallel functions.

It says absolutely nothing about NVidia's ability to create a general-purpose processor, though if they decided to get into the DSP market they might be able to give it a go.
Score
0
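The "highly parallel functions" CUDA targets are ones where the same operation applies independently to every element, so each element could be mapped to its own GPU thread. Here is a plain-Python sketch of that shape of work (not actual CUDA code), using the classic SAXPY kernel as the example:

```python
# Scalar (CPU-style) loop vs. a data-parallel formulation of the same
# computation: y[i] = a * x[i] + y[i] ("SAXPY"). The per-element work is
# independent, which is exactly the shape GPU libraries accelerate.
# Plain Python for illustration only.

def saxpy_scalar(a, xs, ys):
    """One element at a time, in order -- how a single CPU core works."""
    out = []
    for x, y in zip(xs, ys):
        out.append(a * x + y)
    return out

def saxpy_parallel(a, xs, ys):
    """The same computation as an independent per-element map; on a GPU,
    each element would be handled by its own thread."""
    return [a * x + y for x, y in zip(xs, ys)]

assert saxpy_scalar(2.0, [1, 2, 3], [4, 5, 6]) == saxpy_parallel(2.0, [1, 2, 3], [4, 5, 6])
```

General-purpose code, by contrast, is full of branches and serial dependencies between steps, which is why this kind of acceleration says little about building a general-purpose CPU.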
February 12, 2009 10:30:03 AM

bsharp: "Any one ever heard of CUDA? If NVIDIA makes an x86 arch chip it will most likely be awesome. I can picture it now running quad 240 core chips doing AI."


Different chip/design/application. What's good at rendering might not be good as a "processor" for Windows, apps, etc.

It's like our brains: they're complex and powerful, yet can we calculate 3424344.433332 x 233 + 12 off the top of our head? Nope, but a calculator can. That doesn't mean the calculator can do anything else, like walk, balance, see, smell, feel, think, act, etc.; not even remotely possible!

As well, there are the legalities of entering the x86 market. I don't think it's possible, but it is possible to design a cheap and effective CPU, then pair it with some form of modified Linux to make a cheap netbook and other applications (set-top boxes with basic internet, handheld devices, etc.).

Nvidia has to do something about those losses they posted recently...
Score
0
Anonymous
February 23, 2009 6:35:47 PM

Nobody actually executes raw x86 anymore.

Both Intel and AMD use decoders to break x86 (a complex, long-winded instruction set) down into smaller, more easily managed internal operations, and those are what the CPU actually executes.

Intel tried to sue AMD back in the day because AMD's processors, the K6 and the like, did this conversion and were just as fast as Intel's offerings. When Intel lost the suit, they began doing the same thing. Processors haven't actually "processed" real x86 in years, but everything is still written against x86 assembly, while each manufacturer has its own proprietary internal language that the instructions are converted into for the processor to do its "work".

No doubt Nvidia would take the same approach, rather than license the real x86 language, which isn't very efficient, apparently.
Score
0
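The decode step described above can be sketched as a table lookup that expands each complex instruction into a short list of simpler internal micro-operations. The instruction names and micro-op format here are invented for illustration and bear no resemblance to any vendor's real internal format:

```python
# Sketch of decoding complex (CISC-style) instructions into simpler
# internal micro-operations, which is what the execution core actually
# runs. The table and micro-op format below are invented for illustration.

DECODE_TABLE = {
    # None marks the slot where the instruction's operand is substituted.
    "push": [("sub", "esp", 4), ("store", None)],
    "pop":  [("load", None), ("add", "esp", 4)],
    "inc":  [("add", None, 1)],
}

def decode(instruction):
    """Expand one single-operand instruction into its micro-op sequence."""
    op, operand = instruction.split()
    return [tuple(operand if field is None else field for field in uop)
            for uop in DECODE_TABLE[op]]

uops = decode("push eax")
# -> [("sub", "esp", 4), ("store", "eax")]
```

The point is that one externally visible instruction ("push") becomes several simple internal steps, and each vendor is free to choose its own internal steps as long as the visible x86 behavior is preserved.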