Closed

Patent Filing Reveals Nvidia May Build Tiny Computers

Last response: in News comments
November 2, 2011 5:08:01 PM

If old Radio Shack ads are any indication, old bulky desktops were known as "little computers" back in the day.

If this is now a "little computer", I have to wonder what the term will refer to in a decade or two.
Score
15
November 2, 2011 5:16:48 PM

It'll probably need to be under 50 bucks or no one will buy one.
Score
-9
November 2, 2011 5:21:02 PM

Would work well as a terminal for a root server :) 
Hope they can run Linux.
Score
8
November 2, 2011 5:27:02 PM

I could imagine having this thing plugged into a spare HDMI port on a typical monitor or television. With Wi-Fi and a USB port or two for keyboard/mouse (or Bluetooth for wireless ones), I could leave the typical desktop machine powered off and use this lil guy to surf the web and post on Tom's, for under $100. That could be a worthwhile long-term power-saving device. There are certainly plenty of other ways to achieve the same goals, but choice is always welcome. I wouldn't buy it, but it could be useful to some people.
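The long-term power-saving idea above is easy to sanity-check with a rough payback calculation; every figure below (desktop and tiny-PC wattage, usage hours, electricity price) is an illustrative assumption, not a measured value:

```python
# Rough payback estimate for replacing desktop browsing with a low-power box.
# All figures are illustrative assumptions, not measurements.
desktop_watts = 150      # assumed desktop draw while browsing
tiny_pc_watts = 10       # assumed draw of the tiny computer
hours_per_day = 3        # assumed daily browsing time
price_per_kwh = 0.12     # assumed electricity price, USD

saved_kwh_per_year = (desktop_watts - tiny_pc_watts) * hours_per_day * 365 / 1000
saved_usd_per_year = saved_kwh_per_year * price_per_kwh

print(f"Energy saved: {saved_kwh_per_year:.1f} kWh/year")                 # 153.3 kWh/year
print(f"Savings:      ${saved_usd_per_year:.2f}/year")                    # $18.40/year
print(f"Payback on a $100 device: {100 / saved_usd_per_year:.1f} years")  # 5.4 years
```

With those assumptions the device pays for itself in roughly five years, so the case rests more on convenience than on the electric bill.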
Score
10
November 2, 2011 5:34:43 PM

keyanf wrote: If old Radio Shack ads are any indication, old bulky desktops were known as "little computers" back in the day. If this is now a "little computer", I have to wonder what the term will refer to in a decade or two.


In one decade there will be such incredible changes; I don't think people realize what is coming and how soon it will be here. The very word "computer" may no longer have meaning ten years from now. IBM's Watson will be ordinary desktop-sized tech by then, owned by private individuals. In two decades I'm thinking personal android assistants will be available. The technological singularity is expected to arrive in 2045 if the current rate of technological acceleration continues.
Score
0
November 2, 2011 5:36:45 PM

This sounds like a great idea! I would buy one even if it only works for the web and documents.
Score
3
November 2, 2011 5:46:21 PM

xX_PEMDAS_Xx wrote: will work good as a terminal for a root server... hope they can run linux.


Forget it, your stupid Linux never had good hardware support and never will, since open source fools are stubborn and refuse to accept proprietary drivers easily. So unless nVidia will develop it for Linux from day 1 and support it (which is sooooo not likely!), expect a ton of issues. Linux users like them, it's why they don't need gaming - they play with their OS all the time instead.
Score
-9
November 2, 2011 5:46:53 PM

This could be very neat if you go to client sites to provide support or give demos. You can travel light, no more laptop to carry... But will this Tegra processor run PowerPoint or whatever software you want to run?


Score
5
November 2, 2011 5:49:10 PM

clonazepam wrote: I could imagine having this thing plugged into an available hdmi slot on a typical monitor or television. [...]


Got a netbook for all that already :D  I'd buy this thing just for the hell of it if it'd be cheap - I think this is what nVidia is hoping for. Don't see any real use for it right now.
Score
4
November 2, 2011 5:52:31 PM

Sounds similar to the $25 raspberry pi computer.
Score
10
November 2, 2011 6:00:45 PM

loomis86 wrote: in one decade there will be such incredible changes I don't think people realize what is coming and how soon it will be here. [...]


Let's not get ahead of ourselves. We've been prognosticating for several decades that a lot of technology, such as robots, flying cars, cheap space travel, etc. were only X years away, and many of those things are still a long ways away. The big technological leaps tend to be ones few were predicting.
Score
7
November 2, 2011 6:11:29 PM

Zagen30 wrote: Let's not get ahead of ourselves. We've been prognosticating for several decades that a lot of technology, such as robots, flying cars, cheap space travel, etc. were only X years away, and many of those things are still a long ways away. The big technological leaps tend to be ones few were predicting.


Thank you. So sick of this "ZOMG, in 10 years everything is gonna be SO different.."
Score
3
November 2, 2011 6:13:27 PM

Good lord, they are trying to patent the idea of putting all the components for a computer into a single box!
Score
12
November 2, 2011 6:18:26 PM

keyanf wrote: If old Radio Shack ads are any indication, old bulky desktops were known as "little computers" back in the day. If this is now a "little computer", I have to wonder what the term will refer to in a decade or two.


The Eniac, in 1946, could do something like 400 multiplications per second. It weighed 27 tons.

In the 90's supercomputers were getting close to 1 GFLOP. That's 2.5 million times better.

China had a computer performing at 2.5 Petaflops in 2010. That's another 2.5 million times better.

At this rate, I'd say the supercomputers of 2030 will be pretty good.


As for the computers you will buy for home... These days a GTX 560 claims 1075 GFLOPs. That's like the supercomputer of 20 years ago, but 1000 times faster and 1000 times smaller and many thousands of times cheaper. And that's the graphics card I did NOT buy after all, because I wanted something a little faster, LOL.

What will happen in another 20 years? At this rate, a mainstream card will be the size of a nickel, it will cost a nickel, and two or three of them in SLI will beat China's supercomputer of 2010.
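The jumps quoted above can be verified in a couple of lines; the FLOPS figures are taken from the post at face value, not independently checked:

```python
# Growth multipliers between the machines quoted in the post above.
eniac_flops = 400          # 1946, ~400 multiplications/sec (poster's figure)
super_1990s_flops = 1e9    # ~1 GFLOPS supercomputer
china_2010_flops = 2.5e15  # 2.5 PFLOPS, China, 2010

print(f"1946 -> 1990s: {super_1990s_flops / eniac_flops:.2e}x")       # 2.50e+06x
print(f"1990s -> 2010: {china_2010_flops / super_1990s_flops:.2e}x")  # 2.50e+06x
```

Both steps come out to the same 2.5-million-fold jump, which is the symmetry the post leans on (a later reply in this thread questions the Eniac figure itself).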

Score
1
November 2, 2011 6:45:51 PM

amk-aka-phantom wrote: Got a netbook for all that already. I'd buy this thing just for the hell of it if it'd be cheap - I think this is what nVidia is hoping for. Don't see any real use for it right now.


Bah, you missed the point: being able to browse the web on a 19-30" monitor at resolutions from 720p to 1080p, and type on a standard-size keyboard. If I had a netbook, I wouldn't look to use it at home. :p 
Score
2
November 2, 2011 6:46:00 PM

Zagen30 wrote: Let's not get ahead of ourselves. [...]

Well, one of the main reasons technology isn't evolving as fast as we thought it would is that there has been a bottleneck for the past 100 years: energy. It isn't that we don't know how to build flying cars, laser guns, or cheaper space travel; it's that electricity and fossil fuels are just too weak to power such devices. Nuclear energy, on the other hand, is just too dangerous to be used by consumers. Guaranteed, the second there is a new, powerful, reliable, and safe power source, technology will leap forward so fast we won't be able to keep up.
Score
7
November 2, 2011 6:47:11 PM

By the time 2045 rolls around, we'd better have Minority Report-type technology. :) 
Score
-1
November 2, 2011 6:52:42 PM

Marvell has been selling these for a while now. They're called Plug Computers. They start around $99.

http://www.marvell.com/solutions/plug-computers/

They're essentially cell phones without the LCD.
Granted, a Tegra 3 or Tegra 4 would make for a MUCH faster one.
Score
1
November 2, 2011 6:53:22 PM

ikyung wrote: Well, one of the main reasons technology isn't evolving as fast as we thought it would is because there has been a bottleneck for the past 100 years. [...]


Yeah. We have the technology to develop and build devices that could be sent up to the ozone layer and repair it; it's just financially out of reach, like many other things held back by power sources.

Anyway... it'd be nice to browse the web for under 50 watts total; that's power for the display, computer, interface devices like keyboard/mouse, modem, and router... Maybe someday... To the experts: how close are we, realistically? :D 

Maybe if I pump a bunch of hamsters full of growth hormones and get them to spin up an alternator... hmmm... Ok, I'm just going to stop here... lol
Score
0
November 2, 2011 6:53:49 PM

loomis86 wrote: in one decade there will be such incredible changes I don't think people realize what is coming and how soon it will be here. [...]


Not exactly. I mean, we came a long way by making things smaller and smaller, but we're coming to a point where we can't make them smaller, and we have to use more efficient materials.

The day we switch over to graphene, we will get 25-200 GHz off air cooling, but chips won't be smaller than what we currently have.

We could have bio chips by then... but for ethical reasons that tech will be on the wayside for many, MANY years.

Basically what I'm saying is that making them smaller is close to impossible. Hell, the way I see it, SSD tech as it is right now will only get to about 640GB for $100, assuming 7nm is close to the peak of how small it can go.
Score
1
November 2, 2011 8:41:40 PM

"The Eniac, in 1946, could do something like 400 multiplications per second. It weighed 27 tons.

In the 90's supercomputers were getting close to 1 GFLOP. That's 2.5 million times better.

China had a computer performing at 2.5 Petaflops in 2010. That's another 2.5 million times better."

Not to nitpick, but in the earliest top500 list I can find [6/1993], the #1 supercomputer was at 60 GFLOPS and as of their most recent list [6/2011], the #1 supercomputer is at 'only' 102,000 times that.

And http://boinc.berkeley.edu/talks/singapore_public.pdf has the Eniac listed at 50 KFLOPS, not 400 FLOPS, so the jump from 1946 to 1993 is probably more like 1.2Mx.
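The corrected multiplier works out as claimed; the 50 KFLOPS and 60 GFLOPS figures are as cited in the post (the BOINC slide and the 6/1993 Top500 list), taken at face value:

```python
# Check the corrected 1946 -> 1993 multiplier using the figures cited above.
eniac_flops = 50e3      # 50 KFLOPS per the BOINC slide
top1_1993_flops = 60e9  # 60 GFLOPS, #1 on the 6/1993 Top500 list

print(f"1946 -> 1993: {top1_1993_flops / eniac_flops:.1e}x")  # 1.2e+06x
```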
Score
1
November 2, 2011 8:52:14 PM

alidan wrote: not exactly, i mean we came a long way by making things smaller and smaller but we are coming to a point where we cant make them smaller, and we have to use more efficient materials. [...]



I don't believe it. They've been saying for 25 years that the end of Moore's law is just around the corner. But ya know what? The industry always finds a way over the wall and Moore's law keeps on keepin' on. IBM's Watson is a crazy insane achievement and there's no reason progress is going to stop there. You kiddies on this site just don't understand that a $35 programmable scientific calculator is more powerful than a $2500 personal computer was 30 years ago. A $400 smartphone is more powerful than a $100 million supercomputer was 40 years ago. The military has unmanned planes that will very soon need ZERO human instructions to do their job. There will be fully autonomous androids very soon, as in ten years. There's just no way it couldn't happen, short of complete destruction of modern human civilization. The progress to date has been achieved by an economy of 300 million or fewer people (America). Very soon, China and India will be equal to America in terms of technology and manufacturing, and they are over a billion people EACH. If you think that won't kick technological advances into a higher gear, you are delusional. China is already taking the lead in thorium reactors, which are the next generation of nuclear power.
Score
-1
November 2, 2011 9:15:37 PM

Or plug it into a smart phone.
Score
0
Anonymous
November 2, 2011 9:28:10 PM

"In volume production, Nvidia could push the bill of materials well below $100, which could make it the most affordable new computer on sale."
next to RaspberryPI (^__~ )
Score
0
November 2, 2011 9:40:35 PM

loomis86 wrote: I don't believe it. They've been saying for 25 years the end of moore's law is just around the corner. [...]


You are looking at it from the wrong perspective. 30 years ago... it's really hard to explain without a firm understanding of the tech back then, which I lack, but I can tell you this: they weren't hitting the limitations of what the tech could do, they were hitting the limitations of what they could manufacture. We, today, are getting close to the actual physical limitations of the tech.

Androids will most likely never happen. I won't go into much detail, but that tech will get put into the same realm as human cloning. We will get human-like robots that can do basic tasks, and we will get controllable human-shaped robots, but never an android.

And the reason China is going ahead with nuclear tech is partially need, and partially the douches in America who try to stop everything nuclear-related.

Size-wise, we can't get a whole lot smaller. It may be possible that a PC in about 15 years will be the size of two Blu-ray cases plus space for an HDD (or something smaller than the 2.5" or 3.5" drives we have now) and be about equivalent to today's, but that assumes everything scales, from the heat index to the size of everything else in the PC. At some point the PC will be so small that you can't build them anymore, and I have to believe that many people and places will fight that to the bitter end.
Score
1
Anonymous
November 2, 2011 9:46:58 PM

amk-phantom: Really, Linux doesn't have good hardware support? This isn't 2006; it's easier to find a computer that doesn't work well with Windows than one that doesn't work with Linux. And given that Linux is the premier ARM platform by a long shot, you can rest assured that support will be in the Linux kernel's Git tree long before there's any product on the shelf.
Score
1
November 2, 2011 9:56:05 PM

alidan wrote: we, today, are getting close to the actual physical limitations of tech.


Not really. We've only scratched the surface of 3D chips. There's a ton of technology in the pipe which can go for another 10 years easy before having to move to quantum computing.

I see batteries lagging heavily but even they have new materials/processes coming online in the next couple years.
Score
0
November 2, 2011 10:20:06 PM

alidan wrote: you are looking at it from the wrong prospective [...]


One of us does not know what the word "android" means, and I think it is you. ASIMO by Honda is an android... too expensive and too limited to be a truly useful personal assistant, but give it another ten years, bud. Ten years in the electronics field brings ORDERS OF MAGNITUDE improvements. That little ASIMO will have the brain of IBM's Watson in a decade and cost 1/1000th as much.
Score
-1
November 3, 2011 8:45:43 AM

Do any of the common interface ports at present have enough juice? Obviously USB 2.0 doesn't deliver anywhere near the 10 W needed to power such a thing, but do any other powered ports? If not, it'll take a while to catch on, as Nvidia would first have to convince Samsung and the rest to include a capable port.
Score
1
November 3, 2011 10:54:54 AM

How on earth can they get a patent for an existing product, the PC, that is simply smaller? Right now every patent seems to be for computers designed for a specific use, which isn't patent-worthy IMHO.
Score
0
November 3, 2011 1:05:06 PM

So, if nVidia can do this, where is the standard usb port on my phone?
Score
0
November 3, 2011 3:16:37 PM

jabliese wrote: So, if nVidia can do this, where is the standard usb port on my phone?


On the other end of the phone's usb cable =/
Score
0
November 3, 2011 3:21:00 PM

neiroatopelcc wrote: Do any of the common interface ports at present have enough juice? [...]


PoE+ (Power over Ethernet Plus) can supply about 25 W to the powered device.
You could always add more ports for more power.
It's already a standard, which is a plus: IEEE 802.3at-2009.

Score
0
November 3, 2011 4:08:11 PM

If they can make these things reasonably well and on the cheap side, I would buy one.
Why not build this type of computer into a mouse or a tiny fold-up keyboard? The trouble with these pocket-portable PCs is where you put the input devices.
Score
0
November 3, 2011 4:27:22 PM

loomis86 wrote: I don't believe it. They've been saying for 25 years the end of moore's law is just around the corner. [...]

Our drones will never become autonomous. The risk of hacking is too high, and it will remain that way until we all kill ourselves.
Score
-1
November 3, 2011 4:36:46 PM

Quote:
amk-phantom: Really, Linux doesn't have good hardware support? This isn't 2006, it's easier to find a computer that doesn't work well with Windows than it is with Linux. And being that Linux is the premier ARM platform by a long-shot, you can rest assured that it will be in the Linux Kernel's GIT tree long before there's any product on the shelf.


LOL! Of course Linux doesn't have good hardware support. HW acceleration on AMD APUs? Doesn't work. Dedicated graphics card support? Half-assed. Install Linux on any laptop with above-average hardware and chances are that most fancy multimedia devices, such as touch panels for volume control, hotkeys and the like, won't work. There are STILL no BIOS flash tools that can be run from the OS. Many peripherals (USB Ethernet adapters, USB 3G modems, some printers, etc.) don't work. This isn't 2006, this is almost 2012, and your Linux still sucks. And I work with it on a daily basis. Great OS for servers, useless for home.
Score
1
November 3, 2011 6:21:56 PM

loomis86 wrote: I don't believe it. They've been saying for 25 years the end of moore's law is just around the corner. [...]

You had me until you said that the USA was the only one creating technology. You forgot about all of the countries in Europe (which came up with the vast majority of scientific discoveries pre WW1), Canada, Australia, Japan, Korea, etc. The US is probably responsible for 30% of world scientific progress in the past 50 years.
Score
-1
November 3, 2011 6:31:46 PM

loomis86 wrote: One of us does not know what the word "android" means, and I think it is you. [...]


When I hear "android," I think of something that's hard to distinguish from a human. Right now the most advanced ones don't even look human, and the ones that do look human are little more than a Real Doll with a face that can move.

We will never create a true android like what you see in science fiction, because people will just never allow it to get that far.

Cazalan wrote: Not really. We've only scratched the surface of 3D chips. [...]

When I say physical limitations, I'm talking about size and scale... as in, there is a limit of about 6 nm or so that we can't get smaller than. I'll admit I'm not a physicist and don't know the possible ways around it, but I have yet to read anything promising that would lead to smaller tech.

Even when we do go into 3D chips (and we are just scratching the surface with them), I can see cooling them being a major problem; I can also see that it would allow us to make a chip smaller, sort of...
Score
1
November 3, 2011 7:11:12 PM

Yes, cooling is an issue for 3D chips, but when you look at a cube you have 6 sides to apply cooling to. One will have pins, but that leaves 5 for heat sinks.

Think of a one-floor ranch house.
Now think of a ten-floor hotel.

Both have construction, energy, and cooling challenges. That didn't stop people from making buildings with 160 floors.

There are many advantages to 3D, especially memory efficiency. A server CPU can use 50-100 W just in the DDR3 memory interface, because data has to travel several inches across a PCB. Stack the memory on the CPU and you can cut that by 10-100 times, which also reduces the heat.
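The interface-power claim can be put into rough numbers. The picojoule-per-bit figures below are order-of-magnitude assumptions of the kind often quoted for off-chip DDR3 versus die-stacked memory, not measurements of any specific part:

```python
# Back-of-envelope: memory interface power at a fixed bandwidth.
# Energy-per-bit figures are order-of-magnitude assumptions.
def interface_watts(gigabits_per_sec, pj_per_bit):
    # watts = (bits/s) * (joules/bit)
    return gigabits_per_sec * 1e9 * pj_per_bit * 1e-12

bandwidth = 1000  # Gbit/s of memory traffic (assumed)
ddr3_pj = 50      # assumed pJ/bit over inches of PCB
stacked_pj = 5    # assumed pJ/bit through a die stack

print(f"Off-chip DDR3: {interface_watts(bandwidth, ddr3_pj):.0f} W")    # 50 W
print(f"Stacked:       {interface_watts(bandwidth, stacked_pj):.0f} W")  # 5 W
```

A 10x gap in energy per bit turns 50 W of interface power into 5 W at the same bandwidth, the low end of the 10-100x range claimed above.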
Score
1
November 3, 2011 11:20:06 PM

If I'm not wrong, wasn't there already a system much like this?
Score
0
November 4, 2011 1:30:31 AM

Cazalan wrote: Yes cooling is an issue for 3D chips but when you look at a cube you have 6 sides to apply cooling. [...]


Yes, think of it like a cube: it has 6 sides to cool, but those sides will be the coolest part; think of the center. I don't think we could currently make a 3D chip without it being far too complex to cool correctly; Intel is just scratching the surface with theirs. It will probably take underclocked graphene chips before 3D is a viable option: considering what GHz they can push on air, the center may never get hot enough fast enough to do damage. But look at what happens to a current CPU if it's not cooled 24/7.

Topics like that interest me; I really wish I knew more about it.
Score
1
November 4, 2011 3:53:11 AM

Nice to see some people understanding that atom size is the ultimate limiting factor, AND that liquid cooling will soon be a must on desktop CPUs/GPUs because of ever-increasing power density. There are numerous other factors involved, like quantum tunneling, increasing fab costs, etc.


Score
0
November 4, 2011 9:51:12 AM

CyberAngel wrote: Nice to see some people understanding that atom size is the ultimate limiting factor [...]


I don't think water cooling will ever be a must. Correct me if I'm wrong, but doesn't water cooling just attach a piece of metal to the chip and pump water through it instead of attaching a fan?

You may be thinking of oil immersion, though, but even that wouldn't be able to cool a 3D chip, at least not the way I'm thinking.

What's more likely to happen is that we find ways to use more of the power we put into chips and reduce their power footprint, reducing or eliminating the need for cooling entirely.

Fabrication costs can be marginalized, because once we get down to about the 6 nm level there will be no real reason to completely overhaul a fab plant again... at least I don't think so; I may be thinking of this wrong. It will cost money to get there, but the benefits outweigh the negatives, and worst come to worst, I could see military funding going toward building these plants.
Score
0
Anonymous
November 24, 2011 5:04:38 PM

I think if NVIDIA plays its cards right, this could be a new competitor for home entertainment, like consoles. If they team up with Steam and actually put together some impressive specs, and it's upgradable like a PC, we could see the end of consoles as they are today: overpriced, outdated technology that mostly just plays games, most of which aren't very good.
Score
0