Nvidia Tegra K1 In-Depth: The Power Of An Xbox In A Mobile SoC?

January 6, 2014 8:01:55 AM

"Approx. Comparison To Both Last-Gen Consoles"
Why don't you update the graph title to "Prev-Gen Consoles"? The latest consoles have 1.23 TFLOPS (Xbox One) and 1.84 TFLOPS (PS4), so the performance gap to this K1 chip is huge. It looks like Nvidia will lose another battle...
Score
-12
January 6, 2014 8:12:23 AM

azzazel_99 said:
Where is the Maxwell stuff? Did they not even address it yesterday? WTF


Most likely they will talk about it when the actual GPU is closer to release.
Score
0
January 6, 2014 8:38:30 AM

Quote:
At least in theory, Tegra K1 could be on par with those previous-generation systems ...


AMD does it. Intel, too. But ...

When it comes to over-the-top hype, embellishment and hyperbole, nVidia is the king.

Score
8
January 6, 2014 10:04:10 AM

@ Rupert Jr

Well, multiply it by 3x and add higher memory bandwidth and you are there:
1.5x from Maxwell and 2x from more cores (28nm to 16-14nm),
plus Hybrid Memory Cube.
All this in 2015.
Score
2
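ZolaIII's projection above is just two multipliers applied to the K1's announced peak; a sketch of the arithmetic (all three inputs are the comment's assumptions, not confirmed specs):

```python
# Hypothetical projection from the comment above: scale the K1's
# announced ~365 GFLOPS FP32 peak by an assumed 1.5x architectural
# gain (Maxwell) and 2x core count (28nm -> 16-14nm shrink).
k1_gflops = 365.0
arch_gain = 1.5      # assumed per-core Maxwell improvement
core_gain = 2.0      # assumed core-count increase from the node shrink
projected = k1_gflops * arch_gain * core_gain
print(projected)     # 1095.0 GFLOPS, near the Xbox One's 1.23 TFLOPS cited earlier
```

Whether those two multipliers materialize by 2015 is exactly what the replies below dispute.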
January 6, 2014 10:11:32 AM

@ZolaIII
but what you are assuming does not exist today, and maybe won't for years...
Where will the other chips be by that time?
Score
-2
January 6, 2014 10:48:00 AM

"in fact, the first Maxwell-powered discrete GPUs are expected in the next few weeks"

Excellent.
Score
2
January 6, 2014 11:07:55 AM

RupertJr said:
It looks like Nvidia will lose another battle...

Well, they are right on at least one thing: 2014's SoCs are now about on par with high-end components from ~7 years ago or mid-range gaming PCs from ~5 years ago and are managing to do so on a 2-3W power budget, which is rightfully impressive IMO.

If SoCs continue improving at this pace while PCs remain mostly stagnant, the performance gap between SoCs and mainstream PCs will be mostly gone by the time 2015 is over.
Score
5
January 6, 2014 11:27:52 AM

I'm assuming this SoC will run at around 15 watts. With 192 CUDA cores and LPDDR3, your max output will probably be somewhere around 180 GFLOPS. Yes, it CAN do the same effects as consoles and PCs, so what? If the GameCube had the DX 11.2 API running on it, it could as well, but obviously it couldn't put much on the screen. It will be the same deal with the K1.

It's cool, I like it, don't get me wrong, but the stuff they're saying, though technically correct, is misleading to about 90% of the market. People are going to think they have PS4s in their hands when in reality they have half an Xbox 360 with up-to-date APIs running on it.
Score
0
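For what it's worth, the peak-FLOPS figures being traded in these comments come from a simple formula: CUDA cores × 2 FLOPs per core per cycle (one fused multiply-add) × clock. A sketch, with the clock speeds as assumptions chosen to reproduce the quoted numbers:

```python
def peak_gflops(cuda_cores, clock_ghz, flops_per_core_per_cycle=2):
    """Theoretical FP32 peak: each CUDA core retires one FMA (2 FLOPs) per cycle."""
    return cuda_cores * flops_per_core_per_cycle * clock_ghz

# 192 Kepler cores at an assumed ~950 MHz reproduces the ~365 GFLOPS
# figure from the article; an assumed ~470 MHz would give the ~180 GFLOPS
# estimate in the comment above. Neither clock is an official spec.
print(peak_gflops(192, 0.95))   # ~364.8 GFLOPS
print(peak_gflops(192, 0.47))   # ~180.5 GFLOPS
```

Peak numbers ignore memory bandwidth and thermal throttling, which is why the same chip supports both estimates depending on the clock you assume.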
January 6, 2014 11:48:04 AM

Ah crap, that's what I get for posting before reading everything -_-. Well, in retrospect, I'm very surprised! 365 GFLOPS peak is amazing for what it's running off of! However, it really doesn't come close to competing with any current-gen console or base gaming PC. I'd love to have an ultrabook running around 800 GFLOPS in the near future though :) 
Score
5
January 6, 2014 11:49:02 AM

Factor of 10??

Would it not be a factor of 100?
(200W/2W)
Score
0
January 6, 2014 12:49:09 PM

Nvidia forgot it rated the PS3's GPU at 2 TFLOPS to market it to Sony.
Score
2
January 6, 2014 1:54:35 PM

azzazel_99 said:
Where is the Maxwell stuff? Did they not even address it yesterday? WTF



That says a lot right there, Maxwell is far away!
Score
0
January 6, 2014 1:56:10 PM

azzazel_99 said:
Where is the Maxwell stuff? Did they not even address it yesterday? WTF



True, but the K1 at the right price can be a force in the SoC arena.
Score
0
January 6, 2014 2:15:32 PM

Another thing, guys: no mention of current products, Tegra 4/4i... lol, all I see is Tegra 5/6. Nvidia has to stop this nonsense; they are all over the place. So I guess Tegra 4 is dead in the water, not that it was being adopted well anyway. Nvidia is good at building Tegra hype but can never deliver. Hopefully they do not mess up the K1!
Score
0
January 6, 2014 3:15:31 PM

photonboy said:
Factor of 10??

Would it not be a factor of 100?
(200W/2W)


It would be, except they take into account that there is only one SMX rather than eight. So it would really be (25W/2W).

On a side note, if a company like Epic Games were to go back and do a thorough port of UDK 3, I believe it would be reasonable for a publishing company like 2K to either have its developers go back and recompile the games, or hire new ones to do so, and we might see a market emerge for the K1.

The dream scenario for me would include Sony adding DualShock drivers to Android, and being able to play Borderlands on a tablet with it. Now if only there were someone to throw money in their faces...
Score
0
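The factor-of-10 correction above is easy to check: a ~200 W desktop Kepler card carries eight SMX units while the K1's GPU is a single SMX, so the fair comparison divides the desktop figure by eight first (both wattages are the thread's rough figures, not official TDPs):

```python
# Why "factor of 10" rather than 100: compare one SMX to one SMX,
# not a whole 8-SMX desktop card to a single-SMX mobile GPU.
desktop_tdp_w = 200.0    # full desktop Kepler card (thread's estimate)
k1_gpu_w = 2.0           # K1 GPU power budget (thread's estimate)
smx_desktop, smx_k1 = 8, 1

naive = desktop_tdp_w / k1_gpu_w                       # 100x, whole chip vs whole chip
per_smx = (desktop_tdp_w / smx_desktop) / k1_gpu_w     # 12.5x, one SMX vs one SMX
print(naive, per_smx)
```

So the article's "factor of 10" is the per-SMX ratio, rounded.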
January 6, 2014 3:18:10 PM

redeemer said:
Another thing, guys: no mention of current products, Tegra 4/4i... lol, all I see is Tegra 5/6. Nvidia has to stop this nonsense; they are all over the place. So I guess Tegra 4 is dead in the water, not that it was being adopted well anyway. Nvidia is good at building Tegra hype but can never deliver. Hopefully they do not mess up the K1!


Actually, the article mentioned that the 4i is still going to be released in the first half of 2014 as a solution for phones. Makes you wonder why they bother, though, after reading the article.
Score
2
January 6, 2014 10:39:27 PM

redeemer said:
Nvidia has to stop this nonsense; they are all over the place. So I guess Tegra 4 is dead in the water, not that it was being adopted well anyway.

Tegra 3 launched several months late, just before everyone else released their next-gen SoCs, so it was too little, too late to gain much market share. Tegra 4 was largely in the same boat, and I'm guessing many device manufacturers may have shied away due to the lack of unified shaders and GPGPU too.

If Tegra 5/K1 is delivered on schedule and is priced right, it should have a decent shot at the market. That's a lot of ifs against a sub-par track record, so Chris' implied skepticism (shared by many people in these comments, myself included) is very well warranted.

Time will tell.
Score
0
January 7, 2014 3:30:52 AM

After the obvious "but can it run Crysis" reference, the main thing I don't understand is how the writer expects Nvidia to bring great games over to tablets. The limitations compared to PCs are so massive when it comes to keyboards/mice that it's simply not going to work. Sure, you can carry a pad with you, but then again, it's more stuff to carry around, and the main reason people get tablets is not to have to carry around heavy laptops. Also, don't forget that size matters when it comes to screens. Take your average good PC game and put it on a tablet. Now imagine playing it on a small screen, with no keyboard or mouse, with crappy sound and the limitation of a battery. There is a reason people keep going to the cinema even if they have a 40-inch TV at home.
Score
2
January 7, 2014 10:15:40 AM

This was the funniest demo conference ever. I was watching it live on Twitch and the chat was ridiculous. #GTXTRASHCAN #AWKWARDCLAPS
Score
0
January 7, 2014 11:18:43 AM

InvalidError said:
redeemer said:
Nvidia has to stop this nonsense; they are all over the place. So I guess Tegra 4 is dead in the water, not that it was being adopted well anyway.

Tegra 3 launched several months late, just before everyone else released their next-gen SoCs, so it was too little, too late to gain much market share. Tegra 4 was largely in the same boat, and I'm guessing many device manufacturers may have shied away due to the lack of unified shaders and GPGPU too.

If Tegra 5/K1 is delivered on schedule and is priced right, it should have a decent shot at the market. That's a lot of ifs against a sub-par track record, so Chris' implied skepticism (shared by many people in these comments, myself included) is very well warranted.

Time will tell.


Is GPGPU really that important for Android? AFAIK, for the latest Snapdragon 805 Qualcomm is already dropping OpenCL support for the chip.
Score
-1
January 7, 2014 3:08:33 PM

Where is the development in the laptop market? It seems to have stood still, with price points remaining fixed. With all this development, can't someone make a nice, cheap gaming laptop?
Score
0
January 7, 2014 7:43:32 PM

dragonsqrrl said:
"in fact, the first Maxwell-powered discrete GPUs are expected in the next few weeks"

Excellent.


Not so good; those will be the low-end models running on 28nm (the 750 isn't high end). The real one comes much later on 20nm, where we'll get the REAL replacement for the 780 Ti. Not a complaint about NV; they are just waiting on TSMC like everyone else. I can't wait for NV to move something over to Samsung, as Samsung hasn't made Apple late once in years, and NV already got test wafers from them and taped out something at 20nm in March 2012, IIRC. I'm thinking June or later, but that's a guess based on 20nm risk production, high-volume production starting this month (16nm just hit risk), etc. The first 20nm chips from TSMC sound like they are going to Apple (or at least SoCs, they say), at 165K wafers too. That seems like a lot of 20nm in 2014 going directly to Apple, if these numbers are true (based on what TSMC says their product mix of 20nm vs. 28nm will be).
http://www.patentlyapple.com/patently-apple/2013/12/tsm...
Pretty much the same story everywhere, and a lot say it's ramping for Apple. So maybe Apple in the first few months, then moving to the rest as the year goes on (June?). It's far easier to roll a new process out on an 80-100mm² SoC than a 540mm² GPU (no idea of Maxwell's size; just guessing it's roughly Kepler-sized).
Score
0
January 7, 2014 10:34:46 PM

I don't see the performance gap between PCs and mobile devices closing, but the fact is my laptop with 384 Kepler cores (GT 650M) has good enough gaming that I haven't bothered to repair my desktop (GTX 570). If a mobile device with 192 Kepler cores can pull off 40% of my laptop's performance, then the game has changed completely. While PCs may be capable of processing 5x the graphics of a mobile device at the same price at that point (compare maybe a $500 gaming PC to a smartphone?), there's a limit to useful hardware. I have a GTX 570 and can't think of any reason to upgrade it, because it maxes out most games and plays BF3 on Very High. While more powerful gaming devices exist, most people don't have discriminating enough eyes to care about the difference between GTX 570 and GTX 780 Ti performance. If Maxwell can push mobile performance a bit further (to the GTX 570 vs. GTX 780 Ti comparison above), then all of a sudden PC performance won't matter.
Score
0
January 8, 2014 7:34:15 PM

AMD came to CES with a working MULLINS chip... the equivalent of the overhyped Nvidia 64-bit chip that will NOT be out until a YEAR from NOW! AMD will have MULLINS in the marketplace in Q2. Nvidia will have the equivalent in the marketplace in 2015. AMD has been working with 64-bit chips for over a decade! Nvidia has NEVER produced a 64-bit chip!
Score
0
January 9, 2014 3:52:08 PM

Quote:
dragonsqrrl said:
"in fact, the first Maxwell-powered discrete GPUs are expected in the next few weeks"Excellent.
Not so good; those will be the low-end models running on 28nm (the 750 isn't high end). The real one comes much later on 20nm, where we'll get the REAL replacement for the 780 Ti. Not a complaint about NV; they are just waiting on TSMC like everyone else.
It's possible we won't see desktop SKUs until 20nm. Based on the number of AIOs and notebooks at CES with undisclosed "next-gen discrete graphics," I suspect we'll probably see mobile versions of Maxwell come to market first. That would align with Nvidia's architectural emphasis on mobile going forward. Perhaps they think it's beneficial to release Maxwell on the current lithography to put more focus on its primary architectural enhancement, which could very well be power/performance efficiency.
Score
0
January 9, 2014 3:58:19 PM

Bottom line is NOT the process node; it's that Nvidia will NOT have a 64-bit chip until 2015.

ALL AMD chips are 64-bit!

You can compare AMD's MULLINS, which will be in computers in Q2, to Nvidia's NON-existent chip when it comes out in 2015.

The 32-bit chip from Nvidia will get its clock cleaned by AMD!

Furthermore, AMD will compete for the MSFT business, not the hyper-competitive Android business.

Score
0
January 9, 2014 10:05:44 PM

Ken Luskin said:
AMD came to CES with a working MULLINS chip... the equivalent of the overhyped Nvidia 64-bit chip that will NOT be out until a YEAR from NOW! AMD will have MULLINS in the marketplace in Q2. Nvidia will have the equivalent in the marketplace in 2015. AMD has been working with 64-bit chips for over a decade! Nvidia has NEVER produced a 64-bit chip!


Ken Luskin said:
Bottom line is NOT the process node; it's that Nvidia will NOT have a 64-bit chip until 2015.

ALL AMD chips are 64-bit!

You can compare AMD's MULLINS, which will be in computers in Q2, to Nvidia's NON-existent chip when it comes out in 2015.

The 32-bit chip from Nvidia will get its clock cleaned by AMD!

Furthermore, AMD will compete for the MSFT business, not the hyper-competitive Android business.



Before declaring AMD the winner, they need design wins first. What happened to Temash?
Score
0
January 10, 2014 5:53:00 AM

"Could you imagine the gaming performance of your old Xbox in a tablet form factor?" Yes. It would be called the Ouya 2: what that thing should have been from the get-go.
Score
0
January 10, 2014 8:31:54 AM

renz496 said:
Ken Luskin said:
AMD came to CES with a working MULLINS chip... the equivalent of the overhyped Nvidia 64-bit chip that will NOT be out until a YEAR from NOW! AMD will have MULLINS in the marketplace in Q2. Nvidia will have the equivalent in the marketplace in 2015. AMD has been working with 64-bit chips for over a decade! Nvidia has NEVER produced a 64-bit chip!


Ken Luskin said:
Bottom line is NOT the process node; it's that Nvidia will NOT have a 64-bit chip until 2015.

ALL AMD chips are 64-bit!

You can compare AMD's MULLINS, which will be in computers in Q2, to Nvidia's NON-existent chip when it comes out in 2015.

The 32-bit chip from Nvidia will get its clock cleaned by AMD!

Furthermore, AMD will compete for the MSFT business, not the hyper-competitive Android business.



Before declaring AMD the winner, they need design wins first. What happened to Temash?


AMD is NOT banking on MULLINS the way Nvidia is promoting the K1....

AMD is successfully (belatedly) shrinking the energy draw of their APUs for small form factors. But competing against Qualcomm, Apple, Intel, Samsung, and Nvidia in ultra-mobile is NOT AMD's focus.

AMD is focused on devices that require high-end computing performance, such as consoles, desktops, servers, etc.

ALL the AAA games are being OPTIMIZED for AMD!!!
Score
-1
January 11, 2014 8:36:28 AM


Quote:
Before declaring AMD the winner, they need design wins first. What happened to Temash?


You mean, like this ??



:lol: 

Score
0
January 12, 2014 2:36:31 AM

cats_Paw said:
After the obvious "but can it run Crysis" reference, the main thing I don't understand is how the writer expects Nvidia to bring great games over to tablets. The limitations compared to PCs are so massive when it comes to keyboards/mice that it's simply not going to work. Sure, you can carry a pad with you, but then again, it's more stuff to carry around, and the main reason people get tablets is not to have to carry around heavy laptops. Also, don't forget that size matters when it comes to screens. Take your average good PC game and put it on a tablet. Now imagine playing it on a small screen, with no keyboard or mouse, with crappy sound and the limitation of a battery. There is a reason people keep going to the cinema even if they have a 40-inch TV at home.


You don't seem to understand the bigger picture. They are coming for your desktops/notebooks/servers. Tablets/phones are just a foot in the door to the gaming/app worlds of desktops etc.
http://www.brightsideofnews.com/news/2014/1/7/ces-attac...
"Android K1 Desktop"... It has a mouse and keyboard ;) and is 28in. What small screen? ;) This thing has a screen 4in bigger than my 24in PC monitor. Also note the new K1 reference tablet has 4GB, so it's very workable for devs, and I expect 6-8GB announcements from many in 2015 as we basically move these to be called desktops etc. PCIe is in there too, so I'd guess we may be seeing desktops at some point with a discrete card in them and a 500W-1000W PSU. That is the point, after all: get Intel's market share without needing x86 (why else do you make a custom CPU?).

"ThinkVision 28 4K2K is a 28” All-In-One with a 10-point touchscreen, running Google Android 4.4. This is as close as Google has come to becoming a desktop product and should the price be what we were told it will be, both iMac and other AIO’s are in trouble. Miracast technology is supported, but the connectivity features of this PC are what is expected of a 2014 device: WiFi, Bluetooth 4.0 and NFC. Can Google become a desktop OS vendor courtesy of Tegra K1 and Lenovo? That remains to be seen."

So they are clearly aiming at desktops, which is what we knew Denver was for all along. All Tegras before this were just delays and getting noticed in mobile until we got to this point (as PCPer points out, they just had to suffer through those first revs). I'm a bit surprised it wasn't announced with a modem inside, but maybe we just don't know enough yet, or maybe that will be a different model. Or they just plan the T4i for that this year, with a 20nm T5i and Denver probably taking over in 2015.

Make gaming more prominent on mobile and get good games over there while killing consoles. Then get apps going on 64-bit so some real stuff starts getting made for real work as you move to desktops. You kill, or at least wound, x86/Windows/DirectX by helping Android/Google/Valve take over OS duties. I think we're talking just a wound, but look at RIM, Nokia, Palm etc. and maybe you can just get killed... LOL. At that point, assuming a weaker Wintel model (sales keep going down; Chromebooks alone now account for 20% of all laptop sales. See the trend? PC sales off 6.79% again this Q4), you can maybe pull the old Intel move and make your GPUs not work on x86. NV denying its GPUs to Intel machines is just like Intel screwing them out of chipset licenses, right? Surely this is Jen's goal of assaulting x86. If all new games by then are OpenGL, WebGL, OpenCL, Java, HTML5 (6?) etc., and they all run on Android/ARM etc. (major portability choosing these APIs; all PS3/PS4 games are made with OpenGL, so those are easy to port already), on top of the 2.5B units being sold by then if Gartner, IDC etc. are correct for 2016, you can force a GAME changer (pun intended). It took a few weeks to get Serious Sam 3 and Trine 2 working on the K1, and most of that time was spent programming the controls. So porting itself is easy, as Unreal Engine 3 showed by being ported to mobile in four days.

We are already seeing 28in monitors, keyboards, mice, gamepads. The only things missing are a discrete GPU in a box and the next wave of games aimed at the K1 etc. Pile on some great apps as ARM moves to 64-bit/4GB+, and why do I need Windows (well, I'm in IT so I'll be forced to hold onto Windows/Office longer, but I mean for the rest of us)? Nvidia's only way (all ARM vendors' only way, too) into x86's market is to make a complete ecosystem around ARM/Android/Linux (games plus apps that actually use 4GB+ like a regular x86 desktop). There is a LOT of revenue in x86 ($55-60 billion, of which Intel takes home about $11B in profits). I would call ARM vendors stupid if the goal wasn't getting into this $60B TAM; ARM's TAM is far smaller. I think NV has the best shot, owning the best GPU IP of the current ARM group and having tons of game experience, drivers, and work with devs... But get ready for an ARMY of ARM chips from many vendors aimed directly at x86.

I can see many use cases. In your case, say a 10in tablet with a 16nm SoC that's easy to take with you; then you come home and dock it to a 28in screen with keyboard/mouse, and maybe a GPU box sitting there hooked up early on for extra power outside the tablet, eventually incorporated into 500W boxes at some point. Clearly the AIO example they just put out isn't for toting around much. But you must assault the desktop to get these keyboard/mouse apps to up their game a bit: get a full Adobe suite, 3ds Max etc. running in a few years when SoCs are 4GHz and in machines with 8-16GB. Google, NV etc. (someone, or a few big names) need to help fund a few of these big apps, either made from the ground up for ARM or ported. You don't need every app, just the most-used ones, so people feel less pain moving to non-Wintel boxes. Everyone plays games; that's why they come first. Next stop: high-use apps like Adobe and of course an Office replacement (Google apps packaged up properly and aimed at HOME users). The rest will come if the memory is there, the perf is there (think 16nm Denver rev 3 or something), 64-bit, and obviously a HUGE TAM, already over 1.2B units yearly compared to 380M PC units.
Score
1
January 15, 2014 4:36:25 AM

There is one big problem with the screens on phones/tablets sporting the Tegra K1: they use native resolutions way too high to make 192 CUDA cores shine, and these cores won't be clocked like those on discrete cards. This is still nothing short of amazing... a phone equipped with a low-end desktop graphics core :-). Frankly, rendering at 720p, it could theoretically run most of today's PC/console games (if they were OS/architecture compatible). Skepticism is correct here because it's not a GPU market; big performance numbers don't guarantee good sales.

somebodyspecial said:
cats_Paw said:
After the obvious "but can it run Crysis" reference, the main thing I don't understand is how the writer expects Nvidia to bring great games over to tablets. The limitations compared to PCs are so massive when it comes to keyboards/mice that it's simply not going to work. Sure, you can carry a pad with you, but then again, it's more stuff to carry around, and the main reason people get tablets is not to have to carry around heavy laptops. Also, don't forget that size matters when it comes to screens. Take your average good PC game and put it on a tablet. Now imagine playing it on a small screen, with no keyboard or mouse, with crappy sound and the limitation of a battery. There is a reason people keep going to the cinema even if they have a 40-inch TV at home.


You don't seem to understand the bigger picture. They are coming for your desktops/notebooks/servers. Tablets/phones are just a foot in the door to the gaming/app worlds of desktops etc.


Say what? Servers? Desktops? They are never coming for those, by simple physical constraints... You know, they were able to put a lowly clocked 192-CUDA-core chip in a mobile device. Consider that one desktop GPU has 10 times that count at twice the clock, making it a 20-times more powerful device, and you can easily fit two into a regular desktop.

Here is a "simple chart" ;-) for you to visualize the processing power difference
Nvidia tegra K1 - *
2 GPU Desktop - ****************************************

Tablets are hitting hard, but not at the PC/workstation market so much as the laptop market... but they still have the drawbacks of compatibility, a smaller application pool, and, to say the least, fragile keyboard docks. 10in is nice in your palm but not at full arm's length; that's why they'd have a hard time against 15in laptops, and they're no match for the comfort of a full-sized 17-19in laptop (that's what I'd buy for work, not a 10in with a tiny keyboard). What's really interesting is the PadFone, because it offers three devices for 1.5 times the price.
Score
0
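cypeq's resolution point above can be quantified: per-frame shading work scales roughly with pixel count, and 2014's high-DPI tablet panels push several times the pixels of 720p. A sketch (the panel list is illustrative):

```python
# Pixel budgets relative to 720p: shading cost per frame scales
# roughly linearly with pixel count, which is why a high-DPI tablet
# panel eats into the K1's headroom.
panels = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "WQXGA": (2560, 1600),   # typical high-end 2014 tablet panel
}
base = 1280 * 720
for name, (w, h) in panels.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP, {px / base:.2f}x the pixels of 720p")
```

A WQXGA panel is roughly 4.4x the pixels of 720p, so rendering at native resolution divides the effective per-pixel budget by the same factor.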
January 15, 2014 10:24:21 AM

cypeq said:
There is one big problem with the screens on phones/tablets sporting the Tegra K1: they use native resolutions way too high to make 192 CUDA cores shine, and these cores won't be clocked like those on discrete cards. This is still nothing short of amazing... a phone equipped with a low-end desktop graphics core :-). Frankly, rendering at 720p, it could theoretically run most of today's PC/console games (if they were OS/architecture compatible). Skepticism is correct here because it's not a GPU market; big performance numbers don't guarantee good sales.

somebodyspecial said:
cats_Paw said:
After the obvious "but can it run Crysis" reference, the main thing I don't understand is how the writer expects Nvidia to bring great games over to tablets. The limitations compared to PCs are so massive when it comes to keyboards/mice that it's simply not going to work. Sure, you can carry a pad with you, but then again, it's more stuff to carry around, and the main reason people get tablets is not to have to carry around heavy laptops. Also, don't forget that size matters when it comes to screens. Take your average good PC game and put it on a tablet. Now imagine playing it on a small screen, with no keyboard or mouse, with crappy sound and the limitation of a battery. There is a reason people keep going to the cinema even if they have a 40-inch TV at home.


You don't seem to understand the bigger picture. They are coming for your desktops/notebooks/servers. Tablets/phones are just a foot in the door to the gaming/app worlds of desktops etc.


Say what? Servers? Desktops? They are never coming for those, by simple physical constraints... You know, they were able to put a lowly clocked 192-CUDA-core chip in a mobile device. Consider that one desktop GPU has 10 times that count at twice the clock, making it a 20-times more powerful device, and you can easily fit two into a regular desktop.

Here is a "simple chart" ;-) for you to visualize the processing power difference
Nvidia tegra K1 - *
2 GPU Desktop - ****************************************


AMD's first ARM chip is a SERVER chip. NV's Boulder in 2015 (2016 Q1?) is a SERVER chip. I could go on; there are at least 6+ vendors coming directly at SERVERS with ARM64. DENVER is aimed at DESKTOPS. Read any review on it. Not much of a point in wasting time on someone who doesn't even GOOGLE things, apparently. Google "ARM server." But here's a little help for you :) 

http://www.arm.com/products/processors/cortex-a50/corte...
ARM's own page describing the A57's use:
"The Cortex®-A57 processor is ARM’s highest performing processor, designed to further extend the capabilities of future mobile and enterprise computing applications including compute intensive 64-bit applications such as high end computer, tablet and server products. "

http://www.zdnet.com/blog/btl/arm-holdings-2015-plan-gr...
Jeez, dude... READ MORE. I can link dozens of these stories with the EXACT words SERVER and DESKTOP in them. Mobile is already owned by ARM, and x86 has $60 billion in revenue compared to around $30 billion for ARM (most of it from Qualcomm's $24B in revenue). The only way to grow your market much bigger now is to STEAL INTEL'S! Well, duh. That's a 2011 article, just to demonstrate this has been the big picture for YEARS! I believe I could go back another year or two, but this illustrates the point. Chromebooks already take 20% of ALL notebook sales. This war has just begun... LOL. It ends with Intel/MS stock at half or worse. They have nothing to stop 60% of game devs going mobile (and those games will all run on the desktop versions coming with the A57 etc.), and nothing to stop apps being made over there once a 500W-1000W box with an NV/AMD GPU in it gets out (I guess you missed this point).

I don't quite understand why you think they can't make an ARM box with a PC GPU in it, either. You think NV/AMD won't sell to an Android (or whatever the OS, Linux?) box with a 500W PSU?

Everything BUT DirectX is starting to win the war (OpenGL, which all PS3/PS4 games come from, and many PC games, can now be ported to the K1). OpenGL, WebGL, HTML5 (soon 6, probably), Java etc. all run everywhere with minimal dev effort, unlike a DirectX game, which is hell to port anywhere and costly. Unreal Engine 4 is already running on the K1. What do you think is happening here? Read the writing on the wall. The only question is how bad it will get for WINTEL. Even worse, both companies have pissed off everyone over the years, so they are all looking for a way out of the heavy-handed crap dealt by these two for two decades. The competition has a free OS and $25-40 chips. Good luck keeping your OS at $100-300 and chips at $330 (4770K). AMD is no longer Intel's problem. Qualcomm, Nvidia, ARM, Samsung etc. are Intel's problem now, and NONE of them are broke. MS gets hurt worse here in the end, I think, as Intel can already put chips in Android devices. What does MS do?

This happens to all empires eventually (Gateway, Dell, RIM, etc., etc.). They get lazy, milk people to death, and crumble in a REVOLT. Just like the USA, which has Obama and Congress spending like drunken sailors and printing money until we bankrupt the dollar and crumble. Until this stops, we're heading for an empire-style crash.

You, sir, are wasting my time. :(  Not sure if this will look proper, but it's a visual for you ;) 
Score
0
January 15, 2014 10:36:35 AM

For clarification, in case you can't understand the bars: they show ARM has 0% share of desktops and servers and WANTS TO OWN SOME share :)  Everything in that pic that is GREEN, they are coming for. Of course, an updated pic would show 25% or more of mobile computers, as Chromebooks are already 20% of NOTEBOOKS. You don't have to believe me... ARM IS TELLING YOU.
Score
0
January 16, 2014 4:58:39 AM

somebodyspecial said:
For clarification, in case you can't understand the bars: they show ARM has 0% share of desktops and servers and WANTS TO OWN SOME share :)  Everything in that pic that is GREEN, they are coming for. Of course, an updated pic would show 25% or more of mobile computers, as Chromebooks are already 20% of NOTEBOOKS. You don't have to believe me... ARM IS TELLING YOU.

My comment, and the article, are about the K1.
Score
0
January 16, 2014 7:36:19 AM

Quote:
Ah crap, that's what I get for posting before reading everything -_-. Well, in retrospect, I'm very surprised! 365 GFLOPS peak is amazing for what it's running off of! However, it really doesn't come close to competing with any current-gen console or base gaming PC. I'd love to have an ultrabook running around 800 GFLOPS in the near future though :) 
I don't think that would be a problem in the near future. The Denver CPU cores will likely bring a lot more power next gen. Also, by adding an extra SMX to the chip, Nvidia could manage true desktop-level computing in two years' time.
Score
0
January 16, 2014 7:43:11 AM

Quote:
redeemer said:
Nvidia has to stop this non-sense they are all over the place, so I guess Tegra 4 is dead in the water not that it was being adopted well anyways.
Tegra 3 launched several months late, just before everyone else released their next-gen SoCs, so it was too little too late to gain much market share. Tegra 4 was largely in the same boat, and I'm guessing many device manufacturers may have shied away due to the lack of unified shaders and GPGPU support too. If Tegra 5/K1 is delivered on schedule and is priced right, it should have a decent shot at the market. That's a lot of ifs with a sub-par track record, so Chris' implied skepticism (shared by many people in these comments, myself included) is well warranted. Time will tell.
They made the sacrifice of using last year's process to develop this year's chip, which should let them use the same suppliers as last year and get to market faster. Even though they're on 28nm, the chip is still claimed to be at least 2x more efficient than everything else on the ARM market, with an average advantage of 3x.
Score
0
January 17, 2014 9:20:48 AM

cypeq said:
somebodyspecial said:
For clarification in case you can't understand the bars: they're showing they have 0% share of desktops and servers and WANT TO OWN SOME share :)  Everything in that pic that is GREEN they are coming for. Of course an updated pic would show 25% or more of mobile computers as Chromebooks are 20% of NOTEBOOKS already. You don't have to believe me...ARM IS TELLING YOU.

My comment and the article is about K1.


K1 comes with Denver later this year: 64-bit custom cores. Unless they amp up the MHz (or make a special model at, say, 4GHz), it will probably only assault low-end notebooks and bargain-bin desktops. But at 20nm that changes, which I'm guessing will come with at least 1-3 more SMX units and 3GHz+. And there is nothing stopping NV from slapping together a SoC box that uses their GPUs. At worst, these will be pretty much as fast as AMD's APUs, so well suited to desktops at 20nm in Q1 next year, and this is before AMD has an ARM mobile or desktop part. Seattle is for servers, and it's ARMv8 just like Denver (A57/A53...LOL). I guess you should email AMD and tell them to trash Seattle today because ARMv8 isn't for servers ;)  Not all desktops come with 2 GPUs...LOL. Less than 2% do, but they can throw a K1 into a box with 2 GPUs if they'd like to design one. Heck, they may be able to make a small box work with Thunderbolt and use an easily upgradable external GPU for a while...ROFL. Again, Denver is for desktops (no matter how weak you might think they are, or they turn out to be).

How does talking K1 change my story or your comment? It's for desktops/laptops/tablets/phones (at some point if they add a modem at 20nm, but that might be M1 based on maxwell or something).
http://www.theverge.com/2013/3/19/4123636/nvidia
March 2013 - Nvidia wants to be on every machine and OUST INTEL. They're talking CPU & GPU here.
"the message was clear: Nvidia isn't just about entertainment and mobility; it can build serious personal computers if it wants to, and it might be able to do so without help from Intel or AMD."

MESSAGE IS CLEAR. So without them YOU are powering the whole machine, right? They are talking Denver at 20nm FinFET with Maxwell! There is no difference here between a SoC and an APU from AMD. What happens if I amp up the watts on a K1 (any ARM chip) to 95W with an actual heatsink/fan attached? How fast do you think it is then? What makes you think they HAVE to be in a tablet/phone? A Chromebook already is a low-end notebook (and it's at over 20% of that market...ROFL). Now if the GPU isn't enough for a gamer, sell them a PCIe GPU for that box too ;)  I don't understand your point. NV isn't alone in trying to kill x86:
http://www.brightsideofnews.com/news/2012/9/20/nvidia-p...
At the same time, a similar story is taking place at Apple, which wants to replace Intel products in its consumer notebooks and desktops.
What do you think they bought PA Semi for? A CPU team. What do you think a high-watt SoC goes into? Laptops/desktops, as they intend to do eventually at Apple. Just waiting for iOS to replace Mac OS now. Probably 2015/2016 as he suggests (iOS 8/9).

Nvidia's PROJECT DENVER page (K1 is DENVER):
http://blogs.nvidia.com/blog/2011/01/05/project-denver-...
"ARM is already the standard architecture for mobile devices. Project Denver extends the range of ARM systems upward to PCs, data center servers, and supercomputers. ARM’s modern architecture, open business model, and vibrant eco-system have led to its pervasiveness in cell phones, tablets, and other embedded devices. Denver is the catalyst that will enable these same factors to propel ARM to become pervasive in higher-end systems."

Higher end than tablets, UPWARD to PCs, supercomputers, etc. What part of that do you not understand? Read the K1 article; this thing comes with Denver later.
"Denver frees PCs, workstations and servers from the hegemony and inefficiency of the x86 architecture."
Umm, so no Intel? As he said before, they'll either run it or stream to it; that is the goal. Nothing stopping them from slapping more than one SMX onto a larger die either (2 SMX would already be at ~730 GFLOPS), and at 20nm this would easily become 4 SMX at worst, up to 8 probably at best, before blowing past AMD's ~245mm² Kaveri die. T4 was 80mm². If you tripled that size it would eat K1 for lunch. If K1 is the same size, triple that too. See the point? It's a Kaveri competitor at that point, right? Iris too. I don't understand your logic. Nvidia already said they could have built the processor for Xbox One/PS4; it wasn't technical issues that stopped them, it was profits. Both consoles are just low-end PCs.

My comment was about K1 too... It's Denver soon. Did you read the K1 article? It comes in two flavors.
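The die-area argument above reduces to simple arithmetic. A quick sketch using the comment's own approximate figures (Tegra 4 at ~80 mm², Kaveri at ~245 mm²):

```python
# Die-area comparison sketch; both figures are the approximate values
# cited in the comment above, not exact measurements.
TEGRA4_MM2 = 80    # approx. Tegra 4 die area
KAVERI_MM2 = 245   # approx. AMD Kaveri APU die area

tripled = 3 * TEGRA4_MM2
print(f"Tripled Tegra 4-class die: {tripled} mm^2")           # 240 mm^2
print(f"Still under Kaveri's {KAVERI_MM2} mm^2: {tripled < KAVERI_MM2}")
```

In other words, even a die three times Tegra 4's size would still be roughly Kaveri-sized, which is the basis of the "Kaveri competitor" claim.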
Score
0
January 19, 2014 12:17:58 PM

Nvidia is touting the 28nm K1, but competitor solutions are going to be on 20nm soon enough.
Score
0
a b Î Nvidia
January 19, 2014 5:17:45 PM

redeemer said:
Nvidia is touting 28nm K1 but the competitor solutions are going to be 20nm soon enough


Even the Qualcomm Snapdragon 805 is supposed to be built on TSMC's 28nm process.
Score
0
March 12, 2014 5:03:21 AM

I'm really curious about Tegra K1 and its successor... Leave K1 aside for a moment: the GTX 750 has 512 CUDA cores and draws 55W, and the GTX 750 Ti has 640 cores and draws 60W. From that you can estimate roughly 0.1W per Maxwell CUDA core at desktop clocks (1GHz or above), board power included. So if the next Tegra uses 2 SMMs of Maxwell (256 cores), it might use only ~4W for the GPU (assuming 20nm and a ~600MHz clock) and 5W max for the entire SoC.
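A quick sketch of that arithmetic. The card wattages are the commenter's figures and include memory and VRM losses, so the per-core numbers are upper bounds; the clock/voltage and process scaling factors below are illustrative guesses, not measurements:

```python
# Per-core power estimate from first-gen Maxwell cards (figures from the
# comment above). Board TDP includes memory, VRM, etc., so this
# over-counts what the CUDA cores alone draw.
cards = {"GTX 750": (512, 55.0), "GTX 750 Ti": (640, 60.0)}

for name, (cores, watts) in cards.items():
    print(f"{name}: {watts / cores:.3f} W per core at desktop clocks")

# Hypothetical 2-SMM mobile part (256 cores). Dynamic power scales roughly
# with f*V^2, and a lower clock also permits a lower voltage, so dropping
# from ~1 GHz to ~600 MHz might cut power to ~0.36x; the 28nm -> 20nm
# process saving (0.6x here) is an assumption.
per_core_28nm_1ghz = 55.0 / 512   # ~0.107 W per core, board power included
clock_voltage_scale = 0.6 ** 2    # f * V^2 with both scaled by ~0.6
process_scale = 0.6               # assumed 28nm -> 20nm saving
gpu_watts = 256 * per_core_28nm_1ghz * clock_voltage_scale * process_scale
print(f"Estimated 2-SMM mobile GPU: {gpu_watts:.2f} W")  # ~5.9 W
```

Even with generous scaling assumptions, this lands closer to 5-6W than 4W, so the few-watt estimate above leans on further savings (binning, power gating, leaner memory interface) that a board-TDP division can't capture.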
Score
0