Nvidia Demonstrates Its Kepler Mobile Chip


weierstrass

Honorable
Aug 2, 2012
For a fair comparison they should have shown the same game on both devices and compared rendering speed and quality. Otherwise they could just as well have shown Tetris on Kepler and concluded that the A6X is better because it can do 3D!
 

dragonsqrrl

Distinguished
Nov 19, 2009


I don't think that's the case, unless you have an extremely selective memory.
 

dragonsqrrl

Distinguished
Nov 19, 2009


This isn't intended for laptops. This version of Kepler will be integrated into Nvidia's next-gen mobile SoC (Tegra), targeted at smartphones and tablets.
 

NuclearShadow

Distinguished
Sep 20, 2007
I have to admit that is quite amazing. The BF3 footage looks great. The competitive nature of the mobile market is driving this kind of rapid advancement. I would love to see the game compared side by side with the console and PC versions next time.
 

CrArC

Distinguished
Jun 5, 2006
Anyone else feel like the mobile market has been developing at a rate vastly higher than the more traditional portable (laptops/netbooks) and desktop markets?

Seriously - the kind of advancement we've seen in 6-7 years just feels like it eclipses the performance/efficiency increases seen in these other sectors. Look at the boring changes in Ivy Bridge and now Haswell, let alone what AMD are doing (or rather, aren't). Snore.

This is what real competition gets you, compared to the AMD-Intel duopoly, which has itself nearly petered out, with Intel in the lead. I hope the mobile market continues advancing this aggressively for a long time to come.
 
^ We can only hope so, but in the end it might end up like Intel-AMD or Nvidia-AMD in the desktop space. Texas Instruments has already left the SoC race and ceased development of its OMAP platform (did they already sell the division?). What companies are still competing in high-end SoCs other than Qualcomm, Samsung and Nvidia?
 

Maxx_Power

Distinguished


Okay, I'll be more specific. Except for the recent Kepler generation, and then ONLY for the top models, Nvidia GPUs have been more power hungry per whatever metric (usually performance) you measure them against, say, AMD. In the lower-power, lower-budget arena this was true for the GTX 200 series vs. the HD 4000 series, the GTX 400 series vs. the HD 5000/6000 series, the GTX 500 series likewise, and finally the GTX 600 series from the 650 Ti Boost down seems that way too. That's when they are directly competing with AMD. Compared with PowerVR's awesome deferred rendering architecture, which is built from the ground up to be power efficient, I don't know how much of a chance NV stands, all else being equal.
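For anyone wondering why tile-based deferred rendering is so power efficient, here's a toy sketch (all numbers are invented for illustration, nothing is measured): an immediate-mode GPU shades every fragment it rasterizes, so its work scales with overdraw, while a PowerVR-style TBDR GPU resolves visibility per tile on-chip first and shades only what ends up visible (for opaque geometry, at least).

```python
# Toy model of immediate-mode rendering vs. tile-based deferred rendering
# (TBDR). All figures are hypothetical, purely to illustrate the scaling.

def immediate_mode_fragments(width, height, overdraw):
    # Immediate-mode GPUs shade every rasterized fragment, so shading
    # work (and the power that goes with it) scales with overdraw.
    return width * height * overdraw

def tbdr_fragments(width, height, overdraw):
    # A TBDR GPU bins geometry into small on-chip tiles, performs
    # hidden-surface removal per tile, then shades roughly one visible
    # fragment per pixel (assuming opaque geometry, no blending).
    return width * height

w, h, overdraw = 1280, 800, 3  # hypothetical tablet scene with 3x overdraw
imr = immediate_mode_fragments(w, h, overdraw)
tbdr = tbdr_fragments(w, h, overdraw)
print(f"immediate mode: {imr / 1e6:.1f}M shaded fragments")
print(f"TBDR:           {tbdr / 1e6:.1f}M shaded fragments")
print(f"shading work avoided: {1 - tbdr / imr:.0%}")
```

On top of that, the tile is kept in on-chip memory and written out to DRAM once per tile, and cutting framebuffer traffic matters a lot when the power budget is a phone or tablet battery.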

On the Tegra-only side of things, Tegra 4 didn't win as many design wins primarily because of power consumption. And on the topic of Tegra 3, to quote from Anandtech and its forum:

From Anandtech:

"NVIDIA's GPU power consumption is more than double the PowerVR SGX 545's here, while its performance advantage isn't anywhere near double. I have heard that Imagination has been building the most power efficient GPUs on the market for quite a while now, this might be the first argument in favor of that heresay."

[Chart: GPU power consumption comparison (3d1-gpu.png)]


From the forums:

"It seems like NV might be struggling with power consumption on their GPUs in comparison to Imagination Tech (PowerVR).
Certainly from a platform power standpoint, it does indicate some difficulties in competing with ARM SoCs if it's true, although it doesn't seem to result in significant penalties in Android ARM tablets. "

http://forums.anandtech.com/showthread.php?t=2291670
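To put rough numbers on what that quote implies (the figures below are hypothetical, not the article's measurements): if a GPU draws 2x the power for only 1.5x the performance, it delivers 25% less performance per watt.

```python
# Hypothetical perf-per-watt comparison illustrating the quote above:
# power "more than double", performance advantage "nowhere near double".
# The 1.5x performance figure is invented for illustration.

def perf_per_watt(performance, power_watts):
    # Efficiency metric: relative performance delivered per watt drawn.
    return performance / power_watts

baseline = perf_per_watt(performance=1.0, power_watts=1.0)  # e.g. SGX 545
rival = perf_per_watt(performance=1.5, power_watts=2.0)     # 1.5x perf, 2x power

print(f"baseline perf/W: {baseline:.2f}")
print(f"rival perf/W:    {rival:.2f}")
print(f"efficiency deficit: {1 - rival / baseline:.0%}")  # 25% worse
```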

That's exactly what I was getting at: NV's philosophy is to be faster at nearly all costs, to the point of obvious diminishing returns. Given NV's history and their tendency to trade extra power for the last bit of performance (whether to beat AMD for bragging rights or whatever), I would venture that a lot of people are concerned with how their future products will fare under a power constraint.
 

dragonsqrrl

Distinguished
Nov 19, 2009

I really don't think it's nearly as one-sided as you're making it out to be. With the exception of the GeForce FX and GTX 400/500 series, Nvidia hasn't had markedly higher power consumption than the competition. The X800 series topped out in the same area as the GeForce 6. The X1900s consumed more than their competition. The HD 2900 XT consumed substantially more than its competition. And while the HD 4800s did consume less at load, they also consumed more at idle than the GTX 200s. They also ran notoriously hot. And then there are the HD 7900s, like you mentioned, which consume more than their competition.

I would probably give the slight advantage to AMD/ATI overall, at least as far back as power consumption has been a concern in the consumer graphics card market. But I definitely wouldn't say Nvidia has had a big issue with power consumption "since nearly forever" without also saying the same of AMD.
 

Maxx_Power

Distinguished


In all fairness, I'm not siding with either AMD or NV. The original response with "since nearly forever" was a bit of hyperbole, given how rapid the pace of the electronics industry, particularly computers, has been. I would give a slight power advantage to AMD as well.

As for the HD 7900 series, the philosophy for that particular silicon is flip-flopped relative to the Fermi series and Nvidia's general GPU philosophy of the last three generations. The HD 7900 series is more complicated and built for both graphics and massive compute capability (benchmarks show this as well; current-gen AMD is rather far ahead of Kepler in compute, save for perhaps Titan, for which there is no AMD competitor), so there is a power price to be paid for that extra complexity.

I am concerned mainly with NV's competition in the low-power arena: PowerVR. It kind of brings back memories of when 3dfx was around and competed directly with PowerVR on the desktop, back in the days of Quake 1/2 and Unreal 1 through UT99. Against that architecture, and with the "better performance at all costs" mentality NV seemingly adopts for every segment (to me, and I guess to others as well; it depends on who you ask), I am just a bit concerned about the power consumption.

BUT, who knows; I do give NV the benefit of the doubt until I see actual numbers. It's just a concern...
 