
AMD's Deccan, Kerala Slated for Ultrabooks

October 25, 2011 12:08:16 AM

Yeah, this is what makes an ultrabook ULTRA!

No more Intel HD and $1k+ price tags :D  AMD POWER
October 25, 2011 12:17:10 AM

I'm waiting on a 12"-13" Trinity powered laptop. That should be a nice upgrade.
October 25, 2011 12:26:22 AM

Well... if Ivy Bridge's graphics turn out as Intel is boasting, AMD will be in trouble. They need to release it NOW. No more fooling around, just hurry up and release the damn product. No more time wasted on useless 8 GHz benchmarks. 95% of the userbase won't touch overclocking.
October 25, 2011 12:39:22 AM

One area where AMD can actually compete. APUs are so awesome; I love my E-350.
October 25, 2011 12:44:02 AM

sot010174 said:
Well... if Ivy Bridge's graphics turn out as Intel is boasting, AMD will be in trouble. They need to release it NOW. No more fooling around, just hurry up and release the damn product. No more time wasted on useless 8 GHz benchmarks. 95% of the userbase won't touch overclocking.

Ivy Bridge's GPU will receive a 60% boost in performance, which is pretty significant. That puts it around the same level as an AMD Llano A8. However, Trinity, AMD's new line of APUs, is set to release next year. We really don't have an idea of Trinity's performance, though it will be based on Bulldozer's architecture (yikes!). I'm sure AMD got a wake-up call that BD is not a good CPU, decent at best, and that its architecture needs work. I'm hoping Trinity isn't a flop; I want a decent gaming laptop for high school and so forth.
October 25, 2011 12:47:10 AM

Oops, late again? Or another big flop?
October 25, 2011 12:47:52 AM

jdwii said:
One area where AMD can actually compete. APUs are so awesome; I love my E-350.

That, and GPUs (though their drivers need some work). I think AMD should take time off from the enthusiast market and stay in the low- to mid-end market (where the Phenom IIs are) for now, until they can correct whatever mistakes were present in BD (maybe 4 REAL cores with great performance rather than the 4-module/8-thread design, which is crap compared to their previous CPUs and the competition's).
October 25, 2011 12:51:57 AM

AbdullahG said:
Ivy Bridge's GPU will receive a 60% boost in performance, which is pretty significant. That puts it around the same level as an AMD Llano A8. However, Trinity, AMD's new line of APUs, is set to release next year. We really don't have an idea of Trinity's performance, though it will be based on Bulldozer's architecture (yikes!). I'm sure AMD got a wake-up call that BD is not a good CPU, decent at best, and that its architecture needs work. I'm hoping Trinity isn't a flop; I want a decent gaming laptop for high school and so forth.


It's a little more than 60%. IB's GPU will get 2x the EUs (24 vs. 12), plus major enhancements to the GPU itself, including a tessellation engine, DX11 support, etc. I think it will be more than 60%, since the current HD 3000 is about 2x faster than the HD 2000 (12 EUs vs. 6 EUs).

And with Trinity not out until 2013, it would really be competing with Haswell. I don't think GF's 28nm will be nearly as efficient as Intel's 22nm is going to be.
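To show what I mean by "2x the EUs", here's a purely illustrative back-of-envelope sketch. It assumes throughput scales linearly with EU count and clock (real games won't scale like that, and the 24-EU figure and 1350 MHz clock are just the rumored/typical numbers, not confirmed specs):

```python
# Naive GPU throughput estimate: EU count * clock (MHz).
# Purely illustrative -- real performance also depends on memory
# bandwidth, drivers, and architectural changes, not raw unit counts.
def relative_throughput(eus, clock_mhz):
    return eus * clock_mhz

hd3000 = relative_throughput(12, 1350)  # Sandy Bridge HD 3000 (assumed max turbo)
ivy    = relative_throughput(24, 1350)  # rumored IB GPU with doubled EUs

print(f"naive scaling: {ivy / hd3000:.1f}x")  # prints "naive scaling: 2.0x"
```

So even before counting the architectural improvements (tessellation, DX11), the raw unit count alone suggests more than a 60% uplift under this naive model.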
October 25, 2011 12:53:20 AM

If they got the call, it's already too late. They should've answered back in 2006 when Intel intro'ed the Core 2 Duo. Phenom was a flop, Phenom II was not good enough, and BD is (to my eyes) a disaster. Its 8-core design gets its backside kicked by a year-and-a-half-old quad core. And to make matters worse, it's not even cheap. If Trinity turns out to be a killer CPU (which it won't, seeing how things went from 2006 till now), Intel will have Ivy Bridge to counter any threat AMD poses. And the final nail in the coffin is the enormous amount of money Intel is making nowadays. They could just invest tons of cash and come up with something spectacular. And on the graphics front, we must not forget the billion-dollar deal Intel made with NVIDIA back in 2010(?) to improve its graphics.

And yes, I was an AMD fanboy till the Core 2 Duo. Now, I'm not saying that I switched teams (I had an E-350 laptop, but it got stolen), but Intel today is just better.
October 25, 2011 12:55:48 AM

sot010174 said:
(I had an E-350 laptop, but it got stolen)

Point proven. I bet you couldn't give away Intel graphics...
October 25, 2011 12:56:44 AM

AbdullahG said:
I want a decent gaming laptop for high school and so forth.


Bahahahaha, 'cause school's all about bludging (Y) +1

But seriously, it really needs to release in the next month or so. Respect for AMD has decreased considerably (with respect to its processor line) since the BD debacle of 2011. If it doesn't get its act together now, it'll never be able to get even and competitive with Intel. Then we, as consumers, will be forced to pay higher prices because there's no market competition and Intel can do whatever the hell they want.

With that said, I never trust a company's promotion of its products. Intel saying IB will receive a 60% boost in graphics performance might as well be a lie. For me, marketing and promotion are false unless proven otherwise. But if it really is true, then AMD is in for some serious trouble.
October 25, 2011 12:57:24 AM

...and AMD sells every CPU and APU they can produce, so I guess not everyone believes their products are crap.
October 25, 2011 1:20:12 AM

A 60% boost over the HD 3000's performance will only just allow Ivy Bridge to match the performance of the AMD Llano in DX10. Don't forget, too, that DX10 performance and DX11 performance cannot be compared apples to apples. There are a number of extra effects in DX11 which DX10 does not have, and thus the Intel GPU cannot display. So in my opinion the Ivy Bridge GPU will still fall behind the Llano's.

As for Trinity, AMD has stated that it will be 50% faster than Llano. This will put it in competition with the quad-core i5 Sandy Bridge and possibly even the low-end quad-core Ivy Bridge. If this is the case, under the same thermal envelope Trinity will be a winner for HTPCs and laptops, possibly even more so than Llano. If they can produce enough, too, AMD will make a lot of money.

Now... if they could just fix Bulldozer.
October 25, 2011 1:20:22 AM

Intel and GPU don't belong in the same sentence. The last time they tried, the results were disastrous.

The GPUs inside the APUs are just low-powered Radeon cores; ATI has just a little bit more experience than Intel in designing graphics units.
October 25, 2011 1:33:38 AM

ukee1593 said:
There are a number of extra effects in DX11 which DX10 does not have, and thus the Intel GPU cannot display.

The GPU in Ivy Bridge supports DX11.
Anonymous
October 25, 2011 1:33:59 AM

So let me get this straight.

So Sandy Bridge was maybe 20-30% faster than Phenom II; Bulldozer comes out and beats Intel in a few areas, closes the gap in others, and slips in a few. That's an epic fail.

But, Llano graphics are 200-300% faster than Sandy Bridge, and that's not a big deal, it's because AMD cheated and used more die space. Furthermore, Ivy Bridge is, according to Intel, going to be 60% faster, which will "cause trouble for AMD", while I'm sure AMD's graphics will just stand still for Trinity, despite AMD's claim that they're improving GFLOPs by more than 60%.

Do I have that right?
October 25, 2011 1:46:33 AM

Quote:
So let me get this straight.

So Sandy Bridge was maybe 20-30% faster than Phenom II; Bulldozer comes out and beats Intel in a few areas, closes the gap in others, and slips in a few. That's an epic fail.

But, Llano graphics are 200-300% faster than Sandy Bridge, and that's not a big deal, it's because AMD cheated and used more die space. Furthermore, Ivy Bridge is, according to Intel, going to be 60% faster, which will "cause trouble for AMD", while I'm sure AMD's graphics will just stand still for Trinity, despite AMD's claim that they're improving GFLOPs by more than 60%.

Do I have that right?



Intel fanbois will be fanbois, what else is there to explain.

And yeah, the graphics performance on APUs is beyond anything Intel could build for at least another four or five years; they just don't have the experience or the platform to develop from. What most people don't seem to get is that you can't just "create" a new platform and have it magically work, any more than you can "create" a new CPU arch and have it work. It takes many revisions, modifications, and redesigns to iron everything out. Both ATI and NVIDIA have had years upon years to develop and refine their archs and platforms. Everything from the processing cores to the memory access bus to the software drivers has been refined and built upon. There is a reason S3's 3D accelerators failed, along with Intel's and Matrox's. Intel could no more design a successful GPU than NVIDIA/ATI could design a successful x86 CPU (licensing restrictions aside).

This is one of those areas where AMD's strategic purchase of ATI is showing its value. They have the capability to create a union of x86 and a developed GPU on the same die without any extra licensing/development costs.
October 25, 2011 2:12:07 AM

beenthere said:
...and AMD sells every CPU and APU they can produce, so I guess not everyone believes their products are crap.

They sell them all because they don't make many.
October 25, 2011 2:17:07 AM

str8guy said:
So let me get this straight.

So Sandy Bridge was maybe 20-30% faster than Phenom II; Bulldozer comes out and beats Intel in a few areas, closes the gap in others, and slips in a few. That's an epic fail.

But, Llano graphics are 200-300% faster than Sandy Bridge, and that's not a big deal, it's because AMD cheated and used more die space. Furthermore, Ivy Bridge is, according to Intel, going to be 60% faster, which will "cause trouble for AMD", while I'm sure AMD's graphics will just stand still for Trinity, despite AMD's claim that they're improving GFLOPs by more than 60%.

Do I have that right?


I think the reason BD is being labeled an "epic fail" is because it was hyped as an awesome CPU that would pull ahead of Intel while finally responding to Intel's Hyperthreading tech. The issue is that it's great in multi-threaded apps, but in single-threaded apps, and more specifically single-threaded apps with lots of floating point calculations, it falls behind the high-end Phenom II line, which is bad. One huge marketing fail was that it was advertised as an "octo-core" processor; in reality, it's much more like a 4C/8T processor. If AMD makes some design changes (more hand-crafting of critical parts) and improves IPC and single-threaded performance, it will be a very good chip (Piledriver, anyone?).

On graphics, palladin is right. AMD is way ahead of Intel in IGP, and Trinity is going to be another nice improvement. I doubt Ivy Bridge will pose a serious threat to Trinity, although it might close the gap a bit.
October 25, 2011 2:25:58 AM

PurpleHayes said:
I think the reason BD is being labeled an "epic fail" is because it was hyped as an awesome CPU that would pull ahead of Intel while finally responding to Intel's Hyperthreading tech. The issue is that it's great in multi-threaded apps, but in single-threaded apps, and more specifically single-threaded apps with lots of floating point calculations, it falls behind the high-end Phenom II line, which is bad. One huge marketing fail was that it was advertised as an "octo-core" processor; in reality, it's much more like a 4C/8T processor. If AMD makes some design changes (more hand-crafting of critical parts) and improves IPC and single-threaded performance, it will be a very good chip (Piledriver, anyone?).

On graphics, palladin is right. AMD is way ahead of Intel in IGP, and Trinity is going to be another nice improvement. I doubt Ivy Bridge will pose a serious threat to Trinity, although it might close the gap a bit.



AMD's mistake was they tried to design a desktop CPU using concepts developed for the server world. The server version of BD is actually quite good, ridiculously good at virtualization and HPC workloads. Lots of simultaneous number crunching just doesn't come up much in the desktop world. The desktop benchmarks used to measure math performance just use loops of complex math that Intel's advanced branch predictor can easily detect and cache; thus it gives Intel a lead in "math" apps that doesn't translate into real performance. No one is going to be doing those kinds of things on a home PC anyway; we're not calculating the molecular density of a particular black hole based on the mass and gravitational movements of nearby stars as measured with redshift and gravitational lensing.

I think their lesson learned is that server parts do not necessarily work well in the desktop world.
October 25, 2011 2:47:18 AM

Please do not bulldoze our hopes this time :p
October 25, 2011 2:59:33 AM

Going to wait out on Trinity after what I've read about the Bulldozer architecture. It's just too new for us; probably by the time Windows 8 comes out there'll be better drivers and support for Bulldozer. I also sprang for a Llano laptop for school. So for now I'll be sticking with the still-venerable K10/10.5 in my desktop / laptop.
October 25, 2011 3:05:27 AM

bennaye said:
Bahahahaha, 'cause school's all about bludging (Y) +1

Or I just work my butt off doing several tests a day and several projects a week, with teachers too busy to help you and a huge workload to deal with, and deserve some time off? I don't see what's so funny about that, bro, just sayin'...
October 25, 2011 3:07:35 AM

Indeed, server and laptop/notebook/ultra-thin are the targets. For now and looking forward, who cares about desktop? My family has 5 PCs and all are laptops. My company has lots of computing systems and they are all servers. Get the picture?
October 25, 2011 3:12:19 AM

Quote:
The GPU in Ivy Bridge supports DX11.


Yes, but Sandy Bridge's GPU ISN'T. So what might be a 60% performance increase from Sandy to Ivy in DX10 might only be a 40% increase in frame rates when comparing DX10 Sandy Bridge against DX11 Ivy Bridge. Therefore the DX11 Llano and the DX11 Trinity will both outperform Ivy Bridge.
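To put toy numbers on that: say Sandy gets 40 fps in DX10 and Ivy is 60% faster in DX10, but turning on DX11 effects costs Ivy some frames. All figures here are made up for illustration (the 12.5% DX11 cost is a hypothetical, not a measured number):

```python
# Toy example of why a percent gain measured in DX10 doesn't carry
# over to a DX10-vs-DX11 comparison: DX11 effects cost extra frames.
sandy_dx10 = 40.0                  # fps, hypothetical baseline
ivy_dx10   = sandy_dx10 * 1.60     # the claimed 60% DX10 uplift
dx11_cost  = 0.875                 # hypothetical: DX11 effects cost ~12.5%
ivy_dx11   = ivy_dx10 * dx11_cost  # what you'd actually see with DX11 on

gain_vs_sandy_dx10 = ivy_dx11 / sandy_dx10 - 1
print(f"{gain_vs_sandy_dx10:.0%}")  # prints "40%" -- well short of the headline 60%
```

The headline percentage only holds if both chips render the exact same workload, which a DX10-only part and a DX11 part never do.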
October 25, 2011 3:14:21 AM

The APUs are very attractive on the low end, no doubt.
October 25, 2011 3:18:08 AM

I like those names. :) The product must really be better...
October 25, 2011 3:53:39 AM

AbdullahG said:
That, and GPUs (though their drivers need some work). I think AMD should take time off from the enthusiast market and stay in the low- to mid-end market (where the Phenom IIs are) for now, until they can correct whatever mistakes were present in BD (maybe 4 REAL cores with great performance rather than the 4-module/8-thread design, which is crap compared to their previous CPUs and the competition's).


Not really; look at some of the benchmarks where Bulldozer beat more expensive Sandy Bridges. I really want to know how much of a difference Windows being written for Bulldozer would make.

I won't call it a fail until I see Win 8 built for it and revision 1; the chip could get a very large performance increase, which would put it on par with Phenom II and kick Intel's butt in other areas.
October 25, 2011 4:08:37 AM

str8guy said:
So let me get this straight.

So Sandy Bridge was maybe 20-30% faster than Phenom II; Bulldozer comes out and beats Intel in a few areas, closes the gap in others, and slips in a few. That's an epic fail.

But, Llano graphics are 200-300% faster than Sandy Bridge, and that's not a big deal, it's because AMD cheated and used more die space. Furthermore, Ivy Bridge is, according to Intel, going to be 60% faster, which will "cause trouble for AMD", while I'm sure AMD's graphics will just stand still for Trinity, despite AMD's claim that they're improving GFLOPs by more than 60%.

Do I have that right?


An increase in GFLOPs is not everything. It's great if you use your GPU for F@H, but in games it does not mean it will bump performance 60%. As well, only the HD 79XX series will have improvements other than the lower power from the 40nm-to-28nm die shrink. The HD 78XX looks to have the exact same specs as the HD 69XX, while the HD 79XX will have more SPUs, a faster stock clock, and faster XDR2 RAM.

And I am not an Intel fanboi; I actually own mainly ATI GPUs: an HD 5870 in mine, an HD 4870 in my wife's, an HD 5450 in my HTPC. I just look at the facts and post the truth that I can find; no reason to believe in any hype.

palladin9479 said:
Intel fanbois will be fanbois, what else is there to explain.

And yeah, the graphics performance on APUs is beyond anything Intel could build for at least another four or five years; they just don't have the experience or the platform to develop from. What most people don't seem to get is that you can't just "create" a new platform and have it magically work, any more than you can "create" a new CPU arch and have it work. It takes many revisions, modifications, and redesigns to iron everything out. Both ATI and NVIDIA have had years upon years to develop and refine their archs and platforms. Everything from the processing cores to the memory access bus to the software drivers has been refined and built upon. There is a reason S3's 3D accelerators failed, along with Intel's and Matrox's. Intel could no more design a successful GPU than NVIDIA/ATI could design a successful x86 CPU (licensing restrictions aside).

This is one of those areas where AMD's strategic purchase of ATI is showing its value. They have the capability to create a union of x86 and a developed GPU on the same die without any extra licensing/development costs.


I find it interesting how much people seem to underestimate Intel. When we first found out that Intel was moving the GPU onto the die on SB, everyone thought it was still going to suck. Yet it performed as well as or better than entry-level discrete GPUs of the time.

If Intel wants to push into the APU business, they will. That's one thing you have to remember about Intel: they are huge and invest more money into R&D a year than most companies combined.
October 25, 2011 4:23:33 AM

jimmysmitty said:
An increase in GFLOPs is not everything. It's great if you use your GPU for F@H, but in games it does not mean it will bump performance 60%. As well, only the HD 79XX series will have improvements other than the lower power from the 40nm-to-28nm die shrink. The HD 78XX looks to have the exact same specs as the HD 69XX, while the HD 79XX will have more SPUs, a faster stock clock, and faster XDR2 RAM.

And I am not an Intel fanboi; I actually own mainly ATI GPUs: an HD 5870 in mine, an HD 4870 in my wife's, an HD 5450 in my HTPC. I just look at the facts and post the truth that I can find; no reason to believe in any hype.

I find it interesting how much people seem to underestimate Intel. When we first found out that Intel was moving the GPU onto the die on SB, everyone thought it was still going to suck. Yet it performed as well as or better than entry-level discrete GPUs of the time.

If Intel wants to push into the APU business, they will. That's one thing you have to remember about Intel: they are huge and invest more money into R&D a year than most companies combined.



Repeatedly you have demonstrated you're an Intel fanboi, just as Baron is an AMD fanboi. It's obvious that you always take the Intel-favoring side of all arguments / debates, even if it contradicts a previous position you used on a different topic.

Intel has near-zero experience building 3D acceleration engines, not only in the silicon but in the firmware and driver departments. They have no core architecture to build off of; they would have to start from scratch. Even with Intel's resources, that doesn't happen fast, nothing short of half a decade at absolute best. Whereas NVIDIA has experience dating back to the Riva and ATI back to the Rage, Intel has... Larrabee... something so bad that it was canceled.

http://en.wikipedia.org/wiki/Larrabee_%28microarchitect...

Intel tried once, a decade ago, to release a graphics card; it was so bad that they stopped making them and no one ever heard about it again. It became what we call the IGA/IMA today.

It would be like NVIDIA suddenly announcing that they're going to make an x86 CPU to compete with Intel. They would fail completely trying to do that on their own, and they know it.

The best thing would be for Intel to purchase 3D technology from a non-profitable company and use that as a baseline to start with. Not too many of those left around; they could possibly get S3 at a sweet deal. They actually had a decent low-power 3D GPU, nowhere near as good as NVIDIA / ATI, but it used less power and functioned. S3's drivers were too buggy, and by the time they produced a stable driver NVIDIA / ATI had moved on and MeTAL was deemed unnecessary.
October 25, 2011 5:33:07 AM

iirc, didn't amd say they were not interested in the ultrabook form factor? way to change sides (and make the right decision).
amd, with their strong igp, can give intel some serious competition if they successfully release their platforms in time. too bad llano suffers from production problems - blame global foundries.
if amd prices their stuff aggressively, intel will soon regret their decision not to lower cpu prices for ultrabook oems.
otoh, if intel is pushed hard enough, they'll get off their lazy asses and might bring out a new conroe of an apu and.. you know the rest.
strong amd vs intel competition in ultrabooks will lower prices; consumers will win and will not have to buy $1000 "thin netbook" ultrabooks.
October 25, 2011 5:39:39 AM

Intel is inventive and AMD just likes to copy everyone else. That's why their market share is 18% and they are always playing Ketchup (catch-up). Ivy Bridge will destroy AMD and totally embarrass anything AMD tries to put out.
October 25, 2011 5:43:53 AM

You guys sound like a bunch of pissed-off AMD fanboys because you can't get it right.
October 25, 2011 5:47:50 AM

Quote:
I've seen these Brazos tablets. Bulky as elephant ***. And Atom-based tablets - what a joke.
Now that's why the iPad is so successful - fine slim design due to using a proper combination of ARM-based CPU and software. The same is valid for newer Honeycomb devices.


Hmm, they're not really in the same league, though. ARM CPUs have really low processing power, but they require less cooling and power. x86 CPUs (Intel / AMD) have much greater processing power but require more cooling / electrical power. It's a trade-off: you either have high computing power or you have a small size / low electrical usage.

As a consumer, you choose what works for you.
October 25, 2011 6:51:04 AM

http://www.tomshardware.com/news/drivers-rage-battlefield-3-bf3-detonators,13806.html

That is why Intel is at best 4-5 years behind NVIDIA / ATI in GPUs.

NVIDIA, an extremely experienced designer and producer of GPUs, a company that works directly with game developers, still finds things to fix and optimize in their drivers. And somehow jimmy expects Intel, a company that hasn't built a successful GPU ~ever~, to get it right anytime in the near future.

That is why I consider you an Intel fanboi.
October 25, 2011 7:53:50 AM

Zingam said:
I've seen these Brazos tablets. Bulky as elephant shit. And Atom-based tablets - what a joke. Now that's why the iPad is so successful - fine slim design due to using a proper combination of ARM-based CPU and software. The same is valid for newer Honeycomb devices.

http://www.msi.com/product/nb/WindPad-110W.html#/?div=S...
It's a fair bit thicker, but consider the hardware you get - 4GB DDR3, a 64GB SSD, an IPS panel, an SD card slot, and a number of ports. I doubt there are many tablets with that sort of hardware that are less bulky at the same time.

49ers540 said:
Intel is inventive and AMD just likes to copy everyone else. That's why their market share is 18% and they are always playing Ketchup (catch-up). Ivy Bridge will destroy AMD and totally embarrass anything AMD tries to put out.

Oh please.

jimmysmitty said:
It's a little more than 60%. IB's GPU will get 2x the EUs (24 vs. 12)

16, not 24.

What would be a big help for IGPs is for somebody to bring out a solution with stacked memory. RAM is still too slow (Ivy Bridge's much faster RAM interface will make a big difference), and though we're not exactly talking high-performance parts, the CPU still needs that bandwidth. Regardless, even with stacked memory it won't completely solve the bandwidth issue (and I don't expect a quad-channel solution anytime soon, either).

palladin9479 said:
AMD's mistake was they tried to design a desktop CPU using concepts developed for the server world.

Spot on.

palladin9479 said:
The server version of BD is actually quite good, ridiculously good at virtualization and HPC workloads.

I've not seen any reviews or benchmarks yet... any links?

palladin9479 said:
The best thing would be for Intel to purchase 3D technology from a non-profitable company and use that as a baseline to start with. Not too many of those left around; they could possibly get S3 at a sweet deal.

S3 is owned by HTC, so that's unlikely. However, Intel could do themselves a huge favour by purchasing Imagination Technologies; I'd LOVE to see a performance part using TBDR, much like the Kyro III that we never saw. I'm not sure how feasible this is; however, they could manufacture the GPU in-house without having to use TSMC.

jimmysmitty said:
When we first found out that Intel was moving the GPU onto the die on SB, everyone thought it was still going to suck. Yet it performed as well as or better than entry-level discrete GPUs of the time.

A consequence of AMD and NVIDIA not bothering to update their IGP product portfolios for a good 18 months. Intel's HD Graphics required much higher clock speeds to come close to the HD 3300 IGP (900MHz for the i5-661 vs. 700MHz) in most titles, and despite the addition of DX11 support to the HD 4250 on the 890G boards, it was the same GPU that appeared with the HD 3200 back at the start of 2008.
October 25, 2011 11:36:19 AM

Intel's "GPUs" can barely even be called such. Their last foray was a flop, and the ten before that as well. It won't matter how much "GPU" power Sandy Vadge has when it's released - nobody will care, because they'll all be using Big Kid computers.

Sandy Vadge will likely sell very well. This has nothing to do with it being a decent or efficient processor, or a good value. This has everything to do with branding, marketing, and lowest common denominator advertising. Go, mediocrity!
October 25, 2011 11:44:31 AM

sot010174 said:
Well... if Ivy Bridge's graphics turn out as Intel is boasting, AMD will be in trouble. They need to release it NOW. No more fooling around, just hurry up and release the damn product. No more time wasted on useless 8 GHz benchmarks. 95% of the userbase won't touch overclocking.


If it's for notebooks, netbooks, and ultrabooks only, then 100% of the people won't be able to OC. Also, Intel has great processors that have been eliminating AMD's solutions since 2006, from a processing standpoint only. But when we talk graphics, AMD is light-years ahead. So it would harm Intel's credibility to brag about something they can't do. Also, let's not forget that it's not AMD that Intel's fighting against this time; it's the renamed ATI, whose engineers literally saved AMD in many situations. This was one of the best mergers I've seen during my short 22-year stay in this world.
October 25, 2011 6:09:59 PM

49ers540 said:
Intel (Apple) is inventive and AMD (Google) just likes to copy everyone else. That's why their market share is 18% and they are always playing Ketchup (catch-up). Ivy Bridge (iPhone 5) will destroy AMD (Google) and totally embarrass anything AMD (Google) tries to put out.


Since you like to take such extremes, I figured you were also an Apple fan, so I made you a comment for the next iPhone article.

I have both AMD and Intel systems, and each has its use. I love my AMD tablet; the APU is remarkable and allows me to play back 1080p video or do light gaming on the go with good enough battery life, and it kicks the crap out of anything Intel has in that segment. I also have a high-end Intel workstation for high-compute operations and high-end gaming.

If AMD weren't around, I wouldn't be enjoying my 2600K, because Intel would have much rather released $1000 X-series CPUs, but pressure from AMD's BE line forced Intel's hand. Remember, competition is good for the consumer.
October 25, 2011 7:13:15 PM

Do you know what the problem with Bulldozer is? The problem with Bulldozer is that if they took Thuban, shrank it down to the 32nm process, and added 2 more cores, it would probably have been faster than Bulldozer. And that is the problem with Bulldozer.
March 24, 2012 12:57:55 PM

dragonsqrrl said:
The GPU in Ivy Bridge supports DX11.


The point I'm making is that a "50% increase" from Sandy Bridge graphics to Ivy Bridge cannot be measured in DX11 if Sandy Bridge does not support DX11. So the DX10 performance Intel is boasting for Ivy Bridge - let's say "40 fps" - will not necessarily be the same frame rate as its DX11 performance.

However, Llano vs. Trinity will offer a direct comparison, DX11 vs. DX11!