Larrabee. Thoughts?

July 8, 2008 7:39:59 AM

Well, from what I've read, it has 32 shrunk-down Pentium III cores, I believe, uses x86, and can do 2 TFLOPS at DP. That being said, it looks to be an excellent graphics encoder, audio encoder, etc., or GPGPU. How it will do in a rasterisation environment, who knows? I'm thinking it'll do OK, but it will need a lot of help from game devs to make it shine.
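For a rough sense of where a 2 TFLOPS figure could come from, here's a back-of-the-envelope sketch; the 16-wide vector unit and a multiply-add per lane per clock are assumptions for illustration, not confirmed specs:

cores = 32
simd_lanes = 16        # assumed width of the per-core vector unit
ops_per_lane = 2       # assumed multiply-add counted as two FLOPs per cycle
clock_ghz = 2.0

gflops = cores * simd_lanes * ops_per_lane * clock_ghz
print(f"{gflops / 1000:.1f} TFLOPS peak")   # -> 2.0 TFLOPS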
July 8, 2008 7:49:57 AM

More important are the financial implications. The price war is hurting the big players; some of them are basically selling cards at a loss, so if Larrabee is even just on a par but with better margins, the manufacturers will be all over it.
July 8, 2008 8:02:04 AM

Some serious power draw from that beast - not surprising given the raw power it potentially can unleash.
July 8, 2008 8:22:28 AM

I doubt it. Pentium 3s were manufactured at over 200nm at one stage and only required really low-level cooling, which indicates a very low power draw. Now factor in Intel's multi-core technology, where a quad core only draws 50% more power than a dual at the same clock speed, plus the massive die shrink. Now if it were Pentium 4 cores...
July 8, 2008 8:41:46 AM

If it were P4, the pipes would be so long, an entire game would be trapped in Larrabee before we saw anything heheh
July 8, 2008 8:43:12 AM

Personally, I think it's going to struggle against existing video cards, as its architecture is completely different from that of the existing ATI and Nvidia cards. Games will need to be seriously re-coded in order to work on a video card that uses x86 chips rather than more traditional shader- and pixel-pipeline-based GPUs.
July 8, 2008 8:57:48 AM

Romulus, I'm thinking more like a 125-150W TDP.

As we already know, thanks to the E6600/Q6600, dual-die chips only take 50% more power, and let's assume that each die takes 8W.

Eight dual dies at 12W each (a dual die takes 50% more power than a single die) hits 96W, and that leaves power for the rest of the card, including all the memory this beast will need. Plus, by then Intel should have its monolithic quad cores sorted out and have done another die shrink, offering even more power savings.

Don't forget Intel said that, using a mid-level Larrabee, an entire decent-performance computer could be built using less than 140W of power.
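The arithmetic in that estimate, written out (the 8W-per-die figure and the 50% dual-die overhead are this thread's assumptions, not published numbers):

single_die_w = 8.0                 # assumed draw of one die
dual_die_w = single_die_w * 1.5    # a dual die draws 50% more than a single die
dual_dies = 8

core_power = dual_dies * dual_die_w
print(core_power)                  # 96.0 W, leaving headroom within a ~140 W board budget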
July 8, 2008 9:00:27 AM

But it says 300W TDP over there... maybe the 32 cores are really hungry
July 8, 2008 9:07:09 AM

I'm thinking that's FUD allowed by Intel to throw people off.
July 8, 2008 9:17:08 AM

Romulus, yeah, 300W TDP with current technology and at 2GHz. Here's an interesting little piece of info from my overclocking: a Q6600 @ 3.2GHz requires 133W; at 3.6GHz it can burn through up to 200W. A small bump in clock speed, but the CPU wants a lot more power. So let's assume the die shrink doesn't help; they can still scale back to 1.6GHz, which would knock a huge amount off the power requirements. Wait, this is Intel, their die shrinks always help.

EDIT: Also, they state it will support multithreading, which presents multiple logical cores from a single physical core much like Hyper-Threading does. That means it could actually have only 16 physical cores, halving the power requirements.
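The jump from 133W to 200W makes sense if you remember that dynamic power scales roughly with frequency times voltage squared, and higher clocks usually need a voltage bump. A rough sketch (the voltages here are hypothetical, purely to illustrate the scaling):

def dynamic_power(p_ref, f_ref, v_ref, f, v):
    # Scale a reference power figure by frequency and voltage-squared.
    return p_ref * (f / f_ref) * (v / v_ref) ** 2

# Q6600 reference point from the post: 133 W at 3.2 GHz (voltage assumed ~1.30 V)
print(dynamic_power(133, 3.2, 1.30, 3.6, 1.45))  # ~186 W, in the ballpark of the 200 W observed
print(dynamic_power(133, 3.2, 1.30, 1.6, 1.10))  # ~48 W: backing off to 1.6 GHz saves a lot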
July 8, 2008 10:12:46 AM

The Pentium 3 architecture used roughly 1.45V to get to 1.4GHz, and that was at 130nm. Considering that they use 32 processors, each with SIMD features that will increase the utilization of the separate cores, I think 300 watts is not that far off. It may be a conservative estimate to stay on the safe side, but once all those processors start running at full speed, it will consume a lot of power.

For those of you concerned with that, please don't believe that Intel is doing this to supply us with the next best graphics card. To them, this is something they can use in the server segment. And having 32 OOO cores capable of x86 code running at 2GHz each is quite something. Compare their processing power with that of a 32-processor rack and the power consumption doesn't look bad, not bad at all. Add in the reduced space requirements and ease of installation and you can see where this is going...
July 8, 2008 10:13:34 AM

I was assuming these are going to be modified 'Atom' processors, since a few sites mentioned that these cores were going to process instructions 'inorder'.
July 8, 2008 10:16:53 AM

X2_Server, I had also heard the same, but perhaps Slogogob is right and these are the ones meant for servers, much like Nvidia's Tesla/Quadro and ATI's FireGL ranges are, in which case that thermal envelope makes sense.
July 8, 2008 12:09:36 PM

X2_Server said:
I was assuming these are going to be modified 'Atom' processors, since a few sites mentioned that these cores were going to process instructions 'inorder'.

Using the Atom itself has advantages, but some severe disadvantages too. One of the advantages would be the in-order architecture, given the planned usage.
One major disadvantage would be inter-chip communication: while hardly a priority for a GPU, it would suffer greatly because of the bus interface used. Looking at the server space and Intel's 4P and 8P processor scaling makes me wonder how they will realize 32 processors on a single PCB. To my knowledge Intel will use QuickPath, something the Atom does not have yet. Using QuickPath or a derivative of it is the only logical conclusion if Intel intends to make Larrabee more than a really basic vector processing unit.
The Atom processor provides basic multithreading similar to the P4's Hyper-Threading. I read that Larrabee will provide up to 4 threads per core, though.
In the end Larrabee will share some features with the Atom, but it will be something of its own.
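To get a feel for why several hardware threads per in-order core matter, here is a toy utilization model (the cycle counts are made up purely for illustration, not Atom or Larrabee figures):

def core_utilization(threads, compute_cycles, stall_cycles):
    # Fraction of time an in-order core does useful work when 'threads'
    # hardware threads interleave to cover each other's memory stalls.
    return min(1.0, threads * compute_cycles / (compute_cycles + stall_cycles))

# Example: 20 cycles of work followed by a 60-cycle memory stall per thread
for t in (1, 2, 4):
    print(t, core_utilization(t, 20, 60))
# 1 thread  -> 0.25 (the core idles most of the time)
# 2 threads -> 0.50
# 4 threads -> 1.00 (the stalls are fully hidden)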


Here's a snippet from an Intel PDF (Source):

Larrabee Architecture for Visual Computing -- With plans for the first demonstrations later this year, the Larrabee architecture will be Intel's next step in evolving the visual computing platform. The Larrabee architecture includes a high-performance, wide SIMD vector processing unit (VPU) along with a new set of vector instructions including integer and floating point arithmetic, vector memory operations and conditional instructions. In addition, Larrabee includes a major new hardware coherent cache design enabling the many-core architecture. The architecture and instructions have been designed to deliver performance, energy efficiency and general purpose programmability to meet the demands of visual computing and other workloads that are inherently parallel in nature. Tools are critical to success and key Intel® Software Products will be enhanced to support the Larrabee architecture and enable unparalleled developer freedom. Industry APIs such as DirectX™ and OpenGL will be supported on Larrabee-based products.

Here are a few more articles with a more in-depth look at it.

Link 1
Link 2
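As a rough illustration of what the "wide SIMD vector processing unit ... and conditional instructions" wording means in practice, here is a masked vector operation sketched with NumPy (16 elements standing in for one wide vector register; this is a conceptual sketch, not actual Larrabee code):

import numpy as np

# 16 lanes standing in for one wide vector register
a = np.arange(16, dtype=np.float32)
b = np.full(16, 2.0, dtype=np.float32)

mask = a > 7                         # per-lane predicate, like a hardware mask register
result = np.where(mask, a * b, a)    # masked lanes take the product, the rest pass through
print(result)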

July 8, 2008 1:22:30 PM

What are they using to connect all these cores? I thought that the problem with many cores was that the connecting circuitry was just too big. Isn't that why IBM, HP and others are trying light, lasers and mirrors to cram more cores into a chip?

I am curious about that more than anything else.

I Googled Larrabee and I found some info about ~170W.
July 8, 2008 1:26:59 PM

I'm not that impressed by their numbers, since NVIDIA has already stated that by that time, that kind of computational power will already have been achieved (2x 4870 > 2 TFLOPS). The real deal for me is the architecture change, and we'll have to see how the blue guys deal with parallelism. I wouldn't bother with this card, except that I know Intel all too well to laugh them off.
July 8, 2008 1:34:51 PM

I would imagine that when Larrabee comes out, Intel chipsets will stop supporting ATI CrossFire, and to have CrossFire you'll have to use an AMD CPU and AMD chipset. This could really hurt AMD unless they have a strong CPU by then.
July 8, 2008 2:09:49 PM

Cause they have their own GFX?
July 8, 2008 2:17:17 PM

JAYDEEJOHN said:
If it were P4, the pipes would be so long, an entire game would be trapped in Larrabee before we saw anything heheh


HAH!!! wow, I cried after that one!
July 8, 2008 2:32:13 PM

I think Intel has to do something... look at people's signatures here... I've seen builds where someone drops $300+ on a video card only to drop $80-$100 on the CPU and OC the hell out of it. If the GPU really is replacing the CPU as the most important component in a PC, Intel is gonna want its cut.
July 8, 2008 2:54:28 PM

^Good point. Never thought of it that way.

Besides, isn't Nvidia trying to get into the CPU market? AMD is already in both CPUs and gfx.

Intel may lose some CPU market share. Getting into the enthusiast gfx market could balance out that loss.
July 8, 2008 3:59:54 PM

That's just it, Intel doesn't. It looks to be mainly a GPGPU first, GPU second. This doesn't have anything to do with ray tracing, so don't even guess that. Various articles state that Intel thinks CUDA and AMD's GPU solution for physics/encoding etc. will be just a footnote in history, as Intel believes everything done on x86 is superior, mainly due to the devs' familiarity with it. In most apps this is true, and CUDA will really have to take hold fast and furious, and have a great showing as well. As far as this being a GPU, who knows? They'll have a helluva time with their software for DX connections/communications et al. We need more info, and I'm not about to get whimsical about this and deny nVidia's and ATI's rightful spot in the GPU hierarchy. Intel knows that nVidia and ATI just released their new cards, and we get this: a few hints, nothing solid, leaving us all wondering. Bet by the next gen from nVidia and ATI, we'll see more about Larrabee as well.
July 8, 2008 4:17:23 PM

The fact that they are using Pentium 3 cores should be ample information to turn people away. I really don't care how convenient it may be; this is 2008, not 1999. I mean, how much can a few dozen Coppermines do compared to ATI/Nvidia offerings? Die shrinks are overrated, and I think current temperatures and power usage on our GPUs and CPUs show this. These companies keep shouting less power, less power, but as we see it's not changing; ATI is the biggest culprit here. Not to mention prices are NOT cheaper until the part is insignificant.

What's next, dodecagon SLI with Rage 3D chips on 4 PCBs?
July 8, 2008 4:23:08 PM

I thought the 4850 used less power than the 9800gtx+?
July 8, 2008 4:27:12 PM

JAYDEEJOHN said:
I thought the 4850 used less power than the 9800gtx+?


It does, but barely; that was my point, and it's still not much of an accomplishment considering the 9800GTX is a larger card (physically). These die shrinks are just flat-out overrated; the numbers are not nearly enough to justify any change. The difference between consuming 170W under load and 190W under load is insignificant. If people are that worried about saving power and money on their bills then they are doing it the wrong way; they need to stop using PCs, period. Die shrink all you want and add as many EPU chips as you want, it won't matter.

Also, the type of people that are buying these cards will have ample enough power 99.9% of the time to run either, so once again....insignificant.
July 8, 2008 4:54:47 PM

cal8949 said:
I would imagine that when Larrabee comes out, Intel chipsets will stop supporting ATI CrossFire, and to have CrossFire you'll have to use an AMD CPU and AMD chipset. This could really hurt AMD unless they have a strong CPU by then.


Hardly.

I think that would see Intel's market share in the gaming and enthusiast area dry up overnight.

It would be absolute suicide.

Which gets you more, performance-wise (for games, which is the key aspect of said market): an AMD 3200 X2 with an HD3870, or an Intel QX6800 with an Intel IGP?
July 8, 2008 4:57:11 PM

JAYDEEJOHN said:
Various articles state that Intel thinks CUDA and AMD's GPU solution for physics/encoding etc. will be just a footnote in history, as Intel believes everything done on x86 is superior, mainly due to the devs' familiarity with it. In most apps this is true, and CUDA will really have to take hold fast and furious, and have a great showing as well.



It depends on how the various approaches deal with SMP/PVM.


The dominant technique should win out, as at that end of the market, there is no substitute for speed.


July 8, 2008 5:09:32 PM

spathotan said:
It does, but barely; that was my point, and it's still not much of an accomplishment considering the 9800GTX is a larger card (physically).


What's the point exactly? Whether the cards are bigger or smaller is pretty irrelevant.
The HD4850/4870 also have about 50% more transistors in the same die size and outperform the GTX and GTX+ per watt, so if you're talking about efficiency, ATI is fine this round, after having had a terrible problem with the HD2Ks in that respect. So your one-sided statement seems pretty biased, since the efficiency crown goes back and forth.

Quote:
These die shrinks are just flat-out overrated; the numbers are not nearly enough to justify any change. The difference between consuming 170W under load and 190W under load is insignificant. If people are that worried about saving power and money on their bills then they are doing it the wrong way; they need to stop using PCs, period. Die shrink all you want and add as many EPU chips as you want, it won't matter.


Well, it's not just power consumption, it's efficient power consumption. Doubling performance without doubling power consumption is a good goal, and it greatly increases the performance-per-watt figures.
Compare the perf/watt of the GeForce FX or R300 series versus the HD4K series and they are significantly different over a short period of time, more so than in most areas.

Quote:
Also, the type of people that are buying these cards will have ample enough power 99.9% of the time to run either, so once again....insignificant.


It's far from insignificant. It may not be a major concern; however, without keeping it in check you will run into a hard limit outside the PC.
It's all fine having a 2kW PSU, but if it's still drawing off the typical 15A 110/120V outlet, you're going to have trouble.

For a single card it's not a major issue, but it still means you have 300W of heat to deal with (TDP is about heat, not primarily about current draw), and 300W is nothing to sneeze at heat-wise; added to the rest of the case components, it's going to be pumping a whole lot of heat into people's rooms.
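The wall-outlet ceiling is easy to work out (the 80% continuous-load derating is a common electrical rule of thumb, not a figure from this thread):

outlet_volts = 120
outlet_amps = 15
continuous_factor = 0.8            # typical derating for a continuously loaded circuit

usable_watts = outlet_volts * outlet_amps * continuous_factor
print(usable_watts)                # 1440.0 W - a "2kW" PSU can't actually be fed from one such outlet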
July 8, 2008 5:16:03 PM

As for Intel dropping Xfire support, I wouldn't be surprised to see it happen.... EVENTUALLY, but not initially. As mentioned, there's little benefit to jumping the gun on that right away, and limiting people's choices doesn't usually go over very well.

However, if they establish themselves as a viable competitor and can offer the Intel MoBo advantage to prop up their graphics division, don't be surprised if Intel doesn't bar them from their platform but simply stops offering them help and early access.

As long as Xfire or SLI can help sell Intel's strategy, they'll want them, but once there's more benefit to favouring their own solution at the exclusion of the rest, then they'll start freezing the others out. If they only lost 3-5% of their MoBo sales but increased their VPU sales by 30-50+% (and those numbers resulted in increased profits and a better long-term future), they'd pretty much be unwise not to pursue that strategy.

It may happen long term, but it's definitely not happening anytime soon.
July 15, 2008 1:15:52 PM

**Snigger**


The closer we get to an actual launch, and the more details get released, the worse this thing becomes.


1st, it is based on Pentium MMX CPU cores.
2nd, it uses around 300W at max power. -> http://www.fudzilla.com/index.php?option=com_content&ta...
3rd, it DOESN'T have power-saving features. -> http://www.fudzilla.com/index.php?option=com_content&ta...
4th, it will not be cheap. -> http://www.fudzilla.com/index.php?option=com_content&ta...


Add in things like memory bandwidth (128 GB/s as of 2006 presentations) that is already lower than the market leaders', and it seems that, as far as a GPU goes, first-gen Larrabee is gonna be a dog.

July 15, 2008 1:57:29 PM

I find it a bit odd that there's so much negative FUD about Larrabee, especially given Intel's recent track record for making good stuff. I know no one's perfect, but you'd think with Intel's steam right now that they'd be smart and design and build everything fairly well. And even if there are problems, they could just delay the thing until it's ready and perfected. It's what they're doing with Centrino 2.
July 15, 2008 3:18:27 PM

mathiasschnell said:
I find it a bit odd that there's so much negative FUD about Larrabee


It's only FUD if it is not true.


There are a lot of stories about Larrabee out at the moment - none of them good.
July 15, 2008 4:07:06 PM

There's also no solid information, and what little we do have is from questionable sources at best.

I'll wait for the more reputable numbers before making a decision as far as Larrabee is concerned.
July 15, 2008 4:30:13 PM

cjl said:
There's also no solid information, and what little we do have is from questionable sources at best.

I'll wait for the more reputable numbers before making a decision as far as Larrabee is concerned.

Which, like I said earlier, we won't see until the next release of cards from nVidia or ATI. Intel is just trying to let people know they're out there, and they're coming. I don't buy any of this info at all, these measly few breadcrumbs. Wait for the release of the R800 or the G300 before we know anything about Larrabee. This isn't Intel showing off their pretty C2Ds, where they had a poor but well-marketed product in the P4 already in the fight. Larrabee will either have to blow the doors off anything that's out, and then we will hear allllllllllllllll about it, or, if we only get these breadcrumbs, it'll be just another attempt at a competing product in a well-established field.
July 16, 2008 12:20:40 AM

Intel know a lot about chip design. They also know a fair bit about marketing. They also know a great deal about chip manufacturing and, importantly in this instance, how to do it cheaply. Look at the base price for Atom CPUs - under $5 a pop.

Looking at Intel's recent track record, and especially bearing in mind that discrete graphics is a new market for them, I wouldn't be surprised if they waited to perfect the drivers in particular before releasing, and very cleverly managed to say nothing about it until a month or so prior to release (without any shortages, of course).

Larrabee has huge, huge promise - don't think that they'll let this be diluted or over-hyped at all in the run-up to release. Don't think that they are overly rushed to release, either. Intel, in their current position, can afford to wait and get it just right.

Let's not forget what 'just right' means for Intel. It doesn't just mean a successful product - it means cold, hard market share almost immediately. They will have an aggressive internal target, be it 5, 10 or 20 or whatever % of the dedicated GPU market, and they need the product, the pricing and the support to make it happen.

Exciting times.
July 16, 2008 4:46:42 AM

The_Abyss said:

Larrabee has huge, huge promise - don't think that they'll let this be diluted or over-hyped at all in the run-up to release. Don't think that they are overly rushed to release, either. Intel, in their current position, can afford to wait and get it just right.

Let's not forget what 'just right' means for Intel. It doesn't just mean a successful product - it means cold, hard market share almost immediately. They will have an aggressive internal target, be it 5, 10 or 20 or whatever % of the dedicated GPU market, and they need the product, the pricing and the support to make it happen.

Exciting times.


I can't say I agree with you on this. Based on the information I've seen, I expect Larrabee to be a tremendous flop for Intel. Most of what I've seen suggests that Larrabee will be capable of 2 teraflops, which would be great if it were released now, but it won't be released until next year. The 4870X2 is expected to be able to exceed 2 teraflops and it will be available next month, not next year. Given that both Nvidia and ATI roughly doubled the processing power of their GPUs over the last 8 months, Larrabee could actually be significantly slower than the competition when it is released.

There is also the fact that Intel has never produced a graphics solution that could be described as being anything better than marginally adequate.
July 16, 2008 5:11:06 AM

Everyone seems to be confused about the 2 teraflop number.

From what I've seen, it is supposed to be capable of 2 teraflops double precision. RV770, AKA HD4870, can only manage around 400 gigaflops double precision (the 1.2 teraflop number is single precision), making R700 AKA HD4870x2 capable of 800 gigaflops double precision.

So, it is a slightly more impressive figure than many here seem to think.
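Working with the figures quoted in this thread (which are the posters' estimates, not official specs), the double-precision comparison comes out like this:

larrabee_dp_gflops = 2000              # the 2 TFLOPS DP figure discussed above
rv770_dp_gflops = 400                  # the ~400 GFLOPS DP estimate for the HD4870
r700_dp_gflops = 2 * rv770_dp_gflops   # HD4870X2 = two RV770s

print(larrabee_dp_gflops / r700_dp_gflops)   # 2.5x, if both numbers hold up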
July 16, 2008 7:35:51 AM

Don't forget also that, until just before production, if things look to be off the pace of where they want to be, Intel can keep evolving the design and push the existing framework onto the parallel-processing market, which is not inconsiderable (DreamWorks, anyone?).

Either way, the graphics market desperately needs a viable third player to compete effectively at the mid and high ends. If nothing else, it will help sharpen the minds of the two incumbents.
July 16, 2008 8:11:28 AM

Well, even if Intel makes graphics on par with ATI and NV (I am kinda pessimistic about it), I am going to skip at least a few generations.
Firstly, I don't like Intel and their marketing strategy, and I know what to expect from ATI and NV graphics but don't know Intel's side; I think game developers will do the same, at least in the beginning.
July 16, 2008 8:26:24 AM

I thought DP wasn't important in graphics rendering? We desperately need a third-party CPU maker that's much larger than Intel, has more money, more resources, more influence, etc. No, we don't need a third gfx maker. It'll be interesting. Needed? No. Does Intel need it? Yes, very much so. They may play down graphics, but in reality, other than business, the vast majority is home buyers who think their CPU is just fine, but would looove to be able to play games and encode videos quickly. Until it's here, or close, it's nothing. A new approach isn't needed yet, and this may not be it anyway.
July 16, 2008 9:49:36 AM

The_Abyss said:
Looking at Intel's recent track record,


Yes... Intel 740.

Enough said.





Oh - and it's the P4 design team building Larrabee, not the Israeli crew that was so successful with the P-M (Yonah). Whether that affects things or not....
July 16, 2008 12:41:07 PM

JAYDEEJOHN said:
I thought DP wasnt important in graphics rendering? We desperately need a third party cpu maker thats much larger than Intel, has more money, more resources, more influence etc etc, No we dont need a third gfx maker. Itll be interesting. Needed? No. Does Intel need it? Yes, very much so. They may play down graphics, but in reality, other than business, the vast majority is home buyers who think their cpu is just fine, but would looove to be able to play games and encode videos quickly. Until its here, or close, its nothing. A new approach isnt needed yet, and this may not be it anyways.


Why do you think that more choice will be to the detriment of the graphics card sector? Look at recent history:

Nvidia - FX 5xxx fiasco
ATI - 2xxx fiasco
Nvidia - no drivers for months

Add in all the positives surrounding each of these events and the outcome is clear - at each point there really was only one logical choice, and that choice seesaws between one and the other.

Why would we NOT want to be in a position where we could potentially choose from 2 success stories out of 3 manufacturers, or, God forbid, all 3? How is that bad?

Also, why isn't a new approach needed? 5 years ago, all GPUs were single cards. Since then we have had multiple-GPU solutions, each done differently. If the developer and driver support is there, why not have another new approach? It may fall flat, but it may also change the direction of the industry. Or it may just be another way.

Would you change your mind if Intel were substituted for another firm - is the dislike that institutionalised?
July 16, 2008 12:43:23 PM

Amiga500 said:
Yes... Intel 740.

Enough said.

Oh - and it's the P4 design team building Larrabee, not the Israeli crew that was so successful with the P-M (Yonah). Whether that affects things or not....


I'm sure there are all sorts of comparisons that can be made. But I'm struggling to understand why people don't want this to be successful.
July 16, 2008 12:49:01 PM

The_Abyss said:
I'm sure there are all sorts of comparisons that can be made. But I'm struggling to understand why people don't want this to be successful.


OK.

Intel starts getting graphics onto x86 programming models.

AMD/ATI can adopt that, as they have an x86 license... what can Nvidia do?

All of a sudden, we are left with AMD/ATI versus Intel in both the CPU and GPU markets.

Given the size of Intel compared to AMD/ATI and the troubles they (AMD) have at the moment, Intel will win that, and go on to dominate both the CPU and GPU markets - leading to a slowdown of progress in both.
a b U Graphics card
July 16, 2008 1:03:33 PM

Exactly what we don't need. @The_Abyss: read what I wrote, I said yet, not ever. I also said that Intel's direction may be the wrong way as well. I don't want the GPU market forced in one direction, and with Intel's money and influence, they may just try. I like nVidia also, and this leaves them little choice, especially if tons of fanboys that don't even game have to have one, giving false numbers. I want Intel to fail. If they didn't have so much influence/money/power, it would be different. I hate having to even say that, as I like tech to go forward, but like the saying goes, two heads are better than one, and this may ruin nVidia and hurt AMD further, which would leave us what? No thanks.
July 16, 2008 1:39:28 PM

Amiga500 said:
OK.

Intel starts getting graphics onto x86 programming models.

AMD/ATI can adopt that, as they have an x86 license... what can Nvidia do?

All of a sudden, we are left with AMD/ATI versus Intel in both the CPU and GPU markets.

Given the size of Intel compared to AMD/ATI and the troubles they (AMD) have at the moment, Intel will win that, and go on to dominate both the CPU and GPU markets - leading to a slowdown of progress in both.


I think you are missing the point. No vendor can dominate the GPU market enough to set its general direction all on its own.

Intel is not developing Larrabee to kick Nvidia and AMD out of the GPU market. Larrabee will have to rely on a software layer to make its x86 architecture compatible with DirectX, just like the software renderers of the past (UT, Quake, etc.). AMD and Nvidia offer specialised chips that do not need much of a software layer and are optimized for DirectX, while Larrabee is a whole lot more "general purpose". It will mainly target Nvidia's CUDA, since they dared to put their fingers into Intel's cookie jar.
If x86 is so terrifyingly successful in the gaming market, where are all those software renderers? Why isn't everyone buying quad-socket, quad-core platforms from Intel and AMD to get their games going at.... 13 frames?
Now people might start yelling "Ray tracing, but ray tracing will..." Nope. Considering all the recently released games using ray tracing and the overwhelming line-up of blockbusters to be released that use ray tracing, Intel just needs to step up and rule them all.
Until we see ray tracing, GPUs will evolve a lot more, and so will Intel's offering. I don't understand why people think Intel will simply own the gaming market with their Larrabee project.
Intel has literally no experience with GPUs. Their onboard solutions hardly do anything. They are basically entering new territory with their competitors having quite a head start. ATI was worth roughly 5 billion dollars and Nvidia comes in at 6 or 7 billion. Since neither owns factories, it's mostly intellectual property, research, licenses, etc. Intel has to catch up quite a bit there.
Depending on the value of the different markets, Larrabee will evolve differently from GPUs. Intel didn't call it a GPU; they stick to "vector processing unit" and other marketing talk. The company is more interested in putting massive amounts of flexible processing power on a PCB. Those things will be the holy grail for render farms, physics nerds and maybe even general number crunching.
x86 may have a lot of advantages for physics, and even for developers who can adapt to it right away, but it comes with some very nasty drawbacks too. There's plenty of 30-year-old software chained to x86, and to keep it compatible there is a lot of junk carried over that slows things down or requires parts that are totally redundant for a rendering chip.

Intel won't dominate the GPU market. They just want their cut, but they will have to fight AMD and Nvidia tooth and nail for every percentage point of market share. If things work out well for Intel, they will become a regular in the GPU market, influencing its direction, not choosing it.
July 16, 2008 1:57:02 PM

Quote:

they do not play nice with other companies.


That's actually something I like about them.

Quote:

they are not someone who likes standards,


That is not true. Intel likes standards. They just tend to make their own (one of the benefits of being the industry leader), but at least they stick to them. Look through the documentation of their processors or their Essential Series motherboards. It's almost art.

Quote:

they like to control things too much, and I think if they did get some success they could, through brute force, try to fracture the market as we have it.


Of course they try to control it. Nvidia does the same, and so does AMD. They are all greedy companies that want money. If one of them controls the market, the consumer has to pay through the nose. That is absolutely clear, and that's why no company should dominate the market. Look back at the 8800 series: Nvidia really had the market. They could simply set prices for performance cards.

Quote:

I could of course be considered cynical, but maybe just realistic.


Maybe. The problem is, that would make the "Intel goes bankrupt in 2008" crowd realists too. Are they? :kaola:
July 16, 2008 2:07:40 PM

I hope you're right. That I can live with. It's just all this Intel fanboyism creeping in, without even a snippet of demonstrated capability, that bothers me. Like I said, until it gets here, I'm not bothering with it. The hype has started, and somehow we'll all have to live with it. I've already seen the hype: ray tracing this and that. Those people know CPUs, but they can't fathom what it'd take to actually render an entire game using ray tracing at decent frame rates, especially an FPS game. Then there's the expectation of those that think, since Intel is entering this market, that Intel's influence will of course dominate and we'll have these killer GPGPUs coming from Intel. I guess my responses are mainly due to this: the hype, the unrealistic hopes, and the lack of knowledge and information about Larrabee.