Where is Larrabee now?

jennyh

Splendid
Larrabee... is it even capable of the multi-screen setups we've seen recently?

Nvidia can react and evolve the g300, and eventually get their own multi-screen card out. Can Larrabee change enough to encompass this?
 

darkvine

Distinguished
Jun 18, 2009
363
0
18,810
No one has any real idea what LRB can actually do, and Intel has tons of time to change up the game as it isn't coming out for quite some time. I am sure it can at least handle 2 screens, as that is just bloody standard now.

As for multi-screen setups, that is a small share of the overall market when it comes down to it, and Intel will ALWAYS at least have that 'everyman' share of sales from people who don't care about or know anything about graphics cards and will buy it because it is Intel.
 

darkvine

Distinguished


True, it's cheaper, but not amazingly so other than compared to what it used to be. And it isn't just price but space; even running just 2 large monitors needs a bit of desk space.

Also, another part of what I meant was just how many monitors. How many people really want more than 2 or 3? So when going on about how Nvidia or LRB will answer the new multi-monitor setup, we are back to both price and the people who actually want it.

What is the market for people who want 6 damn monitors? And most people with 3-monitor setups have more than 1 card anyway.
 

Small businesses that want to expand their digital signage, for one, and all those admins and PAs who have to run everyone's lives while at the same time showing visitors which rooms are booked and which ones are free. You really do not seem to have any comprehension of what ATi have shown to be possible now: spanning an image over nine screens used to require kit costing several thousand pounds; soon it will be available for a few hundred, and it won't require a small team of AV engineers to install it, set it up and teach the customer how to use it.
 

darkvine

Distinguished
Yes, but you're talking about small businesses, which isn't what I am talking about, as the OP was aimed at the market at large. When it comes to businesses, that is a completely different market, and they will eat this new technology up like wildfire. It is perfect and pretty much what they have been demanding.

But when you're talking about gamers and normal/heavy computer users, there isn't that huge of a market past people with money to blow.
 

darkvine

Distinguished
I totally agree, but what I am getting at is that if you don't point that out in the OP, it loses its effectiveness and becomes fanboy ranting, by just stating it at large while the market for it is more specialized.

The OP sounds more like a fanboy/girl call-out of the other companies rather than making a point of how amazing this new technology is. That was all I meant.
 

jennyh

Splendid
I was actually wondering about Larrabee's ability to display on multiple screens. Up until now we've only heard about how it can do x fps on x cores on 1 screen.

As for the point brought up about it being a small market, I don't think it was so much about the cost; some of it was about how it didn't really work properly. This Eyefinity thing 'just works', if you believe what you've read.
 
The main thing to remember is that multi-monitor isn't hard to do for easy tasks; you've been able to do that since the R9000 on the Colorgraphic Xentras across 8 screens, but you couldn't play games on them very easily, as they essentially only gave you basic features and power.
What was impressive was that it was a game, and at a very high resolution (even per panel). For plain multi-monitor support it just needs the add-on hardware to support the X # of monitors/connections.

LRB probably has the underlying horsepower, being a multi-core/multi-thread/etc. design, and shouldn't have problems with a certain level of similar tasks, but how effective it is will depend on its thread-handling capability (something always rumoured to be a big HD5K upgrade from way back).

Of course the answer to the title of this thread: Where is Larrabee now?

is: The same place it has been for a while... behind locked doors at Intel, being tested/perfected.

As always, only time will tell... what they decide to focus on and provide for in their initial hardware.
 

jennyh

Splendid
Nvidia have been saying that they see their future away from discrete GPUs (or something like 25% discrete GPUs, 75% other), which, considering that is what the company was built on, seems a little bit odd.

Clearly they are branching out, which is good for them, but if they weren't they would be in serious trouble. They are in serious trouble, that much is clear, but it could have been a lot worse.
 

darkvine

Distinguished
Serious trouble? How so? They have a huge lead over ATi in profits as well as sales numbers in the current market. While the 5000 series is going to build a very large lead due to being on the market longer, and showing off its power before Nvidia even releases benchmarks, that doesn't mean they are in trouble.

Whether or not it is true, Nvidia has made a name for itself as the best; it has a very powerful brand name that could sell anything it is put on, so branching out is only a smart move. It doesn't say they are in trouble, but that they are in control.
 

jennyh

Splendid
Nvidia are in serious trouble this round because they don't have Eyefinity. They might conjure up something that resembles it, but you're gonna need a card for every 2 monitors. Unless you believe that Nvidia came up with this idea at the same time as ATI did?

I don't think so. Nvidia have no history of innovation or of using DisplayPort. What you're gonna get is a large and powerful g300 that nobody cares about, because single-screen gaming is on the way out.
 
OK, more rumors.
There are a few people working on LRB who have seen it in the wild, and rumor has it that it should do 2 screens for sure, and possibly more, but 2 for sure.
Multi-monitor will be a nice seller, as a lot of stock watchers need it, and it'll definitely give them an edge, so they'll sell well. Having this ability will catch on and the numbers will grow; maybe not a lot, but a lot more than we currently see.
 

jennyh

Splendid
Wouldn't 2 screens halve the fps though, jaydee? (On Larrabee.)

I kept reading about Larrabee being capable of 60fps using 32 cores at whatever resolution it was. The cores aren't shrinking without a die shrink to 32nm, which, although close for Intel, for sure won't be what Larrabee releases on anytime soon.

From wiki :- "As of June 2009, prototypes of Larrabee have been claimed to be on par with the nVidia GeForce GTX 285."

That looked not so bad until this news 3 days ago; now it looks like yesterday's tech. How can Intel improve on that without moving to 32nm... and even then, can it still possibly be as powerful as the new cards coming out? I don't see how it is possible.
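
For what it's worth, here is a rough back-of-envelope sketch of that halving worry. It simply assumes frame rate scales inversely with pixel count when you span a game across screens, and it reuses the rumoured 60 fps / 32 cores / 1600x1200 figures quoted above; none of these are confirmed Larrabee specs, just the numbers floating around this thread.

# Back-of-envelope only: assumes fps falls in proportion to extra pixels
# (i.e. the card is purely pixel/shading-bound), which real games rarely are.
BASE_PIXELS = 1600 * 1200   # resolution quoted in Intel's SIGGRAPH figures
BASE_FPS = 60               # the oft-quoted 60 fps target
BASE_CORES = 32             # rumoured core count behind that figure

def estimated_fps(num_screens, cores=BASE_CORES):
    """Very rough fps estimate when spanning the same game across more screens."""
    pixels = BASE_PIXELS * num_screens
    return BASE_FPS * (BASE_PIXELS / pixels) * (cores / BASE_CORES)

for screens in (1, 2, 3):
    print(screens, "screen(s): about", round(estimated_fps(screens)), "fps")
# prints roughly 60, 30 and 20 fps -- so under this crude model, yes,
# two screens halve the frame rate and three cut it to a third.

Of course a real game isn't purely pixel-bound, so the actual drop could be smaller; that's exactly why the thread-handling point raised above matters.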
 

jennyh

Splendid
And more from wiki :-

"Intel's SIGGRAPH 2008 paper describes cycle-accurate simulations (limitations of memory, caches and texture units was included) of Larrabee's projected performance.[10] Graphs show how many 1 GHz Larrabee cores are required to maintain 60 FPS at 1600x1200 resolution in several popular games. Roughly 25 cores are required for Gears of War with no antialiasing, 25 cores for F.E.A.R with 4x antialiasing, and 10 cores for Half-Life 2: Episode 2 with 4x antialiasing.