Solved

Larrabee versus ATI/Nvidia: are we getting more choices?

Tags:
  • Graphics Cards
  • Nvidia
  • ATI
  • Graphics
Last response: in Graphics & Displays
August 22, 2009 3:24:46 AM

So Larrabee is coming soon and Nvidia and ATI don't seem to mind. Does Intel actually pose a threat, or is this just going to be advanced integrated graphics?

And what does Larrabee have over Nvidia/ATI, and vice versa?

I know this might look like trolling, but it gives me some good info


August 22, 2009 4:09:36 AM

Nobody knows what Intel's got cooking... we just have to wait and see
August 22, 2009 5:28:42 AM

Yeah, the last time Intel tried their hand at dedicated graphics cards they flopped, and they flopped big. Of course, this time they actually intend to put some effort into it and not do it half-assed. Anyway, I wouldn't expect too much out of Larrabee. At best it will be adequate. What's important is what Intel learns from it and what they put into what comes after.
August 22, 2009 5:50:55 AM

I think that these cards are aimed more at people who will buy them simply because they know the brand: the kind of people who either just started to get into computers or are buying from an OEM like Dell or HP.

I am not saying these cards won't be powerful, no one can say for sure, but I think they would have to be amazingly powerful or bring something big to the table in features before they win over any of the ATI/Nvidia fanboys.

Then we also have to take into account that this is Intel we are talking about. They are not the best at pricing, so when it comes down to it we could see more people sliding to ATI simply because "oh look, another overly priced company". I don't think ATI and Nvidia are worried, because they know people will buy their cards, and even if Intel does come out with something amazing, ATI and Nvidia will already have you hooked with their new lineups.

Best solution

August 22, 2009 11:01:41 AM

Larrabee could be very interesting, especially when it comes to real-time ray tracing and being able to upgrade to any future DirectX version without having to buy a new video card. These are the things that keep me focused on Intel Larrabee, but Larrabee's performance might be slower than current ATI and NVIDIA video cards. It would surprise me if Intel Larrabee really allows real-time ray tracing, and if there will be real-time ray-traced games in the near future that put the rasterisation graphics era to an end (I know that it is unlikely to happen soon). However, I heard that Intel will release the PC game Project Offset/Meteor, which could potentially enable real-time ray tracing and act as a demo to show what Intel Larrabee is capable of.

Another interesting thing about Larrabee is that it is a video card for general use, and it allows users to program it for specific purposes: there might be software for Larrabee's functionality, rather than fixed functions that are permanent and cannot be changed like the ones in ATI and NVIDIA cards.

Anyway, let's hope that it will offer many good new things that benefit us.
August 22, 2009 11:13:39 AM

I'm really optimistic about Larrabee actually, and if it performs comparably to the higher end cards when I'm looking to replace my 4870x2s, I'd definitely try it. Right now, it's hard to say though, since there hasn't been a ton of info.
August 22, 2009 12:03:50 PM

1) Larrabee will only succeed if developers start programming games for the x86 architecture.

2) If it does succeed, there's a very high chance that Nvidia will be out of business.
August 22, 2009 3:26:44 PM

@helloworld, why would Nvidia be out of business? And wouldn't x86 be a downgrade?

@techno boy, what is real-time ray tracing?
August 22, 2009 4:01:51 PM

^ Nvidia would be out of business because they have nothing to fall back on, unlike AMD: if AMD gets owned they can fall back on their CPUs, chipsets and IGPs. Nvidia can't do that, as they've pretty much been kicked out of the Intel chipset market and have no x86 license.

x86 seems like a downgrade, but because Larrabee is so powerful it should be more of an upgrade, although the rumoured 300 W TDP could put off customers, because who wants what is essentially a small heater in their case?
August 22, 2009 4:07:01 PM

So why isn't Nvidia doing anything to safeguard themselves?
August 22, 2009 4:14:25 PM

@SS, naw, just a slow drawn-out one.
LRB is intended to do everything CUDA claims to do, and may do it better, as well as being a gfx card; that's why there's more pressure on nVidia here with LRB.
I agree with helloworld here: if LRB does pick up a lot of x86 devs, and finds its way into a major console, there's trouble ahead for both nVidia and ATI.
The difference here is, LRB and whatever ATI/AMD brings to the table in 2012-ish is going die-side, or fusioned, which will leave nVidia out in the cold like the ugly red-headed stepchild.
Another scenario, and the one I see as most likely, is that lower-class cards will simply disappear as fusion happens, leaving only the higher/high end open for discrete. Though, listening to devs lately, they all sound like making a game is becoming too expensive, progress there will be stalled for quite some time, and it'll only be the renegade rogue camps that push the bubble.
The devs are already outsourcing their artwork for games, and making them cheaper that way has stalled as well. Unless the adoption of DX11, and actually being able to use it, reduces man-hours while still bringing higher eye candy, it'll stall, though I still see an end to DX9.
One positive thing I am hearing is that devs may actually start putting in better story lines with better, larger worlds, but all this is speculation, and we won't know till it happens.
Keep in mind, for every LRB sold, that's one less sale for nVidia or ATI.
August 22, 2009 4:23:11 PM

My guess would be:

Larrabee succeeds > Nvidia is out of business. AMD either A) has a 28nm competitor within a year which is more powerful and can execute x86-64 instructions, or B) creates a fused CPU/GPU which is 75%+ as powerful as Larrabee and costs 20% less.

Larrabee fails > Nvidia is saved for the moment. Everything else carries on as usual.
August 22, 2009 4:43:36 PM

A lot is being put on Intel's competence with LRB, since it's x86.
It kills me to hear some devs say gaming has to change, it's too costly, talking about DX9, praising the SW approach of CUDA and x86, while not mentioning DX compute, DX10 or DX11, or the whole process of GPUs slowly going from complete fixed-function units and evolving into non-fixed-function compute shading on a much more open SW solution using DX11.
I think their point is, the process has been slow, it's actually cost them, and M$ isn't really involved monetarily like Intel will be, thereby getting a better commitment from Intel; that's why they're wanting Intel to drive the direction of gaming, instead of M$.
Problem is, Tim Sweeney has wanted this since before it was possible; now that it's getting close, he and his ilk are foaming at the mouth for it, and totally disowning the path that's made them their fame and fortune.
August 22, 2009 4:49:02 PM

Upendra09 said:
@helloworld, why would Nvidia be out of business? And wouldn't x86 be a downgrade?

@techno boy, what is real-time ray tracing?


Ray tracing means rendering graphics by tracing light rays. In physics, light rays bounce/reflect off the 3D objects we see, which gives an object its colour, and shiny 3D objects also reflect other 3D objects. Ray tracing gives photo-realistic graphics which are many times better than the current rasterized graphics in video games; it looks very real, and it would be a major step forward for graphics. :bounce: 

However, the idea of ray tracing isn't new: it is already used for Hollywood's realistic animation, but it is not used in video games yet because it would perform very slowly on current video cards; it takes a lot of GPU power, which is why doing it live is called "real-time ray tracing". Real-time ray tracing is the idea of using ray-traced graphics in a real-time 3D virtual world, like in video games, where you can manipulate it and move around the place freely with the protagonist character, but like I stated earlier, it would require a lot of GPU processing power. :bounce: 

Real-time ray tracing is something we cannot easily ignore, and it would change the graphics era dramatically. It is also possible that Intel Larrabee will support real-time ray tracing, and this is why I am keeping my eyes on it. Anyway, this doesn't only depend on Intel but also on game developers to make such real-time ray-traced games. :D 

I just hope that it is not going to be like: right after we buy an ATI DX11 card or NVIDIA GT300 DX11 card, news suddenly pops up saying that Intel has released a real-time ray tracing card, it is the start of the end of the rasterization graphics era, and we are stuck with obsolete rasterized DX11 video cards. :cry: 
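The ray-tracing idea described above fits in a few dozen lines. This is a toy sketch only (one sphere, one light, ASCII output); the scene values and function names are invented for illustration and have nothing to do with Larrabee's actual design:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None if the ray misses."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0]*ox + direction[1]*oy + direction[2]*oz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*c          # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def render(width=20, height=10):
    """Trace one ray per character cell; brighter = facing the light."""
    sphere_c, sphere_r = (0.0, 0.0, 3.0), 1.0
    light = (-0.577, 0.577, -0.577)         # unit vector towards the light
    shades = " .:-=+*#"
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel onto a small image plane at z = 1.
            x = (i / (width - 1)) * 2 - 1
            y = 1 - (j / (height - 1)) * 2
            norm = math.sqrt(x*x + y*y + 1)
            d = (x / norm, y / norm, 1 / norm)
            t = ray_sphere((0, 0, 0), d, sphere_c, sphere_r)
            if t is None:
                row += " "
            else:
                hit = tuple(d[k] * t for k in range(3))
                n = tuple((hit[k] - sphere_c[k]) / sphere_r for k in range(3))
                lambert = max(0.0, sum(n[k] * light[k] for k in range(3)))
                row += shades[min(int(lambert * len(shades)), len(shades) - 1)]
        rows.append(row)
    return rows

if __name__ == "__main__":
    print("\n".join(render()))
```

A real renderer adds reflection, refraction and shadow rays recursively per hit, which is exactly why the cost explodes compared to rasterization.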


Example: (image not preserved)

August 22, 2009 5:03:37 PM

A good example of RT
August 22, 2009 5:06:42 PM

At techno, I have a major problem with that comparison: the saucer shows no depth, there's really no shadowing being done, etc.
Looks like a rookie dev did it
August 22, 2009 5:10:33 PM

But then, there's many here that don't think things like SSAO or HDAO mean much either when it comes to eye candy, and still want DX9 over DX10 or 10.1, or don't have much liking for DX11.
It's what the devs want to put into it
August 22, 2009 5:12:36 PM

JAYDEEJOHN said:
At techno, I have a major problem with that comparison: the saucer shows no depth, there's really no shadowing being done, etc.
Looks like a rookie dev did it


I got that comparison from somebody's blog, but I am not sure what you meant by no shadowing and no depth.

Is that still a true ray-traced graphic, or did you mean it was badly done by some noobs? :??: 
August 22, 2009 5:14:20 PM

Things to bear in mind while looking at that RT image: it's too perfect.
Where are the fingerprints? Doing things like fingerprints in RT is going to really cost, a smudge, whatever.
Imagine a race game, with crunching and dirt flying...
August 22, 2009 5:15:11 PM

I meant the raster image, it was poorly done. The RT one is fine
August 22, 2009 5:17:42 PM

The reflections in raster are much more difficult, especially concave scenarios, but can be done.
Silver can still be silver, and not some butt-ugly gray; there's simply no depth to the saucer, the placing of a few things is poor, but hey, the cup looks ok heheh
August 22, 2009 5:23:17 PM

JAYDEEJOHN said:
Things to bear in mind while looking at that RT image: it's too perfect.
Where are the fingerprints? Doing things like fingerprints in RT is going to really cost, a smudge, whatever.
Imagine a race game, with crunching and dirt flying...


That has to do more with texture rendering or texture mapping, like how you can see rust on an old steel door. Even 3D Studio Max allows textures like dirt, rust, fingerprints and so on. However, game devs can program Larrabee to do whatever they want, since Larrabee would allow that, so it is going to be more specific, and perhaps also more general-purpose, than the current cards from ATI or NVIDIA. Different game devs could get different specific functions out of Larrabee; they just have to program it.

It is still going to be closer to photographs than the current rasterized graphics, but it cannot be as perfect as reality for the time being. :) 
August 22, 2009 5:25:45 PM

He meant it's a DX7-level example of raster graphics; reflections, shadowing and depth can be done far better than that example shows.

You can show all the same things in a still from both, really. Where they differentiate is the level of maneuverability you get with RT versus raster as the bodies get more complex and things like varying window transparency change by angle and approach.

Right now raster graphics look much, much better in real time, because the processing power for RT is so demanding; raster tricks help give us awesome-looking images (let's say 93% realistic/accurate) at 30 fps, whereas RT graphics can give us 99.44% accurate images with few visual errors, but at 30 seconds per frame.

Showing the images you did undersells rasterization as much as only showing RT-Quake would undersell ray tracing.
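The 30 fps versus 30-seconds-per-frame comparison above can be put into per-frame budgets. The figures are the post's illustrative ones, not benchmarks:

```python
# Per-frame time budgets for the raster-vs-RT trade-off described above.
raster_fps = 30
rt_seconds_per_frame = 30

raster_budget_ms = 1000 / raster_fps        # ~33.3 ms to draw one raster frame
rt_budget_ms = rt_seconds_per_frame * 1000  # 30,000 ms for one ray-traced frame
ratio = rt_seconds_per_frame * raster_fps   # exact integer ratio: 900

print(f"raster: {raster_budget_ms:.1f} ms/frame, RT: {rt_budget_ms} ms/frame")
print(f"RT spends {ratio}x the time per frame")
```

So closing the gap needs roughly three orders of magnitude more throughput, or cheaper approximations of the RT effects.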

To the OP: Google is your friend for the "what is..." questions; try to move the conversation along with "what do you think of..." instead, and google the stuff in the answers you don't get. ;) 
August 22, 2009 5:34:12 PM

I remember that there is also a ray-traced "Ferrari Car" which looked amazingly real, but I forgot the link. I think it is also on YouTube, if I remember correctly.

I hope that something that real will exist in video games in the near future. That was awesome and incredible! :o 

Also, 60 fps should be enough, since LCD monitors will not let us go above 60 fps, and human eyes will not recognize the difference between 45 fps and 60 fps, based on eye doctors' information. I hope that Intel Larrabee will support real-time ray tracing, giving about 40-60 fps in ray-traced games; that would be enough to satisfy most of us. I also hope that Larrabee comes out around Q4 2010 to compete with the ATI and NVIDIA DX11 cards, instead of Q1 2010, so we could compare them and see the advantage of Intel Larrabee over ATI and NVIDIA's DX11 offerings.
August 22, 2009 5:45:58 PM

Upendra09 said:
So why isn't Nvidia doing anything to safeguard themselves?


Helloworld_98 said:
^ they can't.

That is more or less true. Don't forget that nVidia has CUDA, which is getting a bit of a hold in the CAD/CFD and maybe even video/photo editing markets.

Quote:
people who think you can't see anything above 20-60 fps don't deserve a voice on the internet(s).

+1000000. Agreed! You can (at least I can) tell the difference between 40-60fps and 120fps+.

As far as real-time raytracing goes, nVidia/ATI has the best chance of doing something (esp. AMD, as they have both GPU AND CPU tech).

Another possibility is that if this takes off, nVidia could be bought by some big company such as IBM. I'm not sure on this, but I think nVidia could get an x86 license from IBM or VIA, correct?
August 22, 2009 5:49:26 PM

LCD monitors run at 60 Hz, which is 60 fps max. Some LCD monitors might offer a 120 Hz option, but on a 60 Hz panel you still would never see above 60 fps no matter how powerful your video card is, even if it were powered by nuclear energy. Lol! :D 

Right, now I better shut my mouth, or else I would never get to see anything below 60 fps. Lol! (just kidding) :D 
August 22, 2009 5:50:12 PM

I still say this all comes down to the die-hard fanboys and whether Intel can do something to make them switch. I also think it won't happen at first: people will want reviews, benchmarks and some hands-on before they will drop the money Intel is likely going to be asking.

And from what I have heard, the Intel cards won't be out until Q1 of next year. By then ATI and Nvidia will likely have a foothold in sales with the 300 and 5000 series. That will hurt Intel's sales quite a bit, I think, as most people don't have the money to buy another high-end card.
August 22, 2009 5:52:39 PM

Techno-boy said:
I remember that there is also a ray-traced "Ferrari Car" which looked amazingly real, but I forgot the link. I think it is also on YouTube, if I remember correctly.


Yes, but that's old too. Look at last year's ATi demo day, and look at the car and cut scene there with reflections, depth of field and everything; compare that Ferrari to those two (which are being rendered real-time) and you won't care too much about 'playing' the early RT stuff, you'll just hope for them to get the hardware soon that will start getting us toward parity. The latest best RT demo car I remember was a Bugatti Veyron, but admittedly that was a few months ago; stuff in this field changes quickly for both sides (just look at the DX11 demos, with full tessellation [check the edges to see the advantage of creating relief, not just parallax illusions]).


Quote:
Also, 60 fps should be enough since LCD monitors will not allow us to go above 60 fps and human eyes will not recognize the difference between 40 fps and 60 fps based on eye doctors' information...


Stop now, you're becoming one of those people who should be disconnected from the internet(s).

Also, Larrabee will not usher in real-time ray tracing; that will be a while out for anything but small-resolution/object games/demos. Real-time IMO is about a 2012-15 thing depending on requirements. Just think of the RT workload of a foliage-infested Crysis or Oblivion; it's not going to be handled live by Larrabee or anything near-term.

Don't confuse yourself, Larrabee is not all about RT, and it's still going to be primarily for low-end usage at first. JDJ is talking about future iterations, not current ones, and you will likely see a lot of shortcuts, tricks and DX/OGL fallbacks for this first-generation hardware. But having another entrant in the market with the clout of Intel helps to shake things up and make the other two work harder, and hopefully this is a good thing, as long as there isn't only one left standing.

Anywhooo, I gotta go to the lake, just thought I'd pop in for a sec, Ciaola! :sol: 
August 22, 2009 5:58:19 PM

The ray-traced pics look extremely real, especially JD's... but how do I know that's not a photograph from a 12.3 MP camera?

Not that I am doubting you, but my point is: where does ray tracing have to end before everything looks so real that you might as well take a picture of everything and make a game out of that?
August 22, 2009 6:04:10 PM

Before I go, just something to look into which has always been seen as the bridge between raster & RT, and which I think may be something that the compute shaders and Larrabee will work on and start to implement: ray-casting. Think of it like the missing link between the two. Early use of ray-casting was limited due to power, but now it should be easier to do, especially with the very vector-oriented processors in unified GPUs that are massive in number compared to the old vertex shaders. When combined with deferred shading on the setup rather than Z clean-up, I think you're going to see more of that implemented nearer-term to achieve RT-type images at raster-type speeds. The HD4K did a demo earlier in the year, and it was pretty impressive, although misnamed as true RT.

But like JDJ mentions, it's still dependent on the devs, and that's where the money and motivation are.
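The ray-casting idea mentioned above is simple enough to sketch: march one ray per screen column through a 2D grid until it hits a wall, which is far cheaper than full ray tracing. A toy Wolfenstein-style example, with the map and all names invented for illustration:

```python
import math

# A 2D grid "level": '#' cells are walls, '.' cells are open floor.
MAP = [
    "#####",
    "#...#",
    "#...#",
    "#...#",
    "#####",
]

def cast_ray(px, py, angle, step=0.01, max_dist=20.0):
    """March a ray from (px, py) in small steps until it enters a wall cell."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        x, y = px + dx * dist, py + dy * dist
        if MAP[int(y)][int(x)] == "#":
            return dist
        dist += step
    return max_dist

def render_columns(px, py, facing, fov=math.pi / 3, columns=10):
    """One ray per screen column, fanned across the field of view."""
    return [
        cast_ray(px, py, facing - fov / 2 + fov * i / (columns - 1))
        for i in range(columns)
    ]

if __name__ == "__main__":
    # Stand in the middle of the room, face +x (towards the right wall).
    print([round(d, 2) for d in render_columns(2.5, 2.5, 0.0)])
```

Real engines replace the fixed-step march with a DDA grid walk and use the per-column distance to scale wall slices, but the cost model is the point: one cheap ray per column instead of many bounced rays per pixel.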
August 22, 2009 6:10:51 PM

Upendra09 said:
The ray-traced pics look extremely real, especially JD's... but how do I know that's not a photograph from a 12.3 MP camera?

Not that I am doubting you, but my point is: where does ray tracing have to end before everything looks so real that you might as well take a picture of everything and make a game out of that?


OK, one last one: RT can be manipulated, the picture cannot. In order to play a game with 2-4 megapixel images of every view, you would need terabytes of data. Even rotating around an object with exact spacing through 360 degrees, and saying you didn't require images for fractional degrees, would require 360 pictures for each object at whatever bit depth you needed for the game. That's an incredible load just for that one object, let alone a bunch in a room at various distances; then add effects like smoke, then you need a whole different folder/batch for pre & post explosion/fire, etc., and then you need pictures for lighting changes, etc. It's just impractical even for a one-room game, let alone a whole game world. If you create render rules (either for raster or RT) then you don't need a ton of pictures; you can just say render this object and its lighting based on it being a pot, with this wavelength of colour and these reflective properties, and then have it inserted into the view/reflection calculations.
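The pictures-for-every-view estimate above can be roughed out with arithmetic. The resolution and object count below are illustrative assumptions, not figures from the post:

```python
# Back-of-envelope cost of storing a photograph for every viewing angle
# of every object, per the argument above. All parameters are assumptions.
BYTES_PER_PIXEL = 3            # 24-bit colour, uncompressed
WIDTH, HEIGHT = 1600, 1200     # ~2-megapixel images
ANGLES = 360                   # one photo per degree, one rotation axis only
OBJECTS = 50                   # a modest single room

image_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
total_bytes = image_bytes * ANGLES * OBJECTS
print(f"{total_bytes / 1024**3:.1f} GiB for one room, one lighting state")
```

That is roughly 100 GiB before adding a second rotation axis, lighting states, or damage variants, which is why render rules beat stored photographs.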

BTW, RT graphics are usually what you see in movies, but they are using render farms that take seconds to generate one frame; trying to get that level of realism down to real time on a desktop is a little difficult, and that's the thing holding us back.

Anywhoo, gotta fly...

August 22, 2009 6:35:26 PM

By RT movies you mean the animated ones, right?

And I tried looking this up on Google, but what are the main differences between DX8, 9, 10 and the new 11?
August 22, 2009 7:12:42 PM

Google one at a time; it'll show the differences from one to the next. Hint: start with the lowest number
August 22, 2009 11:59:19 PM

^ I'll try again
August 23, 2009 12:39:50 AM

Just to mention, I think it took 6 hours to render one frame of "Cars" on the render farm, and Cars wasn't completely ray-traced; they still used rasterization for non- or less-reflective surfaces.

I think TGGA's prediction of 2015 is reasonable, at least for the first real-time ray-traced things to come out.
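Taking the 6-hours-per-frame figure above at face value, the speedup needed to reach real time is straightforward arithmetic:

```python
# How far a 6-hours-per-frame render farm is from real time.
seconds_per_frame_farm = 6 * 3600      # 6 hours per frame, per the post above
target_fps = 30                        # a common real-time target

# Real time means producing target_fps frames every second, so each frame
# must finish target_fps times faster than one second allows:
speedup_needed = seconds_per_frame_farm * target_fps
print(f"{speedup_needed:,}x faster")
```

A 648,000x gap (against a whole farm, not one desktop GPU) is why estimates for real-time ray tracing land years out rather than in Larrabee's first generation.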
August 23, 2009 2:26:18 AM

I think I read somewhere that LRB is roughly the power of a current GTX 285, and when it releases, the 485 might already be available. Ray tracing looks pretty cool, but I was under the impression that the main reason for LRB was that Intel saw the CPU's days as the PC's powerhouse coming to an end, with GPU-based processing taking over. Whatever the reason, more competition in any market is better for the consumer.
August 23, 2009 3:36:50 AM

Yes, but if ray tracing does well then, supposedly, Nvidia is out, and then it is still two companies
August 23, 2009 3:42:24 AM

That will be interesting to see: AMD/ATI vs Intel/Intel... hahaha, RED vs BLUE (shush, AMD is not green :p )
August 23, 2009 4:26:43 AM

I still want ur internet.
If that becomes the case then there is going to be a lot of heat on AMD to come out ahead.
August 23, 2009 4:34:13 AM

As much as I love ATI, I hope Intel really does some damage with the first series of LRB... if they don't really stand out from the other two there is no reason to switch... they don't have a fan base... they have to flat-out outperform and out-price ATI/Nvidia to get their name in there... because us consumers really don't care whether it's red, green, or blue... as long as it plows through Crysis at a reasonable price
August 23, 2009 4:45:32 AM

It'll be interesting, as nVidia carries a huge marketing name with it, and it shows, especially with the 4xxx series and the slow inroads ATI's made with it.
It's basically controlled the pricing, led in that department, and doesn't show a lot for all they've done.
Then you have the ding-dong-ding-ding thing, and Intel owns there
August 23, 2009 7:26:38 AM

xc0mmiex said:
As much as I love ATI, I hope Intel really does some damage with the first series of LRB... if they don't really stand out from the other two there is no reason to switch... they don't have a fan base... they have to flat-out outperform and out-price ATI/Nvidia to get their name in there... because us consumers really don't care whether it's red, green, or blue... as long as it plows through Crysis at a reasonable price



It will be hard to out-price ATI while still offering a great card. And again, this is Intel... not the best-known around for graphics. And the taste of their last failure of a card is still strong in the mouths of those who know better.

Should be fun to see what happens, but I know I am sticking with ATI unless the performance AND price are better, because if even one of those fails I still want to support ATI as a company I believe in, offering both power and price.
August 25, 2009 2:12:01 AM

So will there be any DirectX-like support for LRB+RT as the tech gets better?
August 25, 2009 2:23:45 AM

It's a wait-and-see how this new hardware will pan out.

With ATI and NVIDIA pushing out new-generation cards every 6 or 8 months, Intel's solution won't necessarily address high-end game applications.

Intel will strengthen its hold on average or mid-range applications.

Math-intensive calculation will provide a great advantage for Intel. Users won't need to pay for an additional physics card; it's already with the CPU.

Software support will play a big role in this.
August 25, 2009 2:50:17 AM

The best way to go about this would be to pair LRB with Intel procs so that they work well with each other, and so they can be compared to the 260+ and 4850+ gfx cards