
Larrabee vs GeForce 9800GT vs Radeon 5770 vs GTS 250

Last response in Graphics & Displays
November 10, 2009 4:33:33 AM

As we know, Larrabee will be released in January 2010. For a mid-range graphics card, which do you think offers the best cost/performance? Will Larrabee win the mid-range GPU war?
November 10, 2009 4:47:12 AM

Links?
November 10, 2009 5:03:32 AM

The first video cards featuring Larrabee are likely to be released in the first quarter of 2010.[1][2][3][4]
This comes with bracketed footnotes/included links. Looking at No. 4 we see:
http://www.tomshardware.com/news/intel-larrabee-gpgpu-g...
which states 1st half of 2010, not January.
Dont always trust wiki
November 10, 2009 5:05:17 AM

The competition for the 5770 is the 4870 and the 260

http://www.tomshardware.com/reviews/best-graphics-card,...

As for Larrabee.....I think January is wishful thinking for what has been reported to be a 1st quarter release.....methinks Intel is optimistic. In any case, I don't expect Larrabee to compete in the upper mid range to high end for about 2 years .... various Intel execs have said as much in the trade press.
November 10, 2009 5:18:17 AM

JAYDEEJOHN said:

Dont always trust wiki


What are you talking about? Wiki never lies.... Fermi is coming THIS MONTH... WIKI TOLD ME SO !! :whistle: 
November 10, 2009 12:18:33 PM

Larrabee in january 2010? Lol not a chance.
November 10, 2009 2:11:39 PM

jennyh said:
Larrabee in january 2010? Lol not a chance.


they will release it....just as fermi will came out before christmas

then it will happen like this:

chipset= owned by gma series

low entry = radeon 5500/5600 will be owned by chrome 480/580gtx and geforce 9500/gt220, larrabee x2.

mainstream = own by gt250/9600/9800gt and larrabee x4, the price will sure kill amd's 4770/5750/4830.

mid range = gtx260/275 still stay strong, larrabee x8 will pwn r700 and 4870/4890/jupiter still got owned. even geforce 9800gtx+ still stay in the game.

high end = fermi 360/375gtx, larrabee x16 will just just murder cypress. also gtx285/295 are killing 5850/5870/5890 in p/c

anthusiaist: 385/395 will pwned hemlock in the dirt. but larrabee x32/64 will just eat both alive
November 10, 2009 4:00:41 PM

cheesesubs said:
they will release it....just as fermi will came out before christmas

then it will happen like this:

chipset= owned by gma series

low entry = radeon 5500/5600 will be owned by chrome 480/580gtx and geforce 9500/gt220, larrabee x2.

mainstream = own by gt250/9600/9800gt and larrabee x4, the price will sure kill amd's 4770/5750/4830.

mid range = gtx260/275 still stay strong, larrabee x8 will pwn r700 and 4870/4890/jupiter still got owned. even geforce 9800gtx+ still stay in the game.

high end = fermi 360/375gtx, larrabee x16 will just just murder cypress. also gtx285/295 are killing 5850/5870/5890 in p/c

anthusiaist: 385/395 will pwned hemlock in the dirt. but larrabee x32/64 will just eat both alive


Wait......





















....what?
November 10, 2009 4:05:06 PM

Something tells me, someone hates AMD, and translates that hate to ATI as well
November 10, 2009 4:48:01 PM

You can throw in an unhealthy amount of delusion with that too. :p 
November 10, 2009 4:58:53 PM

Wow, when I first read that post a while ago the poor grammar/internet speak made me miss just how wrong it was.
November 10, 2009 5:02:30 PM

cheesesubs said:
they will release it....just as fermi will came out before christmas

then it will happen like this:

chipset= owned by gma series

low entry = radeon 5500/5600 will be owned by chrome 480/580gtx and geforce 9500/gt220, larrabee x2.

mainstream = own by gt250/9600/9800gt and larrabee x4, the price will sure kill amd's 4770/5750/4830.

mid range = gtx260/275 still stay strong, larrabee x8 will pwn r700 and 4870/4890/jupiter still got owned. even geforce 9800gtx+ still stay in the game.

high end = fermi 360/375gtx, larrabee x16 will just just murder cypress. also gtx285/295 are killing 5850/5870/5890 in p/c

anthusiaist: 385/395 will pwned hemlock in the dirt. but larrabee x32/64 will just eat both alive


Holy crap, is this serious? At first I thought it was some kind of joke, but I couldn't even bring myself to laugh. :heink: 
November 10, 2009 5:25:54 PM

ok, Cheesesubs, who is "Chrome 480/580gtx"? I'd sure like to meet him.

If you really think that a GeForce 9500/GT220 would beat out an ATI 5600 series card, you have really smoked your final rock...

You forget, Nvidia has all but wiped the GT/GTX2xx series off the board.

And lastly! Keep up the good work murdering the english language while confusing the snots out of us. I have lost about 13 IQ points just reading your comment(s).
November 10, 2009 7:08:11 PM

jonpaul37 said:
ok, Cheesesubs, who is "Chrome 480/580gtx"? I'd sure like to meet him.

If you really think that a GeForce 9500/GT220 would beat out an ATI 5600 series card, you have really smoked your final rock....


bet you never heard of chrome from s3 then at least you should know that the series was menufacture by via.

and i wont quote the rest of it because time will prove me right.

jonpaul37 said:
You forget, Nvidia has all but wiped the GT/GTX2xx series off the board.


well...that is news to me. but i'm still seeing gf"9" series on the stock everywhere.

jonpaul37 said:
And lastly! Keep up the good work murdering the english language while confusing the snots out of us. I have lost about 13 IQ points just reading your comment(s).


then dont complain because you had read the post and there is no way you can get your refund back.



EXT64 said:
Wow, when I first read that post a while ago the poor grammar/internet speak made me miss just how wrong it was.


then that means you had been entertained by this post. enjoy!


JAYDEEJOHN said:
Something tells me, someone hates AMD, and translates that hate to ATI as well


no body hate amd personally. it's just i can't stand how people praising that how good amd's products are and even means sometime they make junk and still got admired by some fans.
November 10, 2009 7:39:44 PM

The reason you are still seeing 9-series in stock is because it's all nvidia has at that point in the market.

Rehashed, rebranded 8 series cards competing with far superior last generation ATI products. I don't need to praise ATI to see that they just plain win at every price point.
November 10, 2009 8:11:32 PM

jennyh said:
The reason you are still seeing 9-series in stock is because it's all nvidia has at that point in the market.
that is because radeon 4870/5770 didn't take far advantage over g92. in some game r700 were overwhelmed by geforce 9800gtx. 9600gt still hold its own in the midrange market. so it is no need to wipe off g92 at the moment.


jennyh said:
Rehashed, rebranded 8 series cards competing with far superior last generation ATI products. I don't need to praise ATI to see that they just plain win at every price point.


because until recently amd finally invented arch that "far" surpass g92 (even 4870 had hard time keep out with 9800gtx in some game. at best 4970 only leads up 5~8% over 9800gtx in performance). cypress that is. but it is still not popular yet. it is ridiculous that it took amd 3 years to beat an arch that been around since 2006.

to me that is praise
November 10, 2009 8:21:15 PM

The 250 is nVidias 5th best card. Whats ATIs fifth best?
And where do we see any cards beating ATIs current top?
No?











waiting









No again?
Problem with your whole argument is, theres nothing from nVidia at all.
They had 3 years to make something better than the G92, and theyve come out with what? 4 cards in 3 years?
And it had trouble beating the 98x2, cost over $600, ran hot, had 1 of the highest return rates seen lately on high end.
Theyve got drivers out now that burn your mobo and card.
You arent worth dealing with.
Until you come and join the rest of us in reality, learn up
November 10, 2009 8:30:17 PM

Going to have to say that...

jonpaul37 wrote, "And lastly! Keep up the good work murdering the english language while confusing the snots out of us. I have lost about 13 IQ points just reading your comment(s)."

cheesesubs wrote, "then dont complain because you had read the post and there is no way you can get your refund back. "


...is a win.


I would love to see both Fermi and Larrabee soon, but I'm not holding my breath.
November 10, 2009 8:34:40 PM

they said that lrb is roughly as powerful as a gtx285. i saw lrb x16 up there - can i construe this as 16 gpu cores scaling perfectly to render in real time? forgive me for being skeptical. besides, i still dont think intel is that serious about gaming, they are just producing lrb because they realise the gpu will soon take over the majority of computing from the cpu. Just my $00.02
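The skepticism about 16 cores "scaling perfectly" can be sketched with Amdahl's law. The serial fraction below is purely an illustrative assumption, not a Larrabee figure:

```python
# Amdahl's-law sketch: speedup from N cores when a fraction s of the
# work cannot be parallelized. The serial fraction is an assumption.
def speedup(n_cores, serial_fraction):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# Even a small serial portion keeps 16 cores well short of 16x:
print(round(speedup(16, 0.05), 2))   # 5% serial work -> 9.14
print(speedup(16, 0.0))              # 16x only if literally nothing is serial
```

So "x16" in a product name is a core count, not a performance multiplier.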
November 10, 2009 9:55:02 PM

xaira said:
, they are just producing lrb because they realise the gpu will soon take over the majority of computing from the cpu. Just my $00.02



gpu will not replace the cpu...dont get the drug that nvidia provide. does a gpu have instruction set? no! does a gpu have on die sram cache? no! gpu means "graphical processing unit" which is work on pure graphical usage. plus most of gpu have issue with above 2gb on board video ram because until radeon 5870 is still 32bit processing and intel had em64t that nvidia dont have(amd may have x64 processing but they are having trouble to place in their gpu).

or correcting me how shader unit/texture pipeline/render output unit to do with realtime computing.......

JAYDEEJOHN said:
The 250 is nVidias 5th best card. Whats ATIs fifth best?
And where do we see any cards beating ATIs current top?
No?

waiting



No again?
Problem with your whole argument is, theres nothing from nVidia at all.
They had 3 years to make something better than the G92, and theyve come out with what? 4 cards in 3 years?
And it had trouble beating the 98x2, cost over $600, ran hot, had 1 of the highest return rates seen lately on high end.
Theyve got drivers out now that burn your mobo and card.
You arent worth dealing with.
Until you come and join the rest of us in reality, learn up


gtx295 still pwn 5870 in 2/3 of the game, gtx285/9800gx2 are equally match with 5850, gtx 260/275 owned 5700/4800s and gt250/9600/9800 beat the crap out of 5500/4600s

since you'll going to mention incoming hemlock...too bad. fermi takes all(except multicore larrabee extreme).

and i am live in the reality that nearly everyone are tolerated with amd mistake/praise their long time develop product. or simply just spoiling amd and being harsh to nvidia/intel and even s3? i wonder if there's any justice around......


November 10, 2009 10:05:12 PM

You really need to learn some.
Get out Intel fanboi!!
I was afraid of this, now were (ATI and nVidia gpu users) going to have to face the looney Intel crowd
November 10, 2009 10:10:27 PM

The only link you provided was wrong.
You have no clue as to perf, no clue as to dates, on anything youve said so far. Quit before youre buried. I may have to close this, unless you have some proofs, otherwise, its only pure flamebait
November 10, 2009 10:10:49 PM

And I thought the AMD fanboys were bad. I guess I was wrong.
November 10, 2009 10:15:46 PM

...I think my eyes are starting to bleed.

I take it back, grammar IS a good thing!
November 10, 2009 10:16:28 PM

Id welcome you to my thread, which shows a 21% increase on the new beta drivers, which puts it even with the 295 in just 1 game shown so far, but theres more to come.
Like I said, you havnt a clue, you probably dont even game.
http://hardocp.com/article/2009/11/10/need_for_speed_sh...
And, as far as your 2 outta 3 claim, it shows youd never ever run high end, as those wins are generally at lower res, with no or lil eyecandy, which is not what the top cards are for.
November 10, 2009 10:18:21 PM

lol intel gma rapes amd/nvidia? nope, amd has hd4200 which is at least able to run a source engine game at max settings no aa @ 30fps+. GMAx4500 would only get about 25 fps on same settings and nvidia's 9400 chipset gets 50fps+
November 10, 2009 10:18:41 PM

Wait...is this guy serious in saying that it's an achievement for Nvidia's dual-chip card to be able to beat AMD's best single-chip card roughly 60% of the time? Especially considering that AMD's drivers ALWAYS take a few months to be properly optimized? And even if a GTX285 can hang with a HD 5850, it's still ~$60 more. Add in that neither of the Nvidia cards are capable of doing DX11 (even though it may not matter much yet), and...why are we having this argument?

Dude, come back and argue about Fermi after Nvidia figures out how to mass-produce a working chip that is on the market and has been properly benchmarked. And just to add a little more than that, there is no way that a high-end Fermi card will be at the same price point as AMD's cards, since Nvidia doesn't know how to sell something at a price average people can afford.

Oh, and s3 makes me lol.
November 10, 2009 10:19:59 PM

IDK guys, he may have a point...



Just think how poor nV would look if Lrb beat Fermi to market.
November 10, 2009 10:23:29 PM

Intel Corp.'s Larrabee graphics processor, which is expected to challenge Nvidia Inc. and ATI Corp. in the high-performance desktop and gaming PC market, will arrive early next year, the chipmaker's CEO said during a quarterly earnings call on Tuesday.


From your link.
Its dated April, its saying 1st half. You do know what 1st half means dont you?
In your thread, theres no mention of how many cores, what speeds, etc etc etc. Pie in the sky.
November 10, 2009 10:26:56 PM

Sure, his points are all about 2 non existing cards, with non existing perf abilities, in either the gfx or gpgpu market. No release dates, no size, price, heat, power etc etc
November 10, 2009 10:51:52 PM

JAYDEEJOHN said:
Sure, his points are all about 2 non existing cards, with non existing perf abilities, in either the gfx or gpgpu market. No release dates, no size, price, heat, power etc etc


they are exist.... intel holding it back and push the release date away from original schedule was because of optimization. intel planned to unleash it in 3nd quarter 2009(they anounced it in 2008) but cypress(5800) ruin the party and forced intel to refine the architecture and pushing the date forward. same case also happen to fermi they were caught surprisely by evergreen's low profile release. you cant blame them for it.

btw intel goes with 32nm(or 22nm?) instead of 40nm which is the main reason why were they delay the release of larrabee. and since 32nm processor has successfully on shipping schedule(core i9/i3, pentium e7000s) i believe larrabee will release anytime in early 2010, and defintely not first half. there are only 2 source that claim it will release in first half or april. but since evergreen was anounce to be release in december 2009, it got earlier than consumer's expectation. larrabee might be just like evergreen.
November 10, 2009 10:54:48 PM

Wait, now you say that the 5870 beat larrabee, so Intel went back to 'refine' it? If they are still refining it, then first half will be optimistic.
November 10, 2009 11:00:06 PM

Last I heard, silicon was pretty much ready, they had a few exclusive tests going on, and the drivers were waaaaay off, delaying it further.
Youre acting as if you have exclusive info here, which I assure you you dont.
No one knows of its performance. IDF show was unimpressive to many, so, lets wait and see what they bring, before we start thumping our chests ok?
November 10, 2009 11:04:38 PM

You seriously think that Evergreen and Cypress had a low-profile release? Sure, that's why in June AMD was saying it would be out "sooner than you think."
http://www.anandtech.com/showdoc.aspx?i=3573

Maybe Intel pushed back Larrabee because, while roadmaps are effective for planning in the long-run, they do not account for problems in the R&D cycle. Did you think about that?

Give Intel a few years, and then they might be able to shoot for the performance crown with Larrabee. It won't happen in January, though (or whenever it actually gets released).
November 10, 2009 11:04:47 PM

EXT64 said:
Wait, now you say that the 5870 beat larrabee, so Intel went back to 'refine' it? If they are still refining it, then first half will be optimistic.


it took by surprise. and also the heat cause the delay too. a mainstream lrl will consume 130w TDF, too much for a midrange gpu....not to mention extreme multichip + multicore version....that is the reason why they undergo refining the chip . however even without refine it(32nm). the multicore feature and level 1/2cache +instruction set will still beat evergreen even hemlock is not certain to be a match to "extreme" version of larrabee. intel hold it back shows that they aint hot head like nvidia that only dancing with amd's shadow.

they will make perfect chip that can compete to fermi 395.

JAYDEEJOHN said:
Last I heard, silicon was pretty much ready, they had a few exclusive tests going on, and the drivers were waaaaay off, delaying it further.
Youre acting as if you have exclusive info here, which I assure you you dont.
No one knows of its performance. IDF show was unimpressive to many, so, lets wait and see what they bring, before we start thumping our chests ok?


you are right at this once....the driver issue is the major reason(beside TDF) they cant get the chips under the sunlight. they had been isolate from gpu industry for decade so they would need time to write perfect driver. because of this, even long time driver expert like nvidia can make disaster. why dont we give them a chance like you gave it to amd back to that failure 2900xt?
November 10, 2009 11:08:56 PM

cheesesubs said:
it took by surprise. and also the heat cause the delay too. a mainstream lrl will consume 130w TDF, too much for a midrange gpu....not to mention extreme multichip + multicore version....that is the reason why they undergo refining the chip . however even without refine it(32nm). the multicore feature and level 1/2cache +instruction set will still beat evergreen even hemlock is not certain to be a match to "extreme" version of larrabee. intel hold it back shows that they aint hot head like nvidia that only dancing with amd's shadow.

they will make perfect chip that can compete to fermi 395.


Yep. And you'll pass a grammar course.
November 10, 2009 11:15:54 PM

chedrz said:
Yep. And you'll pass a grammar course.


offtopic: yes!! i did not pass it as you can see that how bad i am in that post! guess my typing skill need to be improve.....

whatever as long as you can "barely" read it, it will be fine.....

November 10, 2009 11:17:52 PM

Im right, which makes you? January release? This beats that, well I have 1 HUGE problem with LRB, you cant find it anywheres!!!!!!!!!!! heheh
Thats my point, and your mistake. Until we have benches we have nothing. Same for Fermi. Unlike the 5 series from ATI, we could sorta project its perf, but only to a point, but ballpark
You can do neither with LRB or Fermi, since both are brand new arch', so again, come back after youve learned a few basics
November 10, 2009 11:19:46 PM

Well, even Fermi we have some benchmark, it should be faster than their last card, the GTX285. As for Intel, all we have to compare to is their IGP, which beating that would be no great achievement.

Edit: I meant benchmark as in 'bottom end of where it could perform', not as in an actual FPS Benchmark.
November 10, 2009 11:20:03 PM

GT300 "Fermi" Architecture GPU Specifications

* 3.0 billion transistors
* 40nm GPU by TSMC
* 384-bit memory interface (6x64-bit memory controllers)
* 512 shader cores (renamed to CUDA Cores)
* 32 CUDA cores per shader cluster
* 1MB L1 cache memory [divided into 16KB Cache - Shared Memory]
* 768KB L2 unified cache memory
* Up to 6GB GDDR5 memory (1.5GB for GeForce and up to 6GB for Quadro/Tesla)
* Half Speed IEEE 754 Double Precision
* 16 Streaming Multiprocessors (new name for the former Shader Cluster) containing 32 cores each

that 1MB L1 and 768KB L2 - if thats not cache, then what is it?

and i never said that the gpu will take over from cpus, its just that waiting 4 hours for an i7 to convert a video is not exactly attractive when a 5750/gts250 can do it in 1. the gpu is being used to help the cpu, not replace it. and nvidia and intel and amd are pumping huge amounts of money into this application.
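As a rough sanity check on the spec list above, theoretical throughput can be sketched from the core count alone. The shader clock below is an assumed figure, since Fermi clocks had not been announced:

```python
# Back-of-envelope theoretical throughput from the GT300 spec list above.
cuda_cores = 16 * 32        # 16 streaming multiprocessors x 32 CUDA cores = 512
flops_per_core = 2          # one fused multiply-add per clock = 2 FLOPs
shader_clock_ghz = 1.5      # ASSUMED clock; real Fermi clocks were not public

sp_gflops = cuda_cores * flops_per_core * shader_clock_ghz
dp_gflops = sp_gflops / 2   # the spec lists half-speed double precision

print(f"SP: {sp_gflops:.0f} GFLOPS, DP: {dp_gflops:.0f} GFLOPS")
# -> SP: 1536 GFLOPS, DP: 768 GFLOPS
```

Peak FLOPS like these never translate directly into frame rates, which is exactly why specs alone settle nothing in this thread.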
November 10, 2009 11:32:01 PM

EXT64 said:
Well, even Fermi we have some benchmark, it should be faster than their last card, the GTX285. As for Intel, all we have to compare to is their IGP, which beating that would be no great achievement.

Edit: I meant benchmark as in 'bottom end of where it could perform', not as in an actual FPS Benchmark.

Which is true, and only further shows the OPs total neglect of fact, and a propensity towards branding.
All the specs truly mean nothing, as we dont know how they perform. Itll all be done in SW, which will be an advantage possibly for gpgpu usage, if CUDA doesnt take off well, but for games? Nothing there, sorry. Again, at least we have a base for Fermi, nothing for LRB for gaming, at all
November 10, 2009 11:40:14 PM

So, we have a card that no one knows when itll come out, if the drivers are mature or even work for the game you play, no history of perf at all, no die size, power draw, being done at Intel instead of TSMC, no knowledge of cores, if theyll perform or not, if certain games cause problems for them or not, yet you, you say its a monster? Please understand, youre playing to the wrong crowd here.
And if its delayed too long, itll have to face refresh parts, which it will anyways, before it has much chance to take off anyways, then itll be a new process for the gpus before Intel can make any changes, then itll be HKMG, which will be like another process for free, and of course, therell be improvements all along the way, and maybe by then, well see the second iteration of LRB. Like I said, give it a rest, lets all wait
November 11, 2009 12:07:01 AM

Larrabee will not be released in January 2010 lol.

How can anybody believe that? We have seen nothing except a tiny insignificant demo at IDF. No leaks, no benchmarks, no idea of what the performance is like, except in ancient games like Quake where I heard it isn't bad actually!

We've been getting leaked clarkdale benches for how long now? 4 months+? Yet we still haven't seen a single leak of larrabee, still no clue of its performance. 32nm? You gotta be kidding me.

I'll say he is half correct though. It should be out in January, just in 2011 not 2010.
November 11, 2009 12:26:10 AM

Please tell me someone else realizes Cheesesubs has no idea whatsoever about what he's saying! this is killing me :cry: 
November 11, 2009 1:32:57 AM

yannifb said:
Please tell me someone else realizes Cheesesubs has no idea whatsoever about what he's saying! this is killing me :cry: 


I would bet that cheesesubs has no idea whatsoever about what anyone's saying.
November 11, 2009 1:52:54 AM

Dekasav said:
I would bet that cheesesubs has no idea whatsoever about what anyone's saying.


Indeed.... indeed.

Anyway back to GPUs!

Honestly I dont know what some of you are talking about when you say the GTX 295 is soooo much better than the 5870... I have one and honestly im kind of regretting it (partly cause i dont have an SLI board and because i want Eyefinity)... but its not too late to fix that - anyone want my 295 ;) 
November 11, 2009 4:40:03 AM

B-Unit said:
IDK guys, he may have a point...

Just think how poor nV would look if Lrb beat Fermi to market.


Would that be more or less poor than if they held up a mock-up card with wood-screws during a product announcement at a conference and claimed 'this is it' ? :lol: 