5850: GTX 285 killer?

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
Will you buy this, or wait for the GTX 300 series? I don't think the GTX 360 will be able to match these specs, especially with DX11 out soon. Going by last generation's trend, the 4850 was nowhere near the GTX 260; now the "50" variant bests even the GTX 285, the single-card monster. So we will see what an X2 card can do against a single card.

Nvidia conceded in a recent interview that DX11 won't matter as much for gaming and graphics as DX10 did, and is aimed more at CUDA-like apps and sharing GPU power over a network. They also said that the features DX11 brings to ATI and the new 5870 are overhyped.
 

Jaysin

Distinguished
Feb 21, 2009
234
0
18,710
Do you want to wait until January/February for GT300? If you do, feel free. If you want the best value in current-gen hardware available now, get a 5850.
 
As you can clearly see, DX11 gives vastly improved framerates over DX10 on the same hardware. :whistle:
[attachment: 20218.png]
 
It's hard to judge DX11 yet, as all we have are hacked-up games and demos, so as mousemonkey posted, it has yet to prove its value. If you aren't in a rush to upgrade, just sit back and enjoy the show like I am. By the time the Nvidia cards come out we may even have some real DX11 games (ones that use more of its features) and can judge it better.
 

knotknut

Distinguished
Aug 13, 2007
1,218
1
19,310
HD 5850: GTX 285 killer?

If Nvidia lowered the price to, say, $50 less than the HD 5850, then the GTX 285 lives.

But a GTX 285 for $100 more than the HD 5850? You'd have to be on smack, Jack.
 
Yeah, they're a really good buy these days. I was even looking into a second 4850 1GB for CrossFire, but the fact that I don't really need it spoiled the idea. Performance-per-dollar, though, that route is hard to beat if you already have one card and a CF board.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
Still, this is a trend-breaker; the 5850 shouldn't be able to beat Nvidia's best single-card config. It's like saying the 3850, when it first came out to counter the 8800 GT, was going to be better than the 8800 Ultra. Not even the 4850 beat the 8800 Ultra/9800 GTX+; they were roughly equal, and that's with the G92 architecture already established and aging.

This time the 5850 and 5870 just blow the fairly recent GTX 200 series out of the water, and we haven't even seen a 5870 X2 or a 5890 yet. Nvidia needs to catch up and not be content with its position.

It would also be nice to see comparisons beyond frame-rate charts; maybe color quality and image quality in games as well. I've heard Nvidia has more "realistic" colors than ATI cards... can anyone confirm?
 
I haven't heard that before, but I have seen recent tests of AA and the like, and the 5800s are theoretically better. The problem with comparing colors is that they can be manipulated in so many places (control panel, Windows, in-game, on your monitor) that I'm not sure any difference would be clear.
 


Stop doing drugs, the colours are fine on both.
 

wh3resmycar

Distinguished


You calibrate your display differently depending on the brand of card you have. Well, unless you have a microscopic eye that can differentiate billions of shades of red, green, and blue, which I doubt you do.

TGGA's avatar looked the same to me when I had an ATI card as it does now that I have an Nvidia, so your term "realistic" is a little misjudged.
 

ThemNuts

Distinguished
Oct 7, 2009
1
0
18,510


He is actually right... Run the same game on the same system twice, changing only the graphics card.
Nvidia and AMD will look loads different: trees, sky, colors, etc. will all have a different look.
It has to do with the way each manufacturer renders the graphics. This is only with in-game footage, though; Photoshop, images, and video will look practically the same.
 

Harrisson

Distinguished
Jan 3, 2007
506
0
18,990

That's because the default settings are different; you can make an ATI card "look" like a GeForce, or vice versa.
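
To illustrate the point: the "color difference" is just a post-processing ramp applied to the same rendered frame, so compensating for one card's ramp makes the outputs match. A toy Python sketch (the gamma values are made up for illustration, not real driver defaults):

```python
import numpy as np

frame = np.random.rand(4, 4, 3)   # stand-in for a rendered frame, values in [0, 1]

def apply_gamma(img, gamma):
    """Per-channel gamma ramp, the kind of curve a driver control panel exposes."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)

card_a = apply_gamma(frame, 2.2)  # hypothetical vendor-A default
card_b = apply_gamma(frame, 2.4)  # hypothetical vendor-B default

# "Calibrate" card B toward card A: undo its 2.4 ramp, then apply A's 2.2 ramp
card_b_matched = apply_gamma(card_b ** 2.4, 2.2)

print(np.allclose(card_a, card_b_matched))  # True: same frame, settings matched
```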
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
Even with full AA and AF? I've seen some differences mentioned on Tom's when they ran Resident Evil 5 comparisons. In fact, they should do more of that: use GIF files to compare two identical frames from the same game to judge Nvidia's quality vs. ATI's. Of course, whether we can tell the difference also depends on how good our screens, color settings, and GPUs are.

They should add side-by-side comparison shots or videos like the ones uploaded to YouTube.
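
That kind of comparison is easy to script, for what it's worth. A rough Python sketch (the file names are hypothetical) that diffs two captures of the same frame and saves an amplified difference map:

```python
import numpy as np
from PIL import Image

# Two screenshots of the exact same frame, one per card (hypothetical files)
a = np.asarray(Image.open("frame_nvidia.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("frame_ati.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)   # per-pixel, per-channel absolute difference
print("mean abs difference:", diff.mean())
print("pixels that differ: %.2f%%" % (100.0 * (diff.max(axis=2) > 0).mean()))

# Amplify the difference so small shifts become visible to the eye
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("diff_map.png")
```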
 
Yeah, he sorta alluded to being a tad sneaky there.
I've even seen numbers higher than 25%, up in the 30% range. It'll vary according to setup, and later according to each game: how well the devs did, how many features are used and how heavily, and where you are in each game.
 

cheesesubs

Distinguished
Oct 8, 2009
459
0
18,790
Instead of being satisfied with pwning last-gen cards, they should focus more on the incoming challenge: Larrabee is on the way. A quad-core Larrabee would be a dreadful threat to both AMD and Nvidia. If a single-core Larrabee can match up with a GTS 250, a quad-core would surely beat the crap out of a GTX 295 and a Radeon 5870. But all of this hinges on the uncertain release of G300: if G300 is delayed and releases after Larrabee, Nvidia will face its doom... and AMD will be in trouble as well, since a dual-core Larrabee would murder the 5850 and GTX 285.

Intel is still the big evil behind the door...
 

yannifb

Distinguished
Jun 25, 2009
1,106
2
19,310


He is showing that it gives a 5% decrease...

 


In late 2010 at best (more likely 2011), and by then a totally new architecture from ATI is expected. There's no point focusing on Larrabee competition right now, which is what nV may find out the hard way if they've compromised performance elsewhere to compete against a part that is still at least a year away.
 


Uh, no, it's actually showing a major benefit of enabling SSAO in DX11 mode versus DX10 mode.

With SSAO disabled, the DX10 and DX11 results are the same; with it enabled, the frame-rate drop is far smaller in DX11 mode.
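
For anyone wondering what SSAO actually computes per pixel, here's a toy CPU sketch (my own illustration, not either vendor's or the game's implementation): each pixel samples nearby depth values and darkens when closer neighbors occlude it. That inner sampling loop is the expensive part, and it's the kind of work DX11-era features like Gather and compute shaders make cheaper, which is why the cost of turning SSAO on is smaller in DX11 mode.

```python
import numpy as np

def toy_ssao(depth, radius=3, n_samples=8, bias=0.01):
    """Simplified screen-space ambient occlusion over a depth buffer."""
    rng = np.random.default_rng(0)
    # Random 2D sample offsets inside the kernel radius (hypothetical kernel)
    offsets = rng.integers(-radius, radius + 1, size=(n_samples, 2))
    occlusion = np.zeros(depth.shape)
    for dy, dx in offsets:
        # Shift the whole depth buffer instead of looping over pixels
        # (edges wrap around here; a real shader would clamp instead)
        sample = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
        # A neighbor noticeably closer to the camera occludes this pixel
        occlusion += (sample < depth - bias).astype(float)
    return 1.0 - occlusion / n_samples   # ambient term: 1.0 = fully lit

depth = np.random.rand(480, 640)          # stand-in depth buffer
ao = toy_ssao(depth)
print(ao.shape, ao.min(), ao.max())
```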

 
So, in essence, free eye candy, and that's what we want, right?
You almost always can't have both, unless the devs screwed the pooch somewhere along the line in the older version.
This is the thing: if we want better eye candy, then we need powerful enough cards, or better DX models to allow for it, even to lesser extents on lower cards, so the whole spectrum is covered.

Everyone wants a Crysis killer and wants all its eye candy as well. Again, if Crysis were DX11, a lot of what's in Crysis would run much better on the new cards, but it's not.
The groundwork is being laid for this, with DX10-and-up engines and DX10-and-up games being made.
Once that happens things will improve; it's just the transition we're going through now that slows the progress, since people are still using old OSes and old cards.
 

yannifb

Distinguished
Jun 25, 2009
1,106
2
19,310


Ohhhh, I see. Oops, sorry :na: