Interesting things about the 5xxx series

August 16, 2009 3:23:38 AM

From here
http://www.pcper.com/comments.php?nid=7641
•Battleforge DirectX 10/DirectX 10.1 performance improves by up to 50%, with the largest gains in configurations using ATI CrossFireX™ technology.
•Company of Heroes DirectX 10 performance improves by up to 77%.

•Crysis DirectX 10 performance with ATI CrossFireX technology improves by up to 10% in dual mode and up to 34% in quad mode.

•Crysis Warhead DirectX 10 performance with ATI CrossFireX technology improves by up to 7% in dual mode and up to 69% in quad mode.
•Far Cry 2 DirectX 10 performance with ATI CrossFireX technology improves by up to 50% in dual mode and up to 88% in quad mode.

•Tom Clancy’s H.A.W.X. DirectX 10/DirectX 10.1 performance with ATI CrossFireX technology improves by up to 40% in dual mode and up to 60% in quad mode.

•Unigine Tropics OpenGL performance improves by up to 20%.

•Unigine Tropics DirectX 10 performance with ATI CrossFireX technology in quad mode improves by up to 20%.

•World in Conflict DirectX 10 performance improves by up to 10%.
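For anyone wondering what those "up to X%" figures mean in practice, here's a minimal sketch (Python) that applies a claimed best-case gain to a baseline frame rate. The baseline FPS numbers are hypothetical, purely for illustration; only the percentages come from the release notes quoted above.

Code:
# Rough sketch: what a claimed "up to X%" driver gain means for frame rate.
# Baseline FPS values are hypothetical examples, NOT benchmark data;
# the uplift percentages are the "up to" figures quoted above.

claimed_uplift = {          # game -> maximum claimed gain (as a fraction)
    "Company of Heroes (DX10)": 0.77,
    "Far Cry 2 (DX10, quad CrossFireX)": 0.88,
    "World in Conflict (DX10)": 0.10,
}

hypothetical_baseline_fps = {   # made-up numbers, illustration only
    "Company of Heroes (DX10)": 40.0,
    "Far Cry 2 (DX10, quad CrossFireX)": 55.0,
    "World in Conflict (DX10)": 60.0,
}

for game, gain in claimed_uplift.items():
    before = hypothetical_baseline_fps[game]
    after = before * (1.0 + gain)   # best case: the full "up to" gain is realized
    print(f"{game}: {before:.0f} fps -> {after:.0f} fps (+{gain:.0%})")

Remember, "up to" means best case; typical gains will be smaller.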


And more info here
http://www.legitreviews.com/article/1040/1/
And a picture of the card

Rumor has it the top card has 4 cores; imagine that.
August 16, 2009 3:49:27 AM

These improvements are in comparison to what?
August 16, 2009 4:05:50 AM

The previous driver release.
These new drivers were released yesterday.
They help immensely, with major increases, including quadfire.
August 16, 2009 5:28:29 AM

Don't forget tri-fire, JayD, the ugly stepsister!
August 16, 2009 5:35:43 AM

Or possibly a forced TSMC monster.
August 16, 2009 5:55:08 AM

I gotta point out that the hot-air exhaust on that card is puny; it reminds me of the 9800GX2, which dumped a lot of its heat inside the case...
August 16, 2009 6:28:53 AM

I'm having a hard time comprehending 4 cores on one GPU, lol. That sounds great, but we heard those rumors about 4 cores months ago. I wouldn't doubt it. AMD has said their highest-end chip would come later, of course. Can't wait!
August 16, 2009 11:10:04 AM

If these release in September, then these are all-new drivers? Fair enough.
Having a triple link is odd, and having only half a slot for air cooling is even more so; add in the length of the card and all the rumors from the last 2 years, and it sounds OK too.
August 16, 2009 11:17:06 AM

I'm not making claims, but there are some strange things here is all.
August 16, 2009 2:09:05 PM

JAYDEEJOHN said:
I'm not making claims, but there are some strange things here is all.


Definitely. Like, if that is the 4-GPU card, you're telling me that little exhaust port is going to get rid of the heat from four GPUs?

It probably couldn't handle one GPU, for that matter, and why use all those ports on the back?
August 16, 2009 6:56:29 PM

If you look closely at the picture, the card looks like a two-PCB card like the GTX X2; the vents above (and below, for that matter) the card are not for the card but for the case.
August 16, 2009 9:57:33 PM

This isn't the top card. It's the same one ChipHell posted a pic of, and the one shown during the early DX11 demos.
Speculation is, if this is a multi-GPU card, it's only a dual, at around GTX 295 performance, as rumor has it.
Take two 4890s at 40nm and dumb them down a bit.
August 16, 2009 10:49:41 PM

Impressive numbers. Can't wait to see how this pans out, hopefully in a matter of only a month or less.
August 16, 2009 11:00:05 PM

The ones you're getting with your X2, I assume?
August 16, 2009 11:14:09 PM

Look at the top of the card, JD; you can see a PCB at the top of the card. Why, if it's not a dual-PCB card?
August 16, 2009 11:23:14 PM

^^ They would if it had 4 cores.
August 16, 2009 11:30:39 PM

Well, this is all speculation, but the picture obviously has something over the top of the card, hiding something.

I dunno though, it really doesn't look like 2 PCBs, but then again, people are saying this isn't the top model either.
Who knows? All I'm saying is, the rumor's still alive and well this late in the game.
August 16, 2009 11:36:00 PM

If you look at the photo, it's split, with a seam running the length of the card; the lower half is heavily grayed out by the ChipHell watermark and is a different color/texture.
August 16, 2009 11:39:39 PM

I'm wrong; if it were a dual PCB, it would be at the bottom, not the top.
August 16, 2009 11:40:32 PM

That's true, if it's the same old approach. There's tons of speculation that DX11 has actually helped with sharing more than just shader tasks, maybe making things easier for memory etc. as well.
I don't and won't get carried away with this, but until we know for sure, I'm not eliminating it either.
Last launch, these rumors were dispelled waaaay earlier before the launch, and yet this time they still persist.
It's a possible option.
August 17, 2009 6:37:42 AM

Since this thread is a rumor thread and there are no hard links, another rumor going around is that the NDA holds through September and the launch won't be until the 2nd week of October. But again, it's a rumor, based on the invite-only Sept 10th posting we've seen, meaning press mainly, with a few others.
Either way, it's close.
August 17, 2009 2:36:54 PM

Quote:
22" long :lol: 

No way that is fitting in my case.



It's like...the Ron Jeremy of video cards!
August 17, 2009 2:37:28 PM

22" is for sure mistranslated. Probably 22 cm or about 9".

Anyway , does anyone knows if there are final specs published somewhere and if so provide link, or everything is in the gray area still?

Thanks
August 17, 2009 2:45:51 PM

There are no final specs anywhere yet. Soon enough, though...
August 17, 2009 6:54:18 PM

If reviewers already have cards, and the NDA is up a month before they come out, odds are there should be sufficient stock at launch, which is always a good thing (having cards now means a lot of production time before launch).
August 17, 2009 10:57:20 PM

That little tessellation demo was awesome. The character went from unimpressive to an awesome model. It could be a nasty performance hit, but it's definitely something very easy to see that will impact realism and the overall looks of a game very quickly.
August 17, 2009 11:24:08 PM

My thoughts are, it'll hold off ray tracing for a while, as it develops more usage and better performance and design.
August 18, 2009 8:55:38 AM

OK, more findings for more rumors, heheh.
The findings are (no link, I lost it, but it's true): this gen of cards is the most radical change ATI has made since R600.
OK, rumors: clocks are up. What does this mean? Core clocks are expected to be somewhat higher with the shrink, but possibly shader clocks are no longer linked to the core?
August 18, 2009 5:50:06 PM

Looking at it, going by the overhang from the PCIe slot, it won't even be close to fitting in my case, and it's not like I have a small case either.

Mactronix
August 18, 2009 6:15:14 PM

I haven't seen anyone actually try to break down its length, but I'm sure it's not the longest ever made; maybe tied, maybe with the 4850X2 in length?
August 18, 2009 10:35:36 PM

Radeon "7" Family (Evergreen) is 6 parts:
Trillian = "R800" + either multiple montior or 3-way GPU
Hemlock = "R800" Dual GPU
Cypress = "RV870" Single GPU
Juniper = "RV840"
Redwood = "RV830"
Cedar = "RV810"
Press date: Sept 10th, 2009; event location: San Francisco, USA?
Hard release: Sept 24th. --> mid Nov. 2009
Price range: $500 (high-end) --> $50 (low-end Cedar)

Cypress (rv870) single gpu "HD5870" is:
HD2900XT PCB size
40nm
353mm^2 die size
19mm x 19mm at cost of $34 per die
2x 6pin PCI_E power connectors
DX11
1600SP
256bit bus
80TMU
32ROP
1024/2048mb mem configs
SIMD : 20
Shader Clock : 850 ???
Memory Clock : 1200 ???
Bandwidth : 153GB/s ???
~$350 MSRP
3dmark vantage performance:
Cypress ~P16xxx - P17xxx - P18xxx
aka ~HD4870X2 performance level
potentially a ~30% performance improvement in the same power profile vs. RV770

Hemlock (R800) dual gpu "HD5870x2" is:
larger than 4870x2 single PCB
40nm 353mm^2 die x 2
19mm x 19mm at cost of $34 per die
1 6pin + 1 8pin PCI_E power connectors
DX11
3200SP
256bit bus x 2
80TMU x 2
32ROP x 2
1024/2048mb x 2 mem configs ???
~$500 MSRP
3dmark vantage performance:
(2x Cypress) theoretical P34xxx ???

Juniper/Redwood HD5850, HD5830, HD5770?
Partial shroud HSF design
40nm
181mm^2 die size
14mm x 14mm die
1 6pin PCI_E power connector
DX11
800SP on Redwood
640SP on Juniper
128bit bus
40TMU
16ROP
512/1024mb mem configs???
~$199/$149 MSRP
3dmark vantage performance:
Juniper XT ~P95xx
Redwood ~P46xx
About the speed of an HD4870 and an HD3870, respectively

http://www.xtremesystems.org/forums/showthread.php?t=22...
May be true, who knows, but some rumors from a Chinese site said ATI wasn't able to do something because of TSMC's 40nm process; that something now appears to have been a 4-core card, so they settled for 3?
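As a sanity check, the rumored Cypress figures above hang together arithmetically. Here's a minimal sketch (Python) that runs them through the standard formulas, assuming the 1200 MHz memory clock is a GDDR5 base clock (quad-pumped, so 4.8 GT/s effective) and using the usual candidate-dies-per-wafer approximation; the wafer math ignores yield entirely, so treat it as back-of-the-envelope only.

Code:
import math

# Sanity-check the rumored Cypress figures with standard formulas.
# All inputs are the rumors quoted above; nothing here is confirmed.

# Memory bandwidth: 256-bit bus, 1200 MHz GDDR5 (quad-pumped -> 4.8 GT/s)
bus_width_bits = 256
mem_clock_mhz = 1200
effective_rate_gtps = mem_clock_mhz * 4 / 1000        # GDDR5 transfers 4x per clock
bandwidth_gb_s = (bus_width_bits / 8) * effective_rate_gtps
print(f"Memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~153.6 GB/s, matching the rumor

# Die area: 19 mm x 19 mm vs the quoted 353 mm^2
die_w_mm = die_h_mm = 19.0
die_area_mm2 = die_w_mm * die_h_mm
print(f"Die area: {die_area_mm2:.0f} mm^2 (rumor says 353 mm^2, close enough)")

# Rough candidate dies per 300 mm wafer (ignores defects and yield)
wafer_d_mm = 300.0
dies_per_wafer = (math.pi * wafer_d_mm**2 / (4 * die_area_mm2)
                  - math.pi * wafer_d_mm / math.sqrt(2 * die_area_mm2))
print(f"Candidate dies per wafer: ~{dies_per_wafer:.0f}")
print(f"Implied wafer value at $34/die: ~${dies_per_wafer * 34:.0f} before yield loss")

The ~153.6 GB/s result is also why the "Gbps" in the original rumor list almost certainly means GB/s.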
August 19, 2009 12:27:25 AM

So a single 5870 = 4890 CF :D
I was just about to purchase the CF setup, but now I gotta wait and see if 5870 CF will be priced the same as 4890 CF.
August 19, 2009 8:47:18 AM

The more info I see, the more it's starting to look like a non-event for me. Of course, it still remains to be seen what's actually what, but it's going to come down to what the Juniper/Redwood specs actually are. The other cards look to be either too big or too power-hungry to be of interest to me, and I can see many feeling the same way. Sure, as an upgrade from a lesser card they would make perfect sense, but for someone with a 4-series card already it could turn out to be a pointless exercise.
I'm coming from a price/power/performance point of view here; sure, a lot of you guys, true enthusiasts I guess :) are quite happy to swap out a GPU to get the latest and greatest, but I need to see a solid reason for doing so.
As I have said before, if we end up with parts that are basically the 4 series with DX11, then as far as the mainstream goes they missed the boat. It depends what DX11 does, though, of course.
Those better cards sure look like they are going to make things interesting at the top end, though; I'm still looking forward to the launch and seeing what they can do. I just think I may sit this round out and wait for a refresh/respin before I join in.

Mactronix
August 19, 2009 8:58:01 AM

Well, as said earlier, the 4670's counterpart this gen will have 4870 performance, so I guess it all depends, as usual.
The 5830 should be able to handle almost any game out, much like the 4870 can now.
One thing I've seen is that there have been price hikes for that last killing before the old cards dip in price. As the new ones come in high and later tail off in price, the old cards can do the same.
August 19, 2009 8:58:51 AM

September 10th is most likely just a PR event. I am hearing second hand that the 24th may be a much more interesting date for those of us interested.
August 19, 2009 9:18:02 AM

I've been thinking about what you've said, Mac, but it goes against the grain of everything I've been reading, even stuff straight from ATI, in that this gen of cards has the most changes since the R600 series. So I guess if you consider that just a tack-on with few improvements, and think that, despite the confidence in ATI and what they've shown, most everyone doesn't want much better performance, maybe you're right, but I'm reading it all differently.
August 19, 2009 10:02:52 AM

As I said, JD, from a strictly mainstream price/power/performance point of view, IF the Juniper/Redwood parts come out as little more than what we have now but with DX11, then I can't see the mass appeal for the mainstream. Cypress on spec looks like a brilliant card, but where I'm coming from is this.
Mainstream machines, to my mind, would mostly need a CPU and probably in most cases a PSU upgrade to support Cypress properly; the 2x PCIe cables and X2-level performance are where I'm getting this from. There are so many posts regarding systems with a decent Core 2 and a single power cable where we recommend a 4850/4770 etc.
Those systems would then need to look at either upgrading those parts of the PC or at the Juniper/Redwood parts. Now, I'm not saying it's a non-starter, but it really would depend on price/performance when deciding if it's worth it or not. Personally, I'm not about to upgrade to a 4870, so what I'm saying is: why should I get a different card that performs the same? DX11? Well, we need to see what that actually brings first.
It really does (for me) depend on what they can actually do when released and tested.

Mactronix
August 19, 2009 12:29:48 PM

I don't know about you guys, but I am excited, just because it seems this time ATI prepared a whole line of cards across the whole spectrum. This will probably allow over 80% of new buyers to select a card suiting their demands but with DX11 capability, and that will be the real blow to nVidia with the launch of Win 7.

Yes, sure, the enthusiasts will be happy with the new Radeons, and the mainstream will be happy. That's only one part of the story. The bigger, more important part for ATI is continuing to grab share from nVidia while they have the advantage. nVidia can be hurt badly even for a few months only, as with every new Windows release there is a huge bump in sales. And Win 7 is the most anticipated since XP, probably.

Can ATI turn the tables? I think it's possible. If they play their cards wisely, they can be the dictators for the next few years. But they should be careful; no one even knows what GT300 is going to look like.

1600 SPs comes as a sweet surprise for me, as I didn't expect them to double the SPs. More than doubling the TMUs and ROPs is also a best-case scenario. My expectation is for the 5870 to be about 2x the perf of a 4890 (maybe even more in DX11 titles).

If that is true, then we will see a HUGE jump in performance, which will give developers room for more advanced, cool-looking games in the future, and let's hope the NEAR future :)
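For a rough feel of where that "about 2x a 4890" expectation comes from, here's a minimal sketch (Python) comparing theoretical shader throughput. The HD 4890 numbers (800 SPs at 850 MHz) are its shipping specs; the 5870 numbers are just the rumored 1600 SPs at an assumed ~850 MHz core clock.

Code:
# Theoretical single-precision throughput for AMD's VLIW5 shader architecture:
# each stream processor issues one multiply-add (2 FLOPs) per clock.

def theoretical_gflops(stream_processors: int, core_clock_mhz: int) -> float:
    """2 FLOPs (one MAD) per SP per clock."""
    return stream_processors * 2 * core_clock_mhz / 1000

hd4890 = theoretical_gflops(800, 850)          # known shipping spec
rumored_5870 = theoretical_gflops(1600, 850)   # rumor: 1600 SPs @ ~850 MHz (assumed)

print(f"HD 4890:      {hd4890:.0f} GFLOPS")        # ~1360
print(f"Rumored 5870: {rumored_5870:.0f} GFLOPS")  # ~2720
print(f"Ratio:        {rumored_5870 / hd4890:.1f}x")

Paper FLOPS doubling doesn't guarantee 2x in games, of course; memory bandwidth (still a 256-bit bus in the rumor) and drivers will decide how much of that actually shows up.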
August 19, 2009 12:56:10 PM

That's all well and good, but just recently quite a few devs have actually come out and said that they don't want to be coding to a specific API, such as DX11. Crytek even said that there may not even be a "next gen" per se as far as PC gaming is concerned, and gave the slow development of next-gen consoles as the reason, plus the fact that the Wii isn't exactly groundbreaking.
It's shaping up like a pissing contest between the devs and the hardware/software companies at the minute, with my personal feeling being that the devs will bloody well code to whatever the most prevalent platform is regardless, as money is king. Oh, they may well want to code in C++ or whatever it is, but if MS still has the share and DX-whatever is in most machines, then that's what they will code to, I would have thought.

Mactronix
August 19, 2009 6:20:41 PM

Here's the point: if you DON'T buy the hardware, you're sort of handing those devs the win, right? I mean, if DX10.1 had been adopted right away by nVidia, don't you think there would have been a lot more DX10.1 titles out by now?
Same goes for DX10, if more people had migrated to Vista.

So, in my mind, you support this notion when you don't buy DX11. It's a chicken-and-egg thing, and we all know that hardware drives us forwards.
How many people have come on here saying XP still has x amount of the market, and there are still too many people that own DX9-only hardware, etc. etc., when we all know a lot of those numbers are distorted because of Intel's crap IGP, which I've been very vocal about, for obvious reasons.

If a mid card of the new gen equals a top card of the last gen, that's the normal case of progression, and if it's not enough growth for you, I understand, but adding in Crytek and Tim the weeny Sweeney doesn't make sense.
I got shot down a while back in the CPU section, when there were reports Intel was going to devs to help them work with its IGP for games... in 2012. The same thing for the consoles, 2012; the same for Crytek, 2012. Now who's REALLY behind this? Maybe that blue giant that's bringing out a new "concept" GPU next year, and has been saying all along that the first gen won't be the good one, it'll be the second gen that owns.
I bet my left @#$ the second-gen LRB will be out... 2012.
It's interesting that so many backers have just dropped games lately, you notice that? And the ones that haven't dropped them are pushing them back.
There's money talking here, and the devs have been on the short end.
Hollywood has continued to buy into the gaming industry; you know Hollywood, the same guys that brought us DRM?
We have 2 huge factions muscling their way into gaming, PC and consoles, with the devs stuck in between and the consumer left out in the cold.
My hope is that someone bucks the trend, just makes a few decent games on a budget, opts out of the "help", and won't sell out.
August 19, 2009 7:57:01 PM

Quote:
How many people have come on here saying XP still has x amount of the market, and there are still too many people that own DX9-only hardware, etc. etc., when we all know a lot of those numbers are distorted because of Intel's crap IGP, which I've been very vocal about, for obvious reasons


I wonder who he could be referring to... :D

Quote:

We have 2 huge factions muscling their way into gaming, PC and consoles, with the devs stuck in between and the consumer left out in the cold.
My hope is that someone bucks the trend, just makes a few decent games on a budget, opts out of the "help", and won't sell out.


Stardock does this. That being said, it needs to exist in order to drive development of new GPU technologies (for the next wave of consoles).

As for the devs, they will code for the easiest platform that is available. Right now, that's the 360. Also remember, consoles have only a few defined setups that need to be tested, while for PCs there are many more configurations to test, hence it makes more sense to code for the console first and then see what changes need to be made for PC optimization.
August 19, 2009 8:03:28 PM

It simply comes down to engines, and as for your fav company, nVidia is saying 10% higher perf in Win 7 over XP.
If your engine is already DX10, you're more than halfway there already.
The article from Guru I was critical of, saying there's no difference between Win 7 and Vista, missed the point, as I said. It's not between Vista and Win 7; it's between those 2 and XP, where XP is falling behind.
August 19, 2009 8:10:09 PM

"As a bonus tidbit, it adds that on Windows 7 the SLI multi-GPU technology works 10% faster than on Windows XP."
http://www.techpowerup.com/index.php?101944
So, where ya gonna hide, whatcha gonna do, XP? Heheh.
August 19, 2009 8:21:29 PM

JAYDEEJOHN said:
Here's the point: if you DON'T buy the hardware, you're sort of handing those devs the win, right? I mean, if DX10.1 had been adopted right away by nVidia, don't you think there would have been a lot more DX10.1 titles out by now?
Same goes for DX10, if more people had migrated to Vista.

So, in my mind, you support this notion when you don't buy DX11. It's a chicken-and-egg thing, and we all know that hardware drives us forwards.
How many people have come on here saying XP still has x amount of the market, and there are still too many people that own DX9-only hardware, etc. etc., when we all know a lot of those numbers are distorted because of Intel's crap IGP, which I've been very vocal about, for obvious reasons.

If a mid card of the new gen equals a top card of the last gen, that's the normal case of progression, and if it's not enough growth for you, I understand, but adding in Crytek and Tim the weeny Sweeney doesn't make sense.
I got shot down a while back in the CPU section, when there were reports Intel was going to devs to help them work with its IGP for games... in 2012. The same thing for the consoles, 2012; the same for Crytek, 2012. Now who's REALLY behind this? Maybe that blue giant that's bringing out a new "concept" GPU next year, and has been saying all along that the first gen won't be the good one, it'll be the second gen that owns.
I bet my left @#$ the second-gen LRB will be out... 2012.
It's interesting that so many backers have just dropped games lately, you notice that? And the ones that haven't dropped them are pushing them back.
There's money talking here, and the devs have been on the short end.
Hollywood has continued to buy into the gaming industry; you know Hollywood, the same guys that brought us DRM?
We have 2 huge factions muscling their way into gaming, PC and consoles, with the devs stuck in between and the consumer left out in the cold.
My hope is that someone bucks the trend, just makes a few decent games on a budget, opts out of the "help", and won't sell out.


Hang on a minute there, JD. I'm assuming this passage is regarding my last post, as no one else mentioned or inferred these guys?

" If a mid card new gen = a top card last gen, thats the normal case of progression, and if its not enough growth for you I understand, but adding in Crytek and Tim the weeny sweeney doesnt make sense."

I would like to point out that the post was in reply to this, posted by rawsteel.

" if that is true, then we will see a HUGE jump in performance which will give developers room for more advanced , cool looking games in the future, and lets hope NEAR future "

The point is that it's a fact, pure and simple, that games are coded for the mass market, which is DX9. Don't get me wrong, I want progression, but we won't get it until, as you say, someone takes a chance; the odd game coded with a few DX10+ code paths, while welcome, won't amount to much where it matters.
What's really needed is for MS to bankroll a few, maybe half a dozen, full-on DX11 games, in much the same way I expect Intel to bankroll LRB-based games; well, they did buy a development house, didn't they?
Also, I didn't say I wouldn't be getting one; I just said it very much depends on what the performance actually is and what DX11 actually ends up being, and I guess I should also have added how well DX11 is supported game-wise.
There is definitely something afoot, but as I said elsewhere, the devs have no clout, and I don't know what planet Tim Sweeney is on if he thinks for one minute the devs have any say one way or another about where we end up.
Mactronix