AMD Announces R9 285 Graphics Card At Live Event

Tags:
  • Graphics Cards
  • AMD
  • Components
August 23, 2014 9:50:42 AM

AMD announced the R9 285 at its "30 Years of Gaming and Graphics" event.


August 23, 2014 10:41:45 AM

"2GB GDDR5". The R9 280 comes with 3GB. I'm also seeing the basic R9 280 going for $215 on Newegg ($185 with a rebate). I'm eager to see the reviews, but this looks like a good time to pick up an R9 280 if you're shopping in that price segment.
August 23, 2014 10:49:32 AM

HD 7970, 190-watt version...
Sounds awesome...
Still need to wait for reviews of this card to know how awesome it really is...
August 23, 2014 11:35:21 AM

Only thing I see that's different is the support for DirectX 12.
August 23, 2014 12:21:13 PM

Just get a decent power supply, stop worrying about wattage, and enjoy the sale prices on the 280s. You'll probably save more from the deals than you'd ever save on your power bill. Just my opinion, though.
August 23, 2014 12:49:08 PM

"On the other hand, the R9 285's 1375 MHz actual/5.5 GHz effective memory speed is notably faster than the 280's 1250 MHz/5.0 GHz specification, so net performance may be slightly faster."

Nope, that's not how memory bandwidth works. It's (bus width × effective GHz)/8, or equivalently (bus width × memory clock)/2 for GDDR5. Memory clocks usually run 1000-1750 MHz, so effective rates are 4-7 GHz; you're unlikely to see anything outside that range. It doesn't matter if the memory is 1.1x faster when the other card's bus is 1.5x as wide.

Quote:
HD 7970, 190-watt version...
Sounds awesome...
Still need to wait for reviews of this card to know how awesome it really is...


Nope, slightly slower than a 7950 if I'm not mistaken. The 7970 is 2048:128:32 @ ~925 MHz IIRC; this is 1792:112:32 @ 918 MHz. Not to mention the bandwidth difference: 2GB at 176 GB/s vs 3GB at 288 GB/s.
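That bandwidth formula can be sketched in a couple of lines of Python (a minimal sanity check; it assumes plain GDDR5, the 285's widely reported 256-bit bus, the 280's 384-bit bus, and the memory clocks quoted in this thread):

```python
def gddr5_bandwidth_gbps(bus_width_bits, memory_clock_mhz):
    """Peak GDDR5 bandwidth in GB/s.

    GDDR5 transfers 4 bits per pin per memory clock, so the effective
    data rate is 4x the memory clock; dividing by 8 converts bits to bytes.
    """
    effective_ghz = 4 * memory_clock_mhz / 1000
    return bus_width_bits * effective_ghz / 8

# R9 285: 256-bit bus @ 1375 MHz (5.5 GHz effective)
print(gddr5_bandwidth_gbps(256, 1375))   # 176.0
# R9 280: 384-bit bus @ 1250 MHz (5.0 GHz effective)
print(gddr5_bandwidth_gbps(384, 1250))   # 240.0
```

Which is the point being made: the 285's faster memory clock doesn't compensate for its narrower bus.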
August 23, 2014 1:25:24 PM

So, why isn't it called R9 275 if it is behind the 280 in every aspect?
August 23, 2014 2:43:45 PM

Power savings can lead to higher overclocks, no? I would wait for this over the 280 for precisely that possibility.
August 23, 2014 2:49:02 PM

Is this an improved architecture, or just another variation of GCN like what's found on the HD 7790, thus having TrueAudio and (I assume) actual FreeSync capability? That's the real question. If it's the former, it could very well be on par with the HD 7970 with fewer cores and lower power consumption.
August 23, 2014 3:28:44 PM

Need benchies... I'll hold off until then, since going by the specs alone, this looks to be under my 7970 at 1.1 GHz by a big margin.

Cheers!
August 23, 2014 4:12:46 PM

With a lower price and lower power consumption, this could be great for the Tonga GPU. It's the same architecture, and slightly ahead of the R9 280 in net performance.

So presumably there won't be that big a gap in overall performance between an R9 280 and a 285. :)
August 23, 2014 4:23:00 PM

Haha, looks like they were poking at Nvidia, which charged a huge premium for its first Maxwell card, while AMD is pricing this close to the card it actually replaces.

As for those questioning the architecture: unless they did a die shrink to 20nm like they have been planning, I don't think they would have managed that big a power reduction without changing the architecture. Given the major focus on the mobile market now, though, they seem to be following Nvidia's plan and working on reducing power consumption while maintaining performance. Chances are this is GCN 2.1, GCN 3.0, or a die-shrunk GCN 2.0. Since GCN 2 didn't really change anything for the shaders, just added some extra features like TrueAudio DSPs, there's no telling how much they've changed.
August 23, 2014 6:04:18 PM

Should have been called the R9 281.

Don't go calling it an R9 285 if it doesn't beat the 280X.
August 23, 2014 6:58:11 PM

Quote:
Only thing I see that's different is the support for DirectX 12.

Support for DX12 is kind of a big deal.
August 23, 2014 7:23:19 PM

Quote:
Only thing I see that's different is the support for DirectX 12.


And TrueAudio and full FreeSync support. They seem to be refreshing their old cards with newer features. This also suggests the reason AMD cards have high TDP and heat is probably the wider bus. I guess higher-speed VRAM is the better route.
August 23, 2014 9:11:53 PM

This GPU isn't impressive, but the X versions, now those I'm interested in, since I'm currently building a new rig... very slowly...
August 23, 2014 10:42:46 PM


Quote:
"On the other hand, the R9 285's 1375 MHz actual/5.5 GHz effective memory speed is notably faster than the 280's 1250 MHz/5.0 GHz specification, so net performance may be slightly faster."

Nope, that's not how memory bandwidth works. It's (bus width × effective GHz)/8, or equivalently (bus width × memory clock)/2 for GDDR5. It doesn't matter if the memory is 1.1x faster when the other card's bus is 1.5x as wide.

Quote:
HD 7970, 190-watt version...
Sounds awesome...
Still need to wait for reviews of this card to know how awesome it really is...

Nope, slightly slower than a 7950 if I'm not mistaken. The 7970 is 2048:128:32 @ ~925 MHz IIRC; this is 1792:112:32 @ 918 MHz. Not to mention the bandwidth difference: 2GB at 176 GB/s vs 3GB at 288 GB/s.

Now that you've said it's slower than the HD 7950, could you please provide the numbers for comparison?
August 24, 2014 1:39:34 AM

I LOL'd at what this article says about the memory config being faster than the 280/7950.
August 24, 2014 2:33:04 AM

What a joke of a card. What's the point of this?
August 24, 2014 2:41:46 AM

The model number doesn't fit the card. The R7 265 is stronger than the R7 260X, and the R7 260X is stronger than the R7 260, so I was hoping the R9 285 would be better than the R9 280X, sit between the R9 290 and the R9 280X, and give direct competition to the GTX 770 with better performance at a lower price. Sadly, it seems to be nothing like that :(
August 24, 2014 2:49:30 AM

Quote:
Now that you've said it's slower than the HD 7950, could you please provide the numbers for comparison?

Both are 1792:112:32 configurations; the 285 has a 918 MHz boost vs the 7950's 925 MHz boost. The 285 has 2GB with 176 GB/s of bandwidth vs the 7950's 3GB with 240 GB/s.

Same core config, slightly slower clock speed, less memory, less bandwidth.
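Those core configs can be turned into a rough peak-throughput comparison (a sketch, not a benchmark; it assumes GCN's usual peak of one FMA, i.e. 2 FLOPs, per shader per clock, and uses the shader counts and boost clocks quoted above):

```python
def gcn_sp_gflops(shaders, clock_mhz):
    """Peak single-precision GFLOPS for a GCN part.

    Each shader can retire one fused multiply-add (2 FLOPs) per clock,
    so peak throughput is shaders * 2 * clock.
    """
    return shaders * 2 * clock_mhz / 1000

print(gcn_sp_gflops(1792, 918))  # R9 285 boost  -> 3290.112
print(gcn_sp_gflops(1792, 925))  # HD 7950 boost -> 3315.2
```

Less than 1% apart on paper, which is why the memory-bandwidth gap (176 vs 240 GB/s) is the more interesting difference between the two.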
August 24, 2014 3:14:11 AM

Just as I thought, AMD is trying to improve the power consumption of their graphics cards (which is good), and I'm now simply happy because my AX760 Platinum wasn't a purchase in vain =D
August 24, 2014 6:38:21 AM

This is the sign of a broke company. They've reused the same card across three generations with minor tweaks each time.

Full DX12 support is important, but when AMD says "full support," I'll believe it when I see it.
August 24, 2014 9:23:48 AM

It's simple: it's all about marketing, selling GPUs that fail the quality check for the higher-ranking cards.
August 24, 2014 12:16:28 PM

Quote:
Quote:
Only thing I see that's different is the support for DirectX 12.

Support for DX12 is kind of a big deal.

AMD announced that all GCN cards will support DX12. So anything from the HD 7000 series and up will support DX12, chill ;)
August 24, 2014 7:56:00 PM

What about HDMI 2.0 support?
August 24, 2014 10:11:07 PM

It seems like I travel back in time when I read Tom's Hardware news; it's always behind, which isn't acceptable for a tech site.
August 24, 2014 10:47:09 PM

crisan_tiberiu said:
Quote:
Quote:
Only thing I see that's different is the support for DirectX 12.

Support for DX12 is kind of a big deal.

AMD announced that all GCN cards will support DX12. So anything from the HD 7000 series and up will support DX12, chill ;)

Is having "support" for DX12 the same as enabling DX12 features? I often wonder what defines support for newer versions of DX, since shaders/CUDA cores have been programmable for so long; shouldn't it just be a matter of updating the driver to support the new DX features?
August 25, 2014 6:36:59 AM

People seem to miss the point of this card. It's a mid-range card that runs faster than the R9 280 and slower than the 280X, but draws 60 watts less than the R9 280. This isn't for people who want a top-end gaming card; it's for the middle of the market and for those who value efficiency.
August 25, 2014 6:52:22 AM

r9 280w?
August 25, 2014 7:35:26 AM

Quote:
It seems like I travel back in time when I read Tom's Hardware news; it's always behind, which isn't acceptable for a tech site.

Did you travel back in time like 15 minutes, then? This post went up immediately after the card was announced.
August 25, 2014 7:34:52 PM

I don't see the actual memory bandwidth listed; maybe AMD did some trickery like Nvidia did with Maxwell, like increasing the cache size? Let's not complain about a card until we see benchies.
August 26, 2014 6:09:41 AM

Quote:
Quote:
Only thing I see that's different is the support for DirectX 12.

Support for DX12 is kind of a big deal.

Just like DX11 was a "big deal"?

It doesn't matter what level of software comes out; what matters is actually pushing developers to use said technology. We've been stuck with DX9 games since its inception, with a few games tossed in here or there that actually support DX11 (mainly games using CryEngine).
August 26, 2014 8:48:07 PM

soccerplayer88 said:
Quote:
Quote:
Only thing I see that's different is the support for DirectX 12.

Support for DX12 is kind of a big deal.

Just like DX11 was a "big deal"?

It doesn't matter what level of software comes out; what matters is actually pushing developers to use said technology. We've been stuck with DX9 games since its inception, with a few games tossed in here or there that actually support DX11 (mainly games using CryEngine).

If people don't give a shit, developers will never adopt it. Luckily, I am not alone in believing that pushing technology forward is important.