AMD Radeon R9 270 Review: Replacing The Radeon HD 7800s

Status
Not open for further replies.

16bit

Honorable
Oct 6, 2013
238
0
10,710
Seems like a pretty solid card, but I'd like to see benchmarks that include some of the higher-end cards. Curious how big the gap between the 280X and the 270 is.
 

m32

Honorable
Apr 15, 2012
387
0
10,810


I doubt this card has too much headroom in that department. The 6-pin is a gift and a curse.
 

wdmfiber

Honorable
Dec 7, 2012
810
0
11,160
The chart needs a typo fixed: the 7870 is incorrectly labeled as 40nm, but it's built on the 28nm fab process, just like everything else.

Frig... we've been stuck at 28nm for so long it's just "understood". You could get away with leaving that whole column out.
 

bustapr

Distinguished
Jan 23, 2009
1,613
0
19,780
I wonder, how exactly does overclocking work with these cards? Wouldn't the card just vary its fan speed whenever it hits a certain temperature, sending clock speeds all over the place?
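For what it's worth, boost management on these cards doesn't key off fan speed directly; the firmware steps the core clock down when the card exceeds a power/temperature limit and steps it back up when there's headroom. A toy model of that behavior (all numbers here are made up for illustration, not AMD's actual PowerTune parameters):

```python
# Toy model of boost-style clock management: the clock steps down when
# the card runs past a temperature target and back up as it cools,
# instead of swinging wildly. TEMP_TARGET, STEP_MHZ, and the clock
# range are hypothetical values for illustration only.

TEMP_TARGET = 80            # degrees C, hypothetical throttle point
STEP_MHZ = 25               # hypothetical adjustment granularity
BASE_MHZ, BOOST_MHZ = 900, 975

def next_clock(current_mhz, temp_c):
    """One control step: back off over target, step up with headroom."""
    if temp_c > TEMP_TARGET and current_mhz > BASE_MHZ:
        return current_mhz - STEP_MHZ
    if temp_c < TEMP_TARGET and current_mhz < BOOST_MHZ:
        return current_mhz + STEP_MHZ
    return current_mhz

clock = BOOST_MHZ
for temp in [70, 78, 83, 85, 82, 76, 72]:
    clock = next_clock(clock, temp)
    print(temp, clock)   # clock sags to 900 under load, then recovers
```

The point is that the clock converges around the limit in small steps rather than bouncing between extremes, so an overclock mostly raises the ceiling the card boosts toward.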
 

tomfreak

Distinguished
May 18, 2011
1,334
0
19,280
I won't be surprised if they release an R9 260X (an R9 version of the 260X) that's a rebrand of the 7850. A Curacao chip with a broken CU has to go somewhere...
 

Da W

Honorable
Oct 31, 2013
13
0
10,510
How about CrossFire? Tri-CrossFire? That would be interesting to see: 3x 270s with one 6-pin connector each, instead of buying the monster that is the 290X.
 
So are they leaving room for an R9 265X? Eventually there must be a card with disabled shader cores; there always has been, since that's how they maximize their yields. Unless, of course, they keep the 7850 in the lineup like they have done with the lower-end 6xxx series cards.
 

tomc100

Distinguished
Jul 15, 2008
166
0
18,680
Not sure why tomshardware is using COD Ghosts as a performance test. It's one of the most poorly optimized games ever made, right next to that Jurassic Park game from the '90s called Trespasser. It uses an engine derived from the old Quake engine and is known to run poorly on SLI and CrossFire.
 

TeraMedia

Distinguished
Jan 26, 2006
904
1
18,990
It looks like they intentionally hamstrung this card against OCing with that single 6-pin. There appears to be a bit of power headroom, but not much. Bumping freq and/or V will eat that up pretty quickly. It makes sense if they don't want to cannibalize 270X, I suppose, but oh well.
 
That's the reference model. You can bet some board partners are going to either put an 8-pin on it or a pair of 6-pin connectors, just for OCing.
I'm still looking forward to a review of the R7s...
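The single-6-pin headroom argument comes down to simple arithmetic: the PCIe spec allows 75 W through the x16 slot, 75 W from a 6-pin connector, and 150 W from an 8-pin. A rough sketch of the budget (the ~150 W board-power figure for the R9 270 is an assumption for illustration):

```python
# PCIe power-budget sketch: why one 6-pin leaves little OC headroom.
# Connector limits are per the PCIe spec; the 150 W board-power figure
# used below is an assumed value for illustration.

PCIE_SLOT_W = 75       # deliverable through the x16 slot
SIX_PIN_W = 75         # one 6-pin PEG connector
EIGHT_PIN_W = 150      # one 8-pin PEG connector

def headroom_w(board_power_w, connectors):
    """Spec headroom in watts for a given connector loadout."""
    budget = PCIE_SLOT_W + sum(connectors)
    return budget - board_power_w

# Reference card: slot + one 6-pin = 150 W budget, so ~0 W to spare.
print(headroom_w(150, [SIX_PIN_W]))      # prints 0
# Hypothetical partner board with an 8-pin instead.
print(headroom_w(150, [EIGHT_PIN_W]))    # prints 75
```

Which is exactly why a partner board with an 8-pin (or dual 6-pins) would change the overclocking picture.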
 

mapesdhs

Distinguished
So basically the card is rereleased 20 months after its initial launch, with a new name and tweaked performance that's slower than or little better than the original, at pretty much the same price point. Can someone explain where one can find Moore's Law in all this? IMO it all looks like a waste of time. I mean, after 20 months, this is all we get? Not impressed at all. I'd hoped AMD wouldn't go down the road of rebranding (it was bad enough with the 8800GT fiasco), but I guess they figure enough people will fall for the PR. I foresee yet more cards on eBay from disgruntled gamers who upgraded only to observe little or no speed gain.

The 290/290X at least offer something tangible, whether it's solid performance, good prices, or both, sans noise issues, but these reissued older GPUs really irritate me. Reminds me of the lacklustre improvements we've had in CPU power, the halting of price drops for SSDs, the shooting back up of RAM prices since Feb, and so on. If the PC market is shrinking, I don't think one can blame it entirely on the rise of tablets & suchlike, or the dislike many have of Win8; instead, IMO these days there are simply fewer items that are worth buying as upgrades. All this stalls demand; people stop buying, or buy less often as they wait for something better, which makes it look like the market is shrinking when in fact users are just waiting for products that are worthy of their cash.

Sometimes I think it's a pity that all the various 3rd-party GPU makers can't combine their talents and come up with a completely separate GPU development path from NVIDIA and AMD. Surely there's enough skill & knowledge by now at ASUS, Sapphire, Gigabyte, EVGA, HIS, etc., to do this. Oh, if only...

Ian.

 

silicondoc_85

Honorable
Mar 10, 2012
39
0
10,530


You won't like it much when the NDA on the onboard power circuitry lifts (or when the review's glaring bias in not mentioning it, pretending the topic doesn't exist, catches up with it).

Expect massive VRM blowouts and crashes, with sad overclocking.
Crappy power-sucking bins stuffed into the slot by AMD, a way to get rid of their junk electromigration GPUs without scrapping them.

 