
XFX, Gigabyte, HIS Also Roll Out Radeon HD 7990 Cards

April 26, 2013 4:48:37 AM

XFX should really stop naming cards. Triple Dissipation? Come on, if you're going to make crappy, low-quality cards (you can't deny that they have massive QC issues), at least don't give them a corny name.
Score: 0
April 26, 2013 5:23:58 AM

XFX is the WORST company ever. Cheap quality, poor cooling, and poor customer support; what else? XFX went from having the reputation of the best-quality graphics card manufacturer to the worst. What happened to them within a few years??
Score: -3
April 26, 2013 5:26:15 AM

spp85 I'm not sure, but I know what you mean; as a brand they are now cheesy.
PowerColor is more legit than them now....
Score: -4
April 26, 2013 5:40:22 AM

Re XFX: Speaking of the DD editions (I've never owned a "Core" model), I like the construction quality and the materials, like aluminum shrouds instead of cheap plastic. Up to the HD7870 they stay relatively quiet too. I haven't yet had one die, and recently sold an HD5770 that was probably at least 3 years old.
Then I got an HD7970. Oops. Noisy fans (including some rattling), very hot-running, and WORST, unresponsive support. The latter in particular is why I got an RMA on it and will be replacing it with a Gigabyte. While they're not on my "Do Not Buy" list (it takes outright incompetence, e.g. Diamond, or dishonesty, e.g. CM, to make that list), they will no longer be one of my preferred vendors until I've become aware that their support has improved.
Score: 7
April 26, 2013 5:47:09 AM

Onus I agree. I had the 5770 egg cooler with the double lifetime warranty, and it was quite nice actually. This may be incorrect, but I feel there was a really big change once they dropped the double lifetime warranty completely. It's almost like they do not care at all anymore.
Score: 2
April 26, 2013 6:14:22 AM

Pine Technology changed their card name to XFX. They were the worst producer of cards ever.
Score: 1
April 26, 2013 6:57:35 AM

For the people complaining about XFX in this instance... umm... why the f*$k does it matter? It's all the same card. There's no reason to care one way or another. I would go with Asus, Gigabyte, or MSI, as I've dealt with their customer service flawlessly.
Score: 0
April 26, 2013 7:20:13 AM

I actually have never had a problem with XFX cards (at least on the Nvidia side). My 7950 GT, 9800 GTX+, 280 GTX, and 295 GTX are still running just fine. Hell, I had a problem with my 7950 GT back in 2006, when my computer got stuck on POST overnight (I hit restart instead of shutdown the night before) and the whole computer heated up massively. The fan on the 7950 GT half melted off... but it still spun and still works. Never used an XFX AMD card, though.
Score: 0
April 26, 2013 7:31:16 AM

Since people are negative on XFX here, I will have to say that I currently have 4 cards from them, all with lifetime warranties and no issues. Their warranty works as well: when my son's HD4890 cooling fan failed (after almost 3 years), they replaced the card with an HD6850.
Score: 4
April 26, 2013 7:43:09 AM

I remember when flagship cards were $499 and more people could afford them without going nuts...
I can't see spending $999+ for a flagship video card.
Score: 0
April 26, 2013 9:27:01 AM

rolli59 said:
Since people are negative on XFX here, I will have to say that I currently have 4 cards from them, all with lifetime warranties and no issues. Their warranty works as well: when my son's HD4890 cooling fan failed (after almost 3 years), they replaced the card with an HD6850.


That's the point I made: lifetime and double lifetime are gone, all gone.
Score: 0
April 26, 2013 9:46:47 AM

jn77 said:
I remember when flagship cards were $499 and more people could afford them without going nuts...
I can't see spending $999+ for a flagship video card.


Ironically, a lot of people can. I, for example, spent $550 on a 7970, then $900+ on a 690... thinking of dropping it for a pair of Titans. I'm not loaded or anything, but my computer is one thing I tend to spend money on. Some people spend it on cars or something. About the only thing I really spend big money on is computer parts. Wife and two kids still win the priority, obviously, but when it comes to buying toys, I stopped with the consoles and their 10-year-old garbage hardware and focused on my PCs.

I know some of the most broke people in the world spending $5k+ on a set of rims for their car... To me that's insane. It's all about priorities and what is important to the person.

ALTHOUGH, I do agree that the current $1k price for these elite cards is really pushing it. $600 was steep enough IMHO...
Score: 0
April 26, 2013 10:04:19 AM

I want Twin Frozr on this thing. Stock build from all of you? Seriously? Is it not known that stock cooling is pathetic on ALL AMD reference cards? Asus!!! Where is DirectCU in a triple slot? Ridiculous.
Score: 0
April 26, 2013 10:24:51 AM

Stock cooling does have the advantage of exhausting heat, though. I'd like to see HIS put one of their Black Hole coolers on it.
Score: 0
April 26, 2013 10:54:59 AM

Yeah, how can all of them not even attempt to at least differentiate? And for $999 I want to know that their best bells and whistles are in it.
Score: 0
April 26, 2013 11:41:22 AM

I had an XFX 295 that did run hot (but that is because the person I bought it off was an incompetent IDIOT and didn't apply paste correctly; THERE WERE GREAT BIG FREAKING HOLES WITH NO THERMAL PASTE ON THE GPU).
Once I cleaned it up, after getting annoyed with its rather high running temperatures, I managed to make it lose about 10 degrees.
That gave me massive headroom for overclocking, boosting its performance to even higher levels.
Score: 0
April 26, 2013 12:40:33 PM

spentshells said:
rolli59 said:
Since people are negative on XFX here, I will have to say that I currently have 4 cards from them, all with lifetime warranties and no issues. Their warranty works as well: when my son's HD4890 cooling fan failed (after almost 3 years), they replaced the card with an HD6850.


That's the point I made: lifetime and double lifetime are gone, all gone.

Still lifetime on my DD HD7950!
Score: -1
April 26, 2013 12:48:20 PM

So what's the difference between (among?) them?
Score: 0
April 26, 2013 12:51:06 PM

Apparently there aren't any differences yet, if they're all reference. Hopefully individualized versions will come out in the next few weeks.
Not that I'll be buying one...getting a single HD7970 was enough pain in the wallet.
Score: 0
April 26, 2013 1:47:13 PM

Amen to that; it'll probably be like the 690, no stock anywhere. I'm actually very impressed by the 7750 and 70. They made Nvidia, of all people, cut prices.
Score: 0
April 26, 2013 4:57:33 PM

Niels, please don't describe these cards as having 6GB "total memory" as it might
confuse people into thinking this is somehow comparable to Titan's genuine 6GB.

The 7990's RAM is split 3GB per GPU, so the host system only sees 3GB overall,
not 6GB. Thus, the 7990 should always be described as a 3GB card.

Otherwise, it's just like those on eBay who describe a quad-core 3GHz CPU as
being a, "12GHz system".

Ian.
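
The mirrored-memory point above can be sketched in a few lines of Python; the function names are illustrative only, not any real API:

```python
# Dual-GPU cards like the HD 7990 mirror data: each GPU holds its own
# copy of the working set, so the two pools do not add up for applications.
def advertised_mb(per_gpu_mb, gpu_count):
    # What the box says: the pools summed together.
    return per_gpu_mb * gpu_count

def usable_mb(per_gpu_mb, gpu_count):
    # What a game can actually address: a single GPU's pool,
    # regardless of how many GPUs mirror it.
    return per_gpu_mb

print(advertised_mb(3072, 2))  # 6144 -> the "6GB" marketing figure
print(usable_mb(3072, 2))      # 3072 -> the 3GB that is really usable
```

The same logic covers the eBay analogy: four 3GHz cores advertise as "12GHz" but no single thread ever sees more than 3GHz.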
Score: 0
April 26, 2013 10:25:37 PM

In light of recent reviews and testing methods, I don't know who would be stupid enough to buy an AMD dual-GPU setup. We always knew the CrossFire problems were there; now there is a confirmed way to measure them (FCAT). They really need to fix that fast in order to sell these cards, that's IF it can be fixed in the drivers. Until then, AMD for single GPU only. Oh, and don't tell me RadeonPro; it's not a fix, it's a workaround that lowers overall performance anyway.
Score: 0
April 27, 2013 4:44:51 AM

rolli59 said:
spentshells said:
rolli59 said:
Since people are negative on XFX here, I will have to say that I currently have 4 cards from them, all with lifetime warranties and no issues. Their warranty works as well: when my son's HD4890 cooling fan failed (after almost 3 years), they replaced the card with an HD6850.


That's the point I made: lifetime and double lifetime are gone, all gone.

Still lifetime on my DD HD7950!


???????

Perhaps it's Canada only where they did that.

BRB


Here's what I found:

XFX Discontinues Double Lifetime Warranty with New Radeon Graphics Cards

Cards with Double Dissipation (Double D), or whose product number ends in "R", get a lifetime warranty if registered within 30 days.
All other cards (ex: HD 7970 Core Edition; FX797ATNFC) get a 2-year warranty.
Score: 0
April 27, 2013 6:45:33 AM

spentshells said:
rolli59 said:
spentshells said:
rolli59 said:
Since people are negative on XFX here, I will have to say that I currently have 4 cards from them, all with lifetime warranties and no issues. Their warranty works as well: when my son's HD4890 cooling fan failed (after almost 3 years), they replaced the card with an HD6850.


That's the point I made: lifetime and double lifetime are gone, all gone.

Still lifetime on my DD HD7950!


???????

Perhaps it's Canada only where they did that.

BRB


Here's what I found:

XFX Discontinues Double Lifetime Warranty with New Radeon Graphics Cards

Cards with Double Dissipation (Double D), or whose product number ends in "R", get a lifetime warranty if registered within 30 days.
All other cards (ex: HD 7970 Core Edition; FX797ATNFC) get a 2-year warranty.

Well, it used to be North America only, and as you correctly write, it's just certain cards now.
Score: 0
April 27, 2013 12:31:42 PM

My XFX 6670 works perfectly. I think they may have issues with their high-end cards, but you shouldn't say that all their cards are bad.
Score: -1
April 28, 2013 12:16:42 PM

These can't be worth the money, can they? Wouldn't an HD 7970 in CrossFire be more on the money, or a GTX 690 in SLI?
Score: 0
April 28, 2013 6:35:47 PM

690 SLI is $2000? How's that on the money?
Score: 0
May 4, 2013 2:48:01 AM

mapesdhs said:
Niels, please don't describe these cards as having 6GB "total memory" as it might
confuse people into thinking this is somehow comparable to Titan's genuine 6GB.

The 7990's RAM is split 3GB per GPU, so the host system only sees 3GB overall,
not 6GB. Thus, the 7990 should always be described as a 3GB card.

Otherwise, it's just like those on eBay who describe a quad-core 3GHz CPU as
being a, "12GHz system".

Ian.


My 295 was reported, by anything I asked to give me its impression, as having around 1700 MB of RAM, when it was half that on each GPU.
Nvidia also say it has 1792MB in a standard configuration, so unless that's false advertising of sorts...
Score: 0
May 9, 2013 5:33:26 AM


heero yuy said:
Nvidia also say it has 1792MB in a standard configuration, so unless that's false advertising of sorts...


Indeed it is false advertising, because an application can't run as if
there were a single 1792MB resource; the limit is still just the
amount on each GPU (896MB). Remember, functionally speaking, a card
like the 295 is no different than using two separate cards in SLI
(it's just that in the 295, the SLI functionality is hidden from
view), though in this case with the process shrink the two GPUs in the
295 are more like a mashup between a 280 and a 260 (it has the SPs of
the 280, but its ROPs and clocks are like the 260 Core 216). Anyway,
imagine instead of a 295 you had two normal 280s in SLI (or 260s,
the same concept applies); is a game suddenly able to cope with much
higher resolutions and AA load because each card has 1GB? ie. could
or would one call the resulting SLI setup a GTX 280 2GB? No, absolutely
not, it doesn't work that way; the gaming limit would still be 1GB.
Likewise, the limit for a game with a 295 is still 896MB.

Try running Crysis 2 on a 295 at max detail, 1920x1200, high AA, etc.,
and see what happens; it'll choke. I tested with a native GTX 460 2GB:
the game uses 1347MB with these settings; as a result, performance is
smoother (better minimum fps) with a single GTX 460 2GB than with two
460 1GB in SLI, for obvious reasons.

This idea of shared resources split across multiple processing units
is not new. In the 1990s, similar confusion occurred with SGI's high-
end gfx system, InfiniteReality (IR); this is the tech upon which the
original Geforce256 was based. The final version called IR4 (July
2002), consisted of a geometry board (does all transform/lighting), a
display generator board (final DAC output, etc.) and either 1, 2 or 4
raster manager boards (called RM11 in this case). Each RM11 has 2.5GB
video RAM and 1GB texture RAM. When multiple RM boards are employed,
the video RAM is combined to give a higher overall total, but the
texture RAM is not combined because - just like the overall RAM in
modern dual-GPU gfx cards - it's a duplicated resource. Thus, a 4RM11
IR4 setup has 10GB VRAM genuinely available to use, but still only 1GB
TRAM (one can only increase the TRAM pool by combining multiple IR4
'pipes' together, up to 16 of them for 160GB VRAM and 16GB TRAM total).

When the IR4 system is processing textures, the same texture data is
sent and stored in the TRAM on every RM board. This is what happens
with modern SLI/CF setups and it's why chips like the NF200, etc. are
so useful at offering greater PCIe bandwidth for multiple cards when
the no. of lanes from the chipset isn't that much; the chip receives
data which it duplicates & sends on to multiple GPU destinations, eg.
the ASUS P7P55 WS Supercomputer with an NF200 offers x16/x16 or
x8/x8/x8/x8, far more than any normal P55 board which is usually
limited to just one x16, or x8/x8, or at best x8/x8/x4 as with the
Asrock P55 Deluxe. Another example: putting four GTX 680 2GB cards
on an ASUS P9X79 WS does not mean one suddenly has a "GTX 680 8GB". :D 

I hope people did not fall for the idea that the 295 had 1792MB, but I
expect many did. It's a misleading marketing trick that shouldn't be
allowed IMO.

Ian.

PS. References:

http://www.tomshardware.com/reviews/geforce-gtx-295,210...
http://sgidepot.co.uk/misc/3353.pdf
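
The IR4 arithmetic in the post above (additive video RAM, duplicated texture RAM) can be checked with a tiny sketch; all figures are taken from the text, and the function name is illustrative:

```python
# SGI InfiniteReality4: each RM11 raster board carries 2.5GB video RAM,
# which combines across boards, and 1GB texture RAM, which is duplicated
# on every board and therefore only grows by adding whole pipes.
def ir4_totals(rm_boards_per_pipe, pipes=1):
    vram_gb = 2.5 * rm_boards_per_pipe * pipes  # additive resource
    tram_gb = 1.0 * pipes                       # duplicated within a pipe
    return vram_gb, tram_gb

print(ir4_totals(4))      # (10.0, 1.0): a 4RM11 IR4 -> 10GB VRAM, 1GB TRAM
print(ir4_totals(4, 16))  # (160.0, 16.0): 16 combined pipes
```

The texture RAM behaves exactly like the mirrored memory on modern dual-GPU cards: duplicating a resource never enlarges the pool any one workload can use.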

Score: 0