Smiler

Distinguished
Apr 1, 2005
85
0
18,630
Hi.

Like the rest of the free world I am pondering whether to lay out the big bucks for an 8800 GTS (the GTX is too much of a stretch). I can't wait six months for my new PC, so it's either a DX9 card to tide me over for a year or the 8800 GTS now.

My assumption is that although the 8800 GTS will likely prove to be overpriced compared to what comes out over (say) the next six months, its depreciation over a year should be less than that of one of the current DX9 cards.

What occurred to me in favour of the 8800 GTS (which is way more than I was otherwise planning to spend on a GPU) is that although new cards will have appeared and prices will have fallen a year or so down the road, the 8800 GTS should presumably retain some resale value as still-current technology, whereas by then the formerly hot-shot DX9 cards are likely to be considered old hat. And just maybe the performance of the 8800 GTS will be enough to carry me through a couple of years until the next major upgrade.

Does anyone agree, or is my logic just flawed self-delusion? :?:

Cheers.
 

djplanet

Distinguished
Aug 27, 2006
489
0
18,780
Your theory of depreciation makes sense, though the newest DX9 cards won't be considered antiques in merely a year, IMO. If you're willing to wait six months, maybe you should: the R600 will be released by then, and Nvidia's second-generation DX10 parts may be out by that point too.
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
Resale value isn't great IMO.

The R600 and the 8900s are coming. The 8800s will be phased out while staying high in price, just like the 7800s when the 7900s came out.

If you can afford it, I say go for it. It's a fantastic performer and will last you quite a while.
 

Track

Distinguished
Jul 4, 2006
1,520
0
19,790
Resale value isn't great IMO.

The R600 and the 8900s are coming. The 8800s will be phased out while staying high in price, just like the 7800s when the 7900s came out.

If you can afford it, I say go for it. It's a fantastic performer and will last you quite a while.

But get the eVGA one, so you can trade up to the 8900 through their Step-Up program when it comes out.
 

dean7

Distinguished
Aug 15, 2006
1,559
0
19,780
Resale value on PC components is never incredible.

But, it seems like the G80 is the very first of the DX10 era, and therefore it's probably going to be wiped out by another GPU within the next year. If you can wait, it might be best to buy a 7600 GT or something and upgrade in a year.

(FYI, I know a lot less on this subject than a lot of people here, so don't listen to me... :lol: )
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
But, it seems like the G80 is the very first of the DX10 era
The "DX10 era" isn't here yet, because Vista and DX10 itself (not to mention games) aren't here.

Right now, the G80 is the most powerful card out there: a new generation, not the beginning of the "DX10 era".
 

dean7

Distinguished
Aug 15, 2006
1,559
0
19,780
But, it seems like the G80 is the very first of the DX10 era
The "DX10 era" isn't here yet, because Vista and DX10 itself (not to mention games) aren't here.

Right now, the G80 is the most powerful card out there: a new generation, not the beginning of the "DX10 era".
Right... the first DX10 card surely doesn't mark the beginning of the DX10 era.

...

I must be some sort of idiot! :roll:
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
It's a "next generation" card, which happens to have DX10 as one of its features. It cannot be used right now, so it doesn't make a "DX10 card". People just call it that.
 

dean7

Distinguished
Aug 15, 2006
1,559
0
19,780
It's a "next generation" card, which happens to have DX10 as one of its features. It cannot be used right now, so it doesn't make a "DX10 card". People just call it that.
Example of your fine logic in play:
I run an OS that only supports a single-core CPU, so my Core 2 Duo is a single-core CPU! I sure can't wait until I can afford Windows, because then I'll have a dual-core CPU! That'll be REAL nice!
 

tool_462

Distinguished
Jun 19, 2006
3,020
2
20,780
...I'm sure Prozac will counter, but just to point out: dual-core CPUs show a performance increase in Windows today, while DX10 cards show no gain over DX9 cards from DX10 itself. Yes, they perform better, but it's the better hardware doing that, not the DirectX compatibility.
 

dean7

Distinguished
Aug 15, 2006
1,559
0
19,780
...I'm sure Prozac will counter, but just to point out: dual-core CPUs show a performance increase in Windows today, while DX10 cards show no gain over DX9 cards from DX10 itself. Yes, they perform better, but it's the better hardware doing that, not the DirectX compatibility.
Yeah, and DX9 cards show no performance gain over DX8 cards "in a sense". Why don't you buy one?

I see what you are saying, and I get your point. I'm just saying: if it supports DX10, doesn't that make it a DX10 card, just like a card that supports DX9 is a DX9 card?
 

Smiler

Distinguished
Apr 1, 2005
85
0
18,630
How large is your monitor and what titles do you plan on playing right now?

An Iiyama Vision Master Pro 454 (also known as the HM903DT), which is a 19-inch CRT with an ultra-flat Diamondtron tube. I'd be grateful for any tips on how best to set it up for gaming...

Anyway, my principal gaming activity right now is Call of Duty 2, and my 3½-year-old PC is in serious need of replacement, as it struggles to cope when things get busy, even at the lowest resolutions. So I either go for a cheaper card to keep me going for a while and then take the loss on it, or stump up now for the 8800 GTS.

Cheers.
 

dean7

Distinguished
Aug 15, 2006
1,559
0
19,780
Don't be a retard, it just came out.. How can anyone tell you honestly?
<knock knock> hello, mcfly???
Seriously, are you on crack or something?

But, I do agree with you that nobody can guess how things will turn out. I don't think he wanted anybody to predict the future... he was just looking for some best guesses :D.
 

Smiler

Distinguished
Apr 1, 2005
85
0
18,630
Don't be a retard, it just came out.. How can anyone tell you honestly?
<knock knock> hello, mcfly???
Seriously, are you on crack or something?

But, I do agree with you that nobody can guess how things will turn out. I don't think he wanted anybody to predict the future... he was just looking for some best guesses :D.

Indeed. Is this not a forum for those with an unhealthy level of preoccupation with this topic and thus, one would hope, a correspondingly high level of sophistication? :wink:

Now obviously you are free to continue arguing amongst yourselves. 8O However, allowing for my shaky grasp of the relevant technical terms, the point I was attempting to make is that if/when DX10-based software is released and proves desirable, production of previous-generation GPUs seems likely to cease or be cut back to budget models.

Let’s presume a budget of £350 for our crystal ball gazing. One option is spending £150 on a card now on the assumption that a card at least as good as the 8800 GTS will be available for (say) £200 in a year’s time. In this circumstance what will the £150 card be worth that doesn’t have the DX10 bells and whistles? After DX9 came out how long did companies continue making cards that couldn’t take advantage of it?

Of course, the other appeal of the 8800 GTS is that although it is priced around £320 to £350 (which means outlaying all the money now), it does appear to stand up well against other cards in that price bracket on existing titles where there is no DX10 optimisation. So that should mean kicking butt from the get-go!

On one hand it seems a heck of a lot of money for a graphics card, but I think I'm at least half-way to convincing myself...
 
Well, it's likely something similar to these:

http://www.theinquirer.net/default.aspx?article=35707
http://www.theinquirer.net/default.aspx?article=35708

Think about how the ring bus works:
http://www.beyond3d.com/reviews/ati/r520/index.php?p=05

and you'll see that it makes more sense for it to be 1024-bit internal and 512-bit external, since the throughput per clock would be nearly the same with a 512-bit DDR external memory interface and a single-clock-rate VPU ring bus.

Based on the description, that would be 16 x 32-bit-wide (64 MB) memory chips externally, i.e. a 512-bit external bus and 1 GB of memory in total. The arithmetic is sketched below.
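
To sanity-check those figures, here's the arithmetic. The chip count, width and size come from the description above; the memory clock is an assumed figure purely for illustration.

Code:
# Rumoured R600 memory layout, sanity-checked.
chips = 16
bits_per_chip = 32          # each chip is 32 bits wide
mb_per_chip = 64            # 64 MB per chip

external_bus = chips * bits_per_chip    # 16 * 32 = 512-bit external interface
total_memory = chips * mb_per_chip      # 16 * 64 = 1024 MB (1 GB) in total

# Per-clock parity: a 1024-bit internal ring bus at 1x clock moves the same
# bits per cycle as a 512-bit external bus running at DDR (2x) rate.
assert 1024 * 1 == 512 * 2

# Bandwidth = bus width in bytes * effective data rate. The 2000 MT/s DDR
# figure is an assumption for illustration, not from the articles.
data_rate_mts = 2000
bandwidth_gbs = external_bus / 8 * data_rate_mts / 1000

print(f"External bus: {external_bus}-bit, memory: {total_memory} MB")
print(f"Bandwidth at {data_rate_mts} MT/s: {bandwidth_gbs:.0f} GB/s")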