3D Graphics Card Feature Article.


What did you think of the 3D Graphics Feature Article?

Total: 25 votes

  • Good Article, Great Pics, wish I could get more. (24%)
  • OK article, nothing special (32%)
  • No opinion (4%)
  • Ugh! (16%)
  • Old Reviews were better, Lars / Borsti come back! (24%)
  • Who is Lars? Who is this Tom guy for that matter? (0%)
January 18, 2006 4:39:44 AM

Surprisingly, I didn't notice this article until today:

http://www.tomshardware.com/2006/01/13/new_3d_graphics_card_features_in_2006/

I'd like to get the forum's opinions on it (especially from the veterans)

Also, check out my Poll!

POLL not Pole ya' Perv!

--
Dang, that was hard to make (very word sensitive!). It gave me dozens of post errors, and after each one it posted a truncated poll. I had to delete about 12 copies of this. :x
January 18, 2006 2:02:36 PM

Wow... wow. Come on, guys. There is some notably bad info in there:

The article suggests that the first DirectX 8 cards are the Geforce 4 & Radeon 9600?
Nope. Radeon 8500 and Geforce3 would be the first DirectX 8 cards.
The Radeon 9600 was a DirectX 9 card... Geforce 4 was DirectX 8 tho.


Mostly I take issue with the suggestion that you need an X1x00 or Geforce7 to see decent shader effects. A lot of similar, if not identical, shader effects can be done in plain DirectX 9 SM 2.0, and don't require SM 3.0 on the newest X1x00 or Geforce7 cards.

Geforce6 and Radeon 9550 and up can do DirectX 9. Even the GeforceFX series can do many of those things. Check titles like Doom 3 and Half-Life 2; lots of shader effects in there.

Any true DX9 card - i.e. Radeon 9550 and up, not sure about the GeforceFX series - can do HDR; not the SM 3.0 method, but it works just fine in Half-Life. Why no Half-Life HDR demo of Lost Coast?
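(To be concrete about "can do HDR": in D3D9 the float-buffer HDR everyone associates with SM 3.0 hardware is really a render-target format capability, separate from the pixel shader version. A minimal, from-memory sketch of the two checks involved, so treat the details as approximate rather than any particular game's actual code:

#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Path 1: can we render to a 16-bit float surface? (the OpenEXR-style HDR)
    HRESULT fp16 = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    // Path 2: plain SM 2.0 pixel shaders on integer targets plus a
    // tone-mapping pass - the route any true DX9 card can take.
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    bool sm2 = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);

    printf("FP16 render target: %s, SM 2.0: %s\n",
           SUCCEEDED(fp16) ? "yes" : "no", sm2 ? "yes" : "no");
    d3d->Release();
    return 0;
}

Either path gets you bloom and exposure effects; the float buffer just keeps more precision around between passes.)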

Just seemed a little too pushy for the next gen cards. Or maybe I'm just hypersensitive, I dunno.
January 18, 2006 2:21:50 PM

Ugh...was my vote

I didn't quite get the point of the article; I thought I might have missed something, but nope, I just read it again. For how fast cards and DirectX versions are being made, wouldn't it have been better for them to wait till DirectX 10 and then list all the major changes? DirectX 9 has been around for a while now, and most of the people at Tom's already know the difference between them (8 & 9).

P.S.

How about an article comparing a 4400 X2 overclocked to 4800 X2 speeds with an actual 4800 X2, to see if we do actually get the expected performance?

And a couple of top-of-the-range Intel vs AMD comparisons, 'cause for some reason you always seem to go for a similarly priced model. Yes, that gives us the best bang for the buck, but not which is more powerful. :)

Or something you never see in the sound articles: how the thing actually sounds. Just because the numbers say it should sound better doesn't mean it does sound better... know what I mean?
January 18, 2006 3:04:27 PM

Man, not everybody 'at Tom's' knows as much as you about graphics cards. There are new people coming into the industry every day. Besides that, it's a good thing to take stock of what features you can expect from graphics cards today, so that people who don't have zotts of time to pore over forums and other websites can figure out what some of this stuff means.

Though the DX8 information may not be dead on, it's kind of a moot point, considering no one is going to buy a DX8 card anymore.

Overall, despite a few shortcomings, I think this was a nice summary of current graphics card technology and trends in the industry.

Edit: And dude, this article has been on the front page for like a week. :lawl:

Edit 2: What did Lars and Borsti do better? I didn't read Tom's when they were around.
January 18, 2006 3:06:04 PM

Quote:
Wow... wow. Come on, guys. There is some notably bad info in there:

The article suggests that the first DirectX 8 cards are the Geforce 4 & Radeon 9600?
Nope. Radeon 8500 and Geforce3 would be the first DirectX 8 cards.
The Radeon 9600 was a DirectX 9 card... Geforce 4 was DirectX 8 tho.


Yeah, that struck me as odd too, as well as interchanging PS2 with DX8 and PS3 with DX9. Having the GF 3/4 equated to the R9800, but the GF5 (aka FX) equated to the X800?


Quote:
Any true DX9 card - i.e. Radeon 9550 and up, not sure about the GeforceFX series - can do HDR; not the SM 3.0 method, but it works just fine in Half-Life. Why no Half-Life HDR demo of Lost Coast?


Exactly! Maybe not OpenEXR, as we've talked about, but that's not the only method of HDR. Heck, Rthdribl has HDR in its name, and it was the first true DX9 app IMO. Funny, when I read your 'R9550 and up' I thought about the R9500 being omitted, but you're talking performance and my brain instantly went to digits.

Quote:
Just seemed a little too pushy for the next gen cards. Or maybe I'm just hypersensitive, I dunno.


Also seemed somewhat vendor-specific as to what's a benefit (no mention of vertex shader advantages like geometry instancing) and which games to investigate (TWIMTBP?), but that may be my hype-sensitivity, similar to yours. I know you AGP guys are afraid of the next gen, luddites!

8)
January 18, 2006 3:25:59 PM

Quote:

Though the DX8 information may not be dead on, it's kind of a moot point, considering no one is going to buy a DX8 card anymore.


Well, if they call the R9500-9800 and X800 series DX8, then many people will.
If the information is wrong as to what the differences are, then people may also think they NEED to upgrade, when in fact their old card is fine.

A little edit for cleanliness; no need for the last part in reply to a jest. I left my sense of humour at home today, which is rare.
January 18, 2006 3:36:45 PM

This article is mostly personal perception, not professional :!:
January 18, 2006 3:37:54 PM

Heh, I see your point. 8)

So...I guess I approve of the concept of the article...but the quality appears to be lacking.

Edit: I was just teasing you, no offense intended.
January 18, 2006 3:46:13 PM

Quote:
...many ignore them altogether nowadays.


You know, it's funny that you mentioned that.

I came to this forum because a few years ago, I would go to Tom's homepage every day. Just the natural thing to do.

Now I rarely visit the homepage. In my opinion it seemed that they started spinning articles in an odd way.

Then they took the "high road" on some graphics card reviews by not reviewing them at all... for weeks after their competition did.

Yes, that's the high road, but this is a news site. Once news is old, it's no longer news. You used to be able to count on Tom's to have the first, or one of the first, unbiased reviews.

And bias... to this day, I do not understand why the 9700 PRO review called it the "pretender to the throne", when it beat the living snot out of its competitor, the Geforce 4 Ti. Maybe I just don't understand the reference, I dunno...

Regardless, over time I started relying on Xbit and the Inquirer for my news tidbits. Tom's still has good reference articles, especially the VGA charts, but it's just not the king of graphics news anymore, IMHO....

... and these quasi-accurate articles are not helping their rep.
January 18, 2006 4:40:13 PM

I must say I was highly disappointed in the article. However, I was impressed with your perseverance in making the poll. :p

Know that I have no idea what language the article was originally written in, so I do apologize for any criticisms that might not apply to the actual article itself, but rather to an improper translation; in the case of, say, the article originally being German and poorly translated into English, which, given some of the grammatical awkwardness I see, I am almost certain of. But still, I can't tell whether something is the way it is because the original article was like that, or whether it was an error in translation.

At any rate, the people above have already covered the point about confusing what cards supported what SM. Normally, this can be forgiven for most people (having talked with a lot of people, a lot, about graphics cards, it's a common mistake), but I would've expected more out of an article on THG, which I've held as the gold standard for hardware. (But I'm now having doubts.)

If memory serves me correctly, Morrowind was not the first game to display shaders; Halo on the Xbox had done so, and if we even count OpenGL (though unclear, I interpreted the wording of the article to specifically mean shaders under D3D, not OGL), Quake III Arena was arguably the first game to implement pixel shaders. Of course, Microsoft held back a PC port of Halo until they had turned it into the first SM 2.0 (DX9) game.

Unfortunately, the article fails to immediately get more accurate; the comment on the development of shader models is incorrect as well. In 2002, nVidia only went to SM 1.3 (DX8.1), while ATi unveiled their first SM 2.0 cards, the Radeon 9700/9500 series on the R300. There was also no such thing as SM 2.1; the proper name, as far as I've been able to tell, was "SM 2.0 extended," though I see it alternately referred to as SM 2.1 or 2.0b. Again, this is something that would be fine for normal people, but this is supposed to be a professional article on a professional news site.

Further confusion arises when we get to discussing SM 3.0. Again, it would be no surprise for an ordinary person, even one with a fairly firm grasp on graphics, to do this. The first omission, which I feel likely can't have been the result of translation, is differentiating between DX 9 and DX 9.0c; that's the difference between SM 2.0 and SM 3.0. Then, although it's not a direct statement, the placement implies that parallax mapping is an SM 3.0 effect, when it can quite clearly be implemented without drawbacks under SM 2.0; F.E.A.R. readily shows this, as well as the upcoming Elder Scrolls IV: Oblivion.

Although not as significant as the others, it should be noted that the common misconception that "Bullet Time" first appeared in The Matrix is FALSE; The Matrix was merely the first appearance to truly popularize it. The first film to use it was the 1981 film Kill and Kill Again, (Wikipedia Link) and it appeared in other places as well before 1999. The first actual CG version debuted in a project known as The Campanile Movie, by Paul Debevec, and the work there was directly drawn upon in the making of the scenes in The Matrix.

Once we get past the screenshots and back into the article, the problems resume, unfortunately. Right off, my earlier suspicion was confirmed when they claimed that without SM 3.0 you cannot have HDR, parallax mapping, or transparent water. All three of those separate claims, as we all know, are very false. A number of SM 2.0 demos and games demonstrate HDR (including Masa's RTHDRIBL and Debevec's RNL). As for parallax mapping, there's my comment about F.E.A.R.; the parallax mapping works perfectly fine with a Radeon X-series card. And lastly, the water comment is laughable; even if you only count water that's shaded, even Morrowind's water is transparent, as well as that of other games, such as Far Cry, and all under SM 2.0. (Age of Empires 3 really cannot be considered a good demonstration of the capabilities of SM 3.0, given the inherent conflict of interest, as it was produced by the same company that developed SM 3.0.) Again, we also see what appear to be some odd confusions, seeming to imply that SM 2.0 is part of DirectX 8 and SM 3.0 is part of DirectX 9.

However, at least in describing the effects of HDR and parallax mapping, the author is much more accurate, though slightly off on one point: normal mapping is the true replacement for bump mapping, as they both alter the way an object is lit (bump mapping simply mimics varied elevation, while normal mapping actually covers angle); parallax mapping simply changes the way the texture is drawn on a surface, providing another tool to accomplish the same effect, but it is still best when combined with normal mapping.
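(To put a little math on the parallax point, since the terms get thrown around loosely: the classic single-step parallax offset, as I remember it from Welsh's "offset limiting" paper, is just

    uv' = uv + (h * s + b) * V_xy / V_z

where h is the height-map sample at uv, s and b are an artist-tuned scale and bias, and V is the normalized tangent-space view vector; the offset-limited variant simply drops the divide by V_z. One texture fetch and a multiply-add, well within SM 2.0, which is exactly the point.)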

Of course, once we get to the page on F.E.A.R.'s detail levels, we encounter more problems. Equating the Radeon 9800 with the GeForce 3 and 4, and the Radeon X800 with the GeForce FX? Let's slap a "the way it's meant to be played" icon over all the images while we're at it, okay?

In the end, my conclusion is that the article was far less than professional; I’d like to believe that this was entirely due to lossy translation, but it is quite clear that many, if not the solid majority, of the major flaws are not the result of such.

Perhaps the biggest disappointment was a result of the English title: "New 3D Graphics Card Features in 2006." When I saw it the day it was posted, I eagerly opened it, hoping to see perhaps a future glimpse of some features to come in upcoming games that had existed but weren't used, such as sub-surface scattering or radiosity. I was thoroughly disappointed to find it instead give a flawed, and potentially slanted, review of the top shaders used in games in 2005. The title was very misleading; something like "All the latest shaders at work" might've been better for it.

Overall, it seems that while I have considered THG to be the best source for techy data on PC hardware, I've been seeing a disturbing trend, with some of the articles seeming to slip in quality. My first complaint came from the 8th iteration of the VGA charts; I cannot consider it as fair as the previous versions, as while they used version 81.85 of ForceWare, which was quite fresh at the time of the article, they used the thoroughly outdated Catalyst 5.10. Most importantly, this was old enough not to include the major “hotfixes” that had resulted in massive performance increases for the Radeon X1k cards in OpenGL games such as Quake IV. If memory serves me correctly, ATi had Catalyst version 5.12 available around the same time as ForceWare 81.85, if not even before then.
January 18, 2006 4:47:07 PM

Quote:
You know, it's funny that you mentioned that.

I came to this forum because a few years ago, I would go to Tom's homepage every day. Just the natural thing to do.

Now I rarely visit the homepage. In my opinion it seemed that they started spinning articles in an odd way.

Then they took the "high road" on some graphics card reviews by not reviewing them at all... for weeks after their competition did.

Yes, that's the high road, but this is a news site. Once news is old, it's no longer news. You used to be able to count on Tom's to have the first, or one of the first, unbiased reviews.

And bias... to this day, I do not understand why the 9700 PRO review called it the "pretender to the throne", when it beat the living snot out of its competitor, the Geforce 4 Ti. Maybe I just don't understand the reference, I dunno...

Regardless, over time I started relying on Xbit and the Inquirer for my news tidbits. Tom's still has good reference articles, especially the VGA charts, but it's just not the king of graphics news anymore, IMHO....

... and these quasi-accurate articles are not helping their rep.
Indeed, I must agree with you. As I mentioned above, it seems to be a disturbing trend to see the quality of the articles go down, and also to see the site become less and less the best source for the most up-to-date information; I often see things omitted entirely. And the slant that seems to be appearing in some articles is quite easily the strongest source of the quality loss.

As for the 9700pro article, I took the title to refer to the GeForce 4 Ti, not the Radeon 9700pro; the way the opening spoke, that's the impression I was given. I would be seriously appalled to find out that I was wrong here. Graphics-chip favoritism is even worse than die-hard soda favoritism; at least there are substantial enough differences between Coke and Pepsi to merit a strong preference.

GPU fandom is sad enough; I've perused a number of ATi's and nVidia's internal memos and presentations, so I say we need to let THEM be the fanboys/fangirls here, and just draw amusement, and potentially really good deals on video cards, while they're at each other's throats. Business competition is supposed to be good for the customers as well as the media, so there's no reason to toss that away.
January 18, 2006 5:11:57 PM

Quote:

As for the 9700pro article, I took the title to refer to the GeForce 4 Ti, not the Radeon 9700pro; the way the opening spoke, that's the impression I was given.


Yeah. I've read and reread the damn thing and I still can't figure out for sure exactly what is meant by that.

However, what is for certain is that the title of the article is: "ATi Radeon 9700 PRO - Pretender To The Throne"

It's probably a German translation glitch, but damned if it doesn't give the opposite impression it's supposed to.

http://www.tomshardware.com/2002/08/19/ati_radeon_9700_pro_/index.html
January 18, 2006 5:41:52 PM

I personally liked reading the article and go for anything 3D/graphics related.

But the headline of the article had me believing I would learn about features and whiz-bang stuff not yet released. What I saw was technology already in cards in 2005. I was hoping for glimpses of what the next generation ATI/nVidia cards have in mind for us.

Some other reporting on screen resolutions, where LCD gaming is going and how it's improving and those sorts of things would have been nice.

But like I said, the article was well done with the screen shots and explanations. I was just looking for a little bit more future stuff.

A good article might be about what Vista means for us hard-core gamers. Good or bad? Do we want an OS hogging up all the graphics horsepower? I'm still on W2K because I despise the resource-hogging, kiddie look of XP. And even in W2K I have all features disabled for optimal performance.
January 18, 2006 5:41:58 PM

Quote:
Yeah. I've read and reread the damn thing and I still can't figure out for sure exactly what is meant by that.

However, what is for certain is that the title of the article is: "ATi Radeon 9700 PRO - Pretender To The Throne"

It's probably a German translation glitch, but damned if it doesn't give the opposite impression it's supposed to.

http://www.tomshardware.com/2002/08/19/ati_radeon_9700_pro_/index.html
Although I haven't even been a READER of THG for terribly long (only since late 2003), and I can't say much for certain, one thing that I've noticed is that even if 99% of their research is excellent, their translation often leaves something to be desired. I had forgotten that the title was phrased that way. :p
January 18, 2006 6:51:52 PM

Quote:
I must say I was highly disappointed in the article. However, I was impressed with your perseverance in making the poll. :p


I hope you weren't in the forum yesterday around 1am Eastern (11pm local), because I must've taken up all the posts on the (display 10 posts) screen and about half of them on the larger views. I had like 12-15 going at once, all with slight variations to get past whatever I thought was the culprit causing the 'error'. When I finally returned to the forum to look at it, I said out loud, Holy F-Uh-K! Spent the next 5 minutes manually deleting each one and confirming deletion. I was worried someone would reply before I could delete them all. :lol:

Now you've spent more time analyzing the article than I have, nice job on that BTW.

Quote:
original article was like that, or whether it was an error in translation.


And that's the thing: Lars/Borsti is German as well, and his reviews were always solid. You might question some sidebar conclusions, but they showed the time/care taken. There was no rush on this article either; it wasn't some launch hardware. I may be being a little overly critical, but it seemed to do the opposite of what I would expect to be its intent: muddying the waters instead of clearing them in their DX8.1/DX9 glory. 8)

Quote:
If memory serves me correctly, Morrowind was not the first game to display shaders; Halo on the Xbox had done so,


I think they may have been focusing on the PC, and Morrowind PC shipped when Morrowind Xbox did. I'm not sure about the programmable shaders in Halo; it may not have shipped as such with the Xbox title, but I wouldn't say for sure. But Morrowind on the PC was a testament to the difference between DX8 and DX8.1. The Xbox version was DX8 because of the GF3 engine. OpenGL makes it more of an issue, due to features sometimes being developed before hardware support (it seems easier with the extensions). Maybe the first time to enable the features? I don't know.



Quote:
nVidia only went to SM 1.3 (DX8.1)


DX8.0 :p 
Remember, ATi had the DX8.1 part with the R8500; that's one of the main reasons I chose the AIW R8500 for Morrowind instead of the Geforces. After seeing the ExtendedPlay review on TechTV, I wanted all the bling; never looked back after that.

Quote:
while ATi unveiled their first SM 2.0 cards, the Radeon 9700/9500 series on the R300. There was also no such thing as SM 2.1; the proper name, as far as I've been able to tell, was "SM 2.0 extended," though I see it alternately referred to as SM 2.1 or 2.0b. Again, this is something that would be fine for normal people, but this is supposed to be a professional article on a professional news site.


Yep I pointed that out in an e-mail to THG. I don't remember seeing PS2.1 on MSDN's D3D info either. :wink:

Quote:
Further confusion arises when we get to discussing SM 3.0. Again, it would be no surprise for an ordinary person, even one with a fairly firm grasp on graphics, to do this. The first omission, which I feel likely can't have been the result of translation, is differentiating between DX 9 and DX 9.0c; that's the difference between SM 2.0 and SM 3.0. Then, although it's not a direct statement, the placement implies that parallax mapping is an SM 3.0 effect, when it can quite clearly be implemented without drawbacks under SM 2.0; F.E.A.R. readily shows this, as well as the upcoming Elder Scrolls IV: Oblivion.


Exactly, this almost seems like a chapter taken from the Cg workbook.

Quote:
All three of those separate claims, as we all know, are very false.


Exactly!

Quote:
even if you only count water that's shaded, even Morrowind's water is transparent,


Yes it is, and on the FX series even more so :lol: 
Morrowind's water is transparent using PS1.4.

Quote:
However, at least in describing the effects of HDR and parallax mapping, the author is much more accurate, though slightly off on one point: normal mapping is the true replacement for bump mapping, as they both alter the way an object is lit (bump mapping simply mimics varied elevation, while normal mapping actually covers angle); parallax mapping simply changes the way the texture is drawn on a surface, providing another tool to accomplish the same effect, but it is still best when combined with normal mapping.


And then there's displacement mapping, or, as ATi and nV do now, virtual displacement mapping. Getting rid of the 'virtual' would be nice, but there seems to be little rush to that point yet.

The only issue I had with the lighting section was saying that specular lighting itself was a function of HDR, which isn't correct either; and to compound it, it was once again said that only SM3.0 could do it.

Quote:
Of course, once we get to the page on F.E.A.R.'s detail levels, we encounter more problems. Equating the Radeon 9800 with the GeForce 3 and 4, and the Radeon X800 with the GeForce FX? Let's slap a "the way it's meant to be played" icon over all the images while we're at it, okay?


And that was the uneasy feeling I got, but only in that last section. Before that I just thought, Oooh, sloppy; but with that final page, OI, it just seemed rather 'directed'.

Quote:
Perhaps the biggest disappointment was a result of the English title: "New 3D Graphics Card Features in 2006." When I saw it the day it was posted, I eagerly opened it, hoping to see perhaps a future glimpse of some features to come in upcoming games that had existed but weren't used, such as sub-surface scattering or radiosity.

I felt the same way. A lot of it was sort of a re-hash; a 'been there, done that on B3D' feel. I was thinking it would be in preparation for the launch of 3DMark, especially since I only saw it appear yesterday while we're all waiting for 3DMark to be downloadable (still waiting, at work).

Quote:
Overall, it seems that while I have considered THG to be the best source for techy data on PC hardware, I've been seeing a disturbing trend, with some of the articles seeming to slip in quality.


That's why that second to last option was there.

Quote:
My first complaint came from the 8th iteration of the VGA charts; I cannot consider it as fair as the previous versions, as while they used version 81.85 of ForceWare, which was quite fresh at the time of the article, they used the thoroughly outdated Catalyst 5.10.


Yep, ATi has these drivers available early on request, but using the nV betas once they become official and not making the effort to at least use the current ATi ones just seems unbalanced. It's annoying when discussing cards too, because nothing's current, but it is the only thing new people to the forum ever seem to use to defend their statements, despite improvements from both sides.

Quote:
Most importantly, this was old enough not to include the major “hotfixes” that had resulted in massive performance increases for the Radeon X1k cards in OpenGL games such as Quake IV. If memory serves me correctly, ATi had Catalyst version 5.12 available around the same time as ForceWare 81.85, if not even before then.


Especially if you are in the industry. Driver improvements like that deserve to be added. It's like doing a hardware comparison the day before the GF7800 launch or the day before the X1800 launch, saying the competition has nothing to respond with, and then not updating the review in light of the new reality. I hate when that happens, regardless of what product it is, because the next refresh is months away and until then it's the high-water mark for the site.

Anywhoo, nice break-down/analysis; thanks for that, it was interesting seeing your thoughts. Also nice to know, with comments like Cleeve's and yours, that I wasn't just being hyper-critical or something.
January 18, 2006 6:58:11 PM

Quote:

GPU fandom is sad enough; I've perused a number of ATi's and nVidia's internal memos and presentations, so I say we need to let THEM be the fanboys/fangirls here, and just draw amusement, and potentially really good deals on video cards, while they're at each other's throats. Business competition is supposed to be good for the customers as well as the media, so there's no reason to toss that away.


And that's the thing: people turn to the articles and to the members of the forum here to be objective and to help them make decisions about where to spend their hard-earned money. I'd hate to have someone think that buying an FX would get them anywhere near X800-level performance.
January 18, 2006 8:51:50 PM

I've been a big fan of THG since 2002, and I've spent years reading some interesting articles, but I still have no idea who TOM is... who the hell is Tom, anyway?
January 18, 2006 9:50:36 PM

Quote:

As for the 9700pro article, I took the title to refer to the GeForce 4 Ti, not the Radeon 9700pro; the way the opening spoke, that's the impression I was given.


Yeah. I've read and reread the damn thing and I still can't figure out for sure exactly what is meant by that.

However, what is for certain is that the title of the article is: "ATi Radeon 9700 PRO - Pretender To The Throne"

It's probably a German translation glitch, but damned if it doesn't give the opposite impression it's supposed to.

http://www.tomshardware.com/2002/08/19/ati_radeon_9700_pro_/index.html

It's an improper translation.

Here's the German version:
http://www2.tomshardware.de/graphic/20020819/index.html

In German the title is "Anspruch auf dem Thron"

Anspruch was mistranslated. The correct translation would be 'claim', not 'pretender'. It's possible that someone confused their expressions when translating.

So it should've been called, "Claim to the Throne".

I knew that German degree would come in handy some day. :D 

I'll take a look at the German version of this other article as well.
January 18, 2006 10:04:19 PM

Quote:
Man, not everybody 'at Tom's' knows as much as you about graphics cards. There are new people coming into the industry every day. Besides that, it's a good thing to take stock of what features you can expect from graphics cards today, so that people who don't have zotts of time to pore over forums and other websites can figure out what some of this stuff means.


Let me restate myself; I meant this:

Quote:
What I saw was technology already in cards in 2005. I was hoping for glimpses of what the next generation ATI/nVidia cards have in mind for us.


OK? But it still would have been nice for them to have waited till the next big thing (DirectX 10) and then looked back; then there would be a nice list of old technology and what we could look forward to.
January 19, 2006 7:35:44 PM

...Because such a large response deserves a response of its own... :D 
Quote:
I hope you weren't in the forum yesterday around 1am Eastern (11pm local), because I must've taken up all the posts on the (display 10 posts) screen and about half of them on the larger views. I had like 12-15 going at once, all with slight variations to get past whatever I thought was the culprit causing the 'error'. When I finally returned to the forum to look at it, I said out loud, Holy F-Uh-K! Spent the next 5 minutes manually deleting each one and confirming deletion. I was worried someone would reply before I could delete them all. :lol:

Now you've spent more time analyzing the article than I have, nice job on that BTW.

Well, luckily for you, I tend to be offline consistently before midnight (Eastern American Standard Time). I'll take your word that it looked impressive, though.

As for the post, it was just the natural result of the combination of my compulsive reading (note the READING, not SKIMMING) and the manner in which I write posts; it surprises me that, writing the way I do, I'm nearing the 10,000-post mark for total posts across the Internet.

Quote:
And that's the thing: Lars/Borsti is German as well, and his reviews were always solid. You might question some sidebar conclusions, but they showed the time/care taken. There was no rush on this article either; it wasn't some launch hardware. I may be being a little overly critical, but it seemed to do the opposite of what I would expect to be its intent: muddying the waters instead of clearing them in their DX8.1/DX9 glory. 8)

Sadly, I wasn't really a very solid reader during that time. However, I've seen some of their reviews, and I do agree that they were much better. And yes, it clearly didn't help too well with what was definitely its goal: to explain all these graphics tricks that have become literal buzzwords all on their own.

Quote:
I think they may have been focusing on the PC, and Morrowind PC shipped when Morrowind Xbox did. I'm not sure about the programmable shaders in Halo; it may not have shipped as such with the Xbox title, but I wouldn't say for sure. But Morrowind on the PC was a testament to the difference between DX8 and DX8.1. The Xbox version was DX8 because of the GF3 engine. OpenGL makes it more of an issue, due to features sometimes being developed before hardware support (it seems easier with the extensions). Maybe the first time to enable the features? I don't know.

Hence why I noted the "on Xbox." The original version used limited shaders; it, too, was effectively limited to cube-mapped water, if memory serves me correctly. I think they may have even used bump-mapping rather than normal-mapping... which would be pure ugh.

However, to be honest, as much as I've been into Morrowind, I never knew that there was a DX 8.1 version of the shaders as well. I never had any SM 1.x cards, only DX7 and earlier, and DX9 and later cards, oddly enough. (I went from a Radeon 7000, to a GeForce FX 5200, to my current Radeon X800XT; leaps and bounds, it has been)

Quote:
DX8.0 :p 

Remember, ATi had the DX8.1 part with the R8500; that's one of the main reasons I chose the AIW R8500 for Morrowind instead of the Geforces. After seeing the ExtendedPlay review on TechTV, I wanted all the bling; never looked back after that.

SM 1.3 was part of DX 8.0? I thought that it was SM 1.0, 1.1, and 1.2 that were DX8.0, supported by the GeForce 3 and 3 Ti, while the GeForce 4 Ti was a DX 8.1 part and supported up to SM 1.3. And, of course, the Radeon 8500 was DX 8.1, and supported up to SM 1.4.

Oh, and after *I* watched some TechTV, I wanted to avoid the channel at all costs, and never looked back after that. :p  (I spend very little time watching TV as it is; I'd rather not use it inefficiently getting stuff I could get online.)

Quote:
Yep I pointed that out in an e-mail to THG. I don't remember seeing PS2.1 on MSDN's D3D info either. :wink:

Yes, I do distinctly recall it simply being labeled as "Shader Model 2.0 Extended." I normally consider it okay to shorten it, but as part of a professional article... *shakes head slowly*

Quote:
Exactly, this almost seems like a chapter taken from the Cg workbook.

Which part? :?

Quote:
Yes it is, and on the FX series even more so :lol: 
Morrowind's water is transparent using PS1.4.

As I said, I never experienced Morrowind's shaders at less than "full," and I never even knew there was more than one setting. I don't personally have the Xbox version (or an Xbox, for that matter) so I wasn't able to look at the water closely enough to tell anything more than it was shaded.

Quote:
And then there's displacement mapping, or, as ATi and nV do now, virtual displacement mapping. Getting rid of the 'virtual' would be nice, but there seems to be little rush to that point yet.

Personally, I still don't see the point of even real displacement mapping; normally, it seems like another way of doing something that's already been done: increasing the polygon count of a model. If I understand it correctly, it does allow for real-time manipulation of the surface's geometry through the pixel shaders, but I've yet to see an application actually do that... and I may be wrong there. I guess part of my opinion on displacement mapping is likely due to me not understanding it as well as I should.

Quote:
The only issue I had with the lighting section was saying that specular lighting itself was a function of HDR, which isn't correct either; and to compound it, it was once again said that only SM3.0 could do it.

If memory serves me correctly, isn't Phong specular shading possible with any card supporting T&L? It allowed for hardware rendering of at least Gouraud shading, so it would reasonably also allow for Phong shading.
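(For reference, and going from memory here, the Phong specular term in question is

    I_s = k_s * (R · V)^n

where R is the light direction reflected about the surface normal, V is the direction to the viewer, and n is the shininess exponent. The fixed-function T&L pipeline evaluates a specular term like this per vertex and then interpolates the colors across the triangle, which is Gouraud shading; true per-pixel Phong, evaluating that formula at every pixel, is where the shader hardware comes in.)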

Quote:
And that was the uneasy feeling I got, but only in that last section. Before that I just thought, Oooh, sloppy; but with that final page, OI, it just seemed rather 'directed'.

Indeed, that was easily the most unsettling part of the entire article.

Quote:
I felt the same way. A lot of it was sort of a re-hash; a 'been there, done that on B3D' feel. I was thinking it would be in preparation for the launch of 3DMark, especially since I only saw it appear yesterday while we're all waiting for 3DMark to be downloadable (still waiting, at work).

Yes, B3D is a very good site, and I find myself going there more often all the time. THG needs to do more if they wish their graphics section to remain even viable in the face of B3D, or else we'll be seeing something closer to a media monopoly here. (AnandTech, in my opinion, just isn't big enough to cover everything important, and most of the other sites, like The Inquirer, are in the "touch with a 10-foot stick" category.)

To be honest, though, I hadn't thought about 3DMark06; had I remembered the launch date at that point, I would've certainly hoped to see something about it, only to be disappointed yet more.

Quote:
That's why that second to last option was there.

Alas, I hadn't really been paying attention here long enough to select that; I thought about it, but went with simply "Ugh!"

Quote:
Yep, ATi has these drivers available early on request, but using the nV betas once they become official and not making the effort to at least use the current ATi ones just seems unbalanced. It's annoying when discussing cards too, because nothing's current, but it is the only thing new people to the forum ever seem to use to defend their statements, despite improvements from both sides.

Yes, the way I see it, a benchmark loses all relevance if it doesn't reflect the cards as actual users can be presumed to be using them. And that would include using the latest drivers (or even the Omega drivers), as hopefully that's been drilled into their heads.

Quote:
Especially if you are in the industry. Driver improvements like that deserve to be added. It's like doing a hardware comparison the day before the GF7800 launch or the day before the X1800 launch, saying the competition has nothing to respond with, and then not updating the review in light of the new reality. I hate when that happens, regardless of what product it is, because the next refresh is months away and until then it's the high-water mark for the site.

Anywhoo, nice break-down/analysis; thanks for that, it was interesting seeing your thoughts. Also nice to know, with comments like Cleeve's and yours, that I wasn't just being hyper-critical or something.

Indeed, it's one of my main pet peeves when it comes to the technology world. I personally love nothing more than to see upheavals in the graphics war, and hence cementing the status quo like that really disappoints me.

At any rate, thank you, and you're certainly welcome for the review of the review. :p 
January 19, 2006 9:41:35 PM

Toms still does articles? ;)  JK

I hadn't seen this one, but wow... not a job well done. The "GF3, GF4, R9800..." and "GF5, X800" bit was inexcusable. I like the concept of the article, but it drastically needs to be corrected. It looks like ole Kinney wrote it.

Anyway, you guys pretty much covered it well, and I agree with all the points made. I'd like to vote "Ugh! & bring back Lars".
January 20, 2006 2:15:17 AM

Hell, Lars can stay in retirement.

All they have to do is post this thread as an article, and they're in business...

Hell, they could pick the best thread out of here every month and post it. Would make for some real good reading.
January 20, 2006 6:58:30 PM

Quote:
...Because such a large response deserves a response of its own... :D 


Yeah, it's amazing; good replies end up growing geometrically, it seems. I'll try to be brief :mrgreen:

Quote:
However, to be honest, as much as I've been into Morrowind, I never knew that there was a DX 8.1 version of the shaders as well. I never had any SM 1.x cards, only DX7 and earlier, and DX9 and later cards, oddly enough. (I went from a Radeon 7000, to a GeForce FX 5200, to my current Radeon X800XT; leaps and bounds, it has been)


Some might consider the FX series the top of the DX8.1 cards (DX9? Ummm...) :tongue:


Quote:
SM 1.3 was part of DX 8.0? I thought that it was SM 1.0, 1.1, and 1.2 that were DX8.0, supported by the GeForce 3 and 3 Ti, while the GeForce 4 Ti was a DX 8.1 part and supported up to SM 1.3. And, of course, the Radeon 8500 was DX 8.1, and supported up to SM 1.4.


Yeah, DX8.0 was up to 1.3 and DX8.1 added PS1.4 support. Hence the GF3 and 4 are only DX8.0.
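(Anyone curious what their own card claims can query the caps directly; a minimal, untested D3D9 sketch:

#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // PixelShaderVersion packs major/minor into the low two bytes.
        printf("PS %lu.%lu\n",
               (caps.PixelShaderVersion >> 8) & 0xFF,
               caps.PixelShaderVersion & 0xFF);
        // PS 1.4+ means a DX8.1-class part like the R8500; GF3/GF4 stop at 1.3.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
            printf("DX8.1-class pixel shaders or better\n");
    }
    d3d->Release();
    return 0;
}

A GF4 Ti should report 1.3 there and an R8500 1.4, which is the whole DX8.0-vs-8.1 fuss in one number.)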

Quote:
Oh, and after *I* watched some TechTV, I wanted to avoid the channel at all costs, and never looked back after that. :p  (I spend very little time watching TV as it is; I'd rather not use it inefficiently getting stuff I could get online.)


The OLD TechTV was great; the NEW TechTV SUCKS, VERY BAD!

Quote:
Yes it is, and on the FX series even more so :lol: 
Morrowind's water is transparent using PS1.4.


Quote:
As I said, I never experienced Morrowind's shaders at less than "full," and I never even knew there was more than one setting.


Funny thing is, if you experienced it on your FX5200 the way I did on my FX Go5200, you experienced the FX series bug of completely transparent water. It looked like air with a transition line and a texture on top, completely different from all other cards (no opacity, just reflection).
It was finally fixed in ForceWare 66.xx, so if you only played it after that driver series you may never have seen this;
http://www.ixbt.com/video/itogi-video/bugs0904/gffx5200-es3mw-bug1-1.jpg

Water looks like this for all others;
http://www.ixbt.com/video/itogi-video/gal0904/r9600-morrowind-1.jpg

Quote:
Personally, I still don't see the point of even real displacement mapping;


Well, it's true that the effects are rarely greatly different on current hardware, but the demo by Matrox was nice, and with displacement mapping you should be able to do more with less code. And that ends up being the rub: will the performance difference or coding difference be enough to justify it? I think eventually we'll go that way anyway, because the VPU performance penalty will drop far enough that any benefit to programs will be worthwhile, but then again, it may end up like n-patches and not go as far as we'd hoped. Displacement maps should get rid of some of the nasty issues with things like Truform in Morrowind, where things were TOO bulgy. Anywhoo, we'll see.
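(The distinction in one line, as I understand it: real displacement mapping actually moves the surface,

    P' = P + s * h(u,v) * N

pushing each vertex/sample point P along its unit normal N by the height sample h scaled by s, so silhouettes, self-occlusion, and shadows all change; the 'virtual' parallax kind only shifts texture lookups in the pixel shader, so the mesh itself never moves.)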

Quote:
If memory serves me correctly, isn't Phong specular shading possible with any card supporting T&L? It allowed for hardware rendering of at least Gouraud shading, so it would reasonably also allow for Phong shading.


Now, for that I'd actually have to check. Not enough time at work to check, but it sounds right; I just don't want to say yes and then find out it was the VS generation instead. Hmm, checking my facts; something the authors/editors might like to get acquainted with. :oops:

Quote:
Yes, B3D is a very good site...


Yep, B3D is among the best; the reviews are a little dry and don't compare across architectures/IHVs much (other than power consumption, it seems), but solid stuff. Anand I don't read unless directed to a specific article by someone; the InQ is nice for the 50/50 rumour, but you need to stay aware of what you're reading. Others would be Xbit, The TechReport, Digit-Life, 3DCenter.de, and [H]ard|OCP, to name a few.

Quote:
At any rate, thank you, and you're certainly welcome for the review of the review. :p 


Always nice to see others expressing well-thought-out posts.

Anywhoo, back to work. Feeling kinda sick, but going skiing tomorrow.
January 23, 2006 6:45:14 PM

Quote:
Toms still does articles? ;)  JK

I hadn't seen this one, but wow... not a job well done. The "GF3, GF4, R9800..." and "GF5, X800" bit was inexcusable. I like the concept of the article, but it drastically needs to be corrected. It looks like ole Kinney wrote it.


What's that supposed to mean :?:
a b U Graphics card
January 23, 2006 8:10:38 PM

Quote:
What's that supposed to mean

Unless you are Game ;) , asking this makes it look like you didn't read the other responses.
a b U Graphics card
January 23, 2006 8:35:13 PM

Quote:

What's that supposed to mean :?:


Registered just to post that?

Hmmmm..... lemme guess who voted the 1 star....
a b U Graphics card
January 23, 2006 8:46:43 PM

Can you guess who voted 5 stars? :D 
January 23, 2006 8:50:30 PM

Quote:
What's that supposed to mean

Unless you are Game ;) , asking this makes it look like you didn't read the other responses.

While I'm sure this Kinney fellow, who must hold such an esteemed and worthy Scottish surname, must be of the finest caliber... I do agree the article sucked.
Though no one of Scottish literary descent could have written it! :evil: 
January 23, 2006 8:51:51 PM

Quote:

What's that supposed to mean :?:


Registered just to post that?

Hmmmm..... lemme guess who voted the 1 star....

What are you talking about, 1 star? :? I really don't know what you are referring to... but whatever it is, it must be my fault... lame!





Edit to add: OK, I'm guessing it's the vote for the OP... I voted 5 stars just to tickle your fancy. Now don't be a jerk and assume anymore. :evil:  Typical tho.
January 24, 2006 3:27:40 AM

Quote:

Edit to add: OK, I'm guessing it's the vote for the OP... I voted 5 stars just to tickle your fancy. Now don't be a jerk and assume anymore. :evil:  Typical tho.


As typical as it may be, it's interesting nonetheless. If I'm mistaken, so be it; the mooning wasn't for the vote but for the logging on to ask the one question on 'behalf' of Kinney.

Personally, I don't care about the star system; it's the first time anyone's ever used it on me, so I thought it funny that it appeared right after your post did.
January 24, 2006 12:33:39 PM

Nope, just a coincidence.

I did see this recently on Drudge though [link].
You are right though, a few people that frequent these forums deserve to be stabbed in the face... but you aren't one.

I'm going back to OCP as it's my home now. I was surprised that the first thread I've checked in at least a year has a reference to Kinney in it.
And the article quality on this site is indeed horrible.
January 24, 2006 7:04:16 PM

Quote:
I did see this recently on Drudge though


Yep, it'll be interesting to see how long they can last as a minority gov't, considering they are going to have to make a coalition out of their previous sworn enemies.

Did you also see who won her seat again? :wink:

It's the people, not the politics, that matter most.

Quote:
I'm going back to OCP as it's my home now. I was surprised that the first thread I've checked in at least a year has a reference to Kinney in it.


I'm surprised it's [H] and not Anand. There have been other posts, but none started by me usually referred to me. I know not to expect you until the G71 launch, if even then. Hopefully the good times have evened your temperament towards the overall picture.

Quote:
And the article quality on this site is indeed horrible.

Definitely not what it used to be; their LCD panel investigations and such are good, but I have to say that compared to [H], the graphics reviews look like a start-up site's. THG used to lead the way; now Digit-Life, [H], FiringSquad, and Xbit offer the most thorough reviews IMO.

Anywhoo, enjoy [H], hope life is treating you well.
January 24, 2006 8:24:58 PM

Quote:

I'm surprised it's [H] and not Anand. There have been other posts, but none started by me usually referred to me. I know not to expect you until the G71 launch, if even then. Hopefully the good times have evened your temperament towards the overall picture.


I currently use an eVGA 7800GT. I think the best bang/buck now, and in the foreseeable future, is 7800GT SLI. It wasn't too long ago that I moved to the GF7, and my inside source told me a bit about the 7900, but nothing will affect the purchase of a 7800GT, it being a 'value card'. Though I wouldn't purchase the GTX today.

Right now, for high end I'm excited about 7800GT SLI; I think it's the ticket. The rest of the market I don't know how much I care to analyze. I like the 6600GT still for lower/midrange, and low end the 9600 Pro.
A single 7800GT makes a nice budget/value high-end choice, with SLI 7800GT being the wisest high-end choice ATM IMO.
I might add another one, but my upgrade cost moving from a single 6800GT to a 7800GT was a total of $50. If I can do another $50 move from the 7800GT to the 7900GT I will.. otherwise I might add another one when I get my 2405FPW (still on the 2005fpw).

High-end pick: 7800GT SLI (will need to be reevaluated when the new cards come out, of course, but currently a very solid choice)

Value high end: 7800GT (probably takes this spot for me because of the opportunity/value of adding a 2nd one, but still a great price/performance choice even if not considering SLI). I like this card and the whole GF7 series because they give GF6 SLI performance with *much* lower power consumption.
That's a biggie for me these days, watching the power consumption of the high-end cards. The PSU increases were getting ridiculous before NV added their mobile GF tech into their desktop parts.

Midrange: 6600GT (a nice card with a fully functional PVP, decent performance, nice price... SLI as well, though I wouldn't recommend that unless a 2nd one could be had very cheaply/free; just move to a 7800GT, which performs like 4 of these)

Low end: 9600 Pro (good old mainstay, tough to beat... I would like to put something with a fully functional PVP here, but I don't know if the plain 6600 would beat it or not... haven't really examined things that well because I don't care so much about this part of the market)

I don't break the market down as much as some people do, like into $50 increments. These are the boundaries that make sense to me and my purchasing habits. With SLI/7800GTX512 and equivalent ATI solutions, I think there needs to be a distinction between "high end" and "budget high end".
Or, a reasonable purchase and an unreasonable purchase :wink:

Quote:

Definitely not what it used to be; their LCD panel investigations and such are good, but I have to say that compared to [H], the graphics reviews look like a start-up site's. THG used to lead the way; now Digit-Life, [H], FiringSquad, and Xbit offer the most thorough reviews IMO.

Anywhoo, enjoy [H], hope life is treating you well.


Ah, and same to you. Life couldn't be better here. Lots of changes have happened, many unfortunate, yet progress is still going forward. I work at a worldwide corporation now and have a year left of university. Learning ASP and Spanish currently for my work and school. Just loving life. Reading a book on the conquest of Peru. Oh, and I'm learning to salsa this summer. :tongue: I hope to travel for the company to South America and Spain after graduation. Plus, Hispanic things are a new part of the American landscape; they are great hardworking, Christian people. I personally openly embrace and encourage the coming changes to the nation. As well as keep our economy rolling for this century.
But that's the gist of my life right now... I am no longer married. She was a nightmare...
but I am closer to God now than ever, and I am happier than I have ever been with the changes I've made to my life. I'm a very happy person. I hope you are too.

On THG articles, I think the forum members would be able to write these articles better than the current THG staff. Even if people worked from their current locations and one person did the testing (Ape on GPUs, Cleeve on CPUs, and so on). I see no reason to visit this site whatsoever other than to pop in the forums as I really like everyone besides 1 or 2 characters, and they disgust me enough that I'd rather just take the high road and be happy elsewhere.
January 24, 2006 9:54:33 PM

Quote:
I like the 6600GT still for lower/midrange,


Well, I'd say for the price the GF6800GS and X800GTO are really attractive, because for gaming over the next year or so the GF6600GT is going to struggle IMO. Nice if you already own one, but the GS and GTO are just great values for the time being.

Quote:
and low end the 9600 Pro.


Well, once again, fine if you already own one, but buying new, the X1300 kinda owns the true low end, AGP or PCIe. The R9600P was good for its time, but now it's not really powerful enough for anything other than the PVR stuff like you mentioned, and for that the R9200/X300/GF6200 would be fine. The X1300 just gives you some nice AVIVO features which may or may not be useful; it also does a little better than the previous cards if you game rarely. I gave away my R9600P for that very reason; my laptop does better now.

Quote:
If I can do another $50 move from the 7800GT to the 7900GT I will.. otherwise I might add another one when I get my 2405FPW (still on the 2005fpw).


I actually thought of you when the Dell 30" was launched. 8)
Not sure about SLi still, when there is potential for single-card solutions, but the 2xGF7800GT has been a good choice over the GTX-512 IMO, and I've recommended it a few times around here since SLi has matured; it's far more available and even sometimes cheaper than the GTX-512. Depending on deals, it brushed up against the price of the X1800XT, so in that case the value is pretty strong, because the 2xGT beats both of those cards in most of the popular games (always an exception here or there).

Quote:
That's a biggie for me these days, watching the power consumption of the high-end cards. The PSU increases were getting ridiculous before NV added their mobile GF tech into their desktop parts.


I agree with you on the PSU strain side of things (I doubt it'll bother me on my next build [I always overbuild the PSU]); overall consumption means little, usually 10-50 cents/month, but having to buy a new PSU sucks, not only for the cost, but because it's another variable you have to account for after having found something reliable. I think my next one will be along the lines of a PCP&C 510, so it should be sufficient.

Quote:
I don't break the market down as much as some people do, like into $50 increments. These are the boundaries that make sense to me and my purchasing habits. With SLI/7800GTX512 and equivalent ATI solutions, I think there needs to be a distinction between "high end" and "budget high end".
Or, a reasonable purchase and an unreasonable purchase :wink:


Funny thing is, I think now is one of the first times in a while that both sides have solid 'overall' choices; however, heading into this future design of pipes versus shaders/ALUs, you're likely going to see cards that do well in one area (X1900 in F.E.A.R. / G71 in Quake4) while doing poorly in another. The video features are also nice, but very buyer-specific. I'd say the only poor choices now are overpriced cards, but in a few months someone may buy the wrong card for the games they play, because the other company's old generation may outperform the current one. We'll see though; it could be completely different and favour one design over another. To guess at it, my money would be on the G71, simply because it's more traditional and the apps that might benefit the alternative aren't really flooding the market yet. Oblivion IMO will be one of those titles to watch because of its different possible complex configurations/options.

Quote:
I personally openly embrace and encourage the coming changes to the nation. As well as keep our economy rolling for this century.


Definitely many changes ahead for you guys in that respect, and for us (with our slightly more Asian influence).

Quote:
I'm a very happy person. I hope you are too.


Good to hear; yep, doing well out here, especially now that ski season is finally starting to take hold.

Quote:
On THG articles, I think the forum members would be able to write these articles better than the current THG staff. Even if people worked from their current locations and one person did the testing (Ape on GPUs, Cleeve on CPUs, and so on).


Yep, Crashman does a damn fine job on his reviews, and to me he's an example of what you say; I just wish they'd give him more work. I hope to see an AIW X1900 review from him soon enough.

Quote:
I see no reason to visit this site whatsoever other than to pop in the forums as I really like everyone besides 1 or 2 characters, and they disgust me enough that I'd rather just take the high road and be happy elsewhere.


I understand; that's the way with any place, the good and the bad, just a question of what the acceptable balance is. I tell you, with the recent influx of n00bs you might not recognize the place. But if I hadn't been here so long already, I probably would've dropped off and focused elsewhere too. I still post here from work, but I (like so many other forum regulars and former regulars) look for my graphics info and insight elsewhere nowadays, B3D being my personal fav, to name one of a select few.

Anywhoo, all the best in the future man!

January 24, 2006 11:04:49 PM

Quote:
What's that supposed to mean

Unless you are Game ;) , asking this makes it look like you didn't read the other responses.

While I'm sure this Kinney fellow, who must hold such an esteemed and worthy Scottish surname, must be of the finest caliber... I do agree the article sucked.
Though no one of Scottish literary descent could have written it! :evil: 
DOH, searching an old username? :oops:  Well, this Kinney fellow was one of a kind and could make me laugh and fume, all at the same time. But he was also well known for taking the NV side of things even if it meant straying from fact to fiction. :wink: Anyway, that (and not the full-of-errors-in-general part) is the reason for the Kinney comment, as reading about the 9800 pro being lumped in with the DX8 cards, and the GF FX series being likened to the X800 series, just made me think of arguments in the past with a guy who would also have his own spin on reality and use it to relentlessly put down ATI hardware and drivers. This Tom's reviewer seemed to share the same low appreciation for ATI hardware with a few of those comments. Only difference: the reviewer probably doesn't know any better, and the Kinney fellow..., well, he probably did; although good luck getting him to admit it. :D 
January 24, 2006 11:18:22 PM

The poll didn't have an "I've got crabs" option :cry: 
January 25, 2006 1:24:29 AM

Quote:
I like the 6600GT still for lower/midrange,


Well, I'd say for the price the GF6800GS and X800GTO are just really attractive, because for gaming over the next year or so the GF6600GT is going to struggle IMO. Nice if you already own one, but the GS and GTO are just great values for the time being.
Aye. I don't know much about the low end anymore. Primarily I look at the 7800GT and above area. I wouldn't go with anything less (because I don't have to, and this is my main hobby, so plenty of cash to spare).
I think you are right though on the GS/GTO. I'm not sure if you can find a GTO/GS as cheaply as a 6600GT.. but either way, good bang/buck.

Quote:
and low end the 9600 Pro.


Well, once again, fine if you already own one, but buying new, the X1300 kinda owns the true low end, AGP or PCIe. The R9600P was good for its time, but now it's not really powerful enough for anything other than the PVR stuff like you mentioned, and for that even the R9200/X300/GF6200 would be fine. The X1300 just gives you some nice AVIVO features which may or may not be useful, and it also does a little better than the previous cards if you game rarely. I gave away my R9600P for that very reason; my laptop does better now.
Not a big fan of AVIVO from the investigation I've done on it perusing the Doom9 forums. Those guys really are experts in that area, and they disapprove for many reasons (that I won't go into, to stop from derailing the thread or starting a fisticuffs match), but they do admit it's a 'nice start'.

Quote:
If I can do another $50 move from the 7800GT to the 7900GT I will.. otherwise I might add another one when I get my 2405FPW (still on the 2005fpw).


I actually thought of you when the Dell 30" was launched. 8)
Not sure about SLi still, when there is potential for single-card solutions, but the 2xGF7800GT has been a good choice over the GTX-512 IMO, and I've recommended it a few times around here since SLi has matured; it's far more available and even sometimes cheaper than the GTX-512. Depending on deals it brushed up against the price of the X1800XT, so in that case the value is pretty strong, because the 2xGT beats both of those cards in most of the popular games (always an exception here or there).
Nice to see we see eye-to-eye on the 7800GT SLI. SLI was mature at launch apart from widescreen support (which is kind of a niche thing anyway); it also had vsync issues (which shouldn't be much of a concern IMO).
No 30" Dell for me, that's too big. I plan on getting a 1080 HDTV in a year or so.. probably a very large LCD.. so if I need larger than 24 inches, I'll hook up to the TV.
I don't foresee myself going any larger, ever. 27" would be nice, but I'm happy with the 1920x1200 resolution (1080 support, plus 1600x1200 support for games that don't support widescreen). It's all I really want to drive with fast video cards; a 27" (or 30" for that matter) is simply going to require SLI to push native res in cutting-edge games, as the quick math below shows.
24" seems larger than average (a status I am used to enjoying in my life, let me tell you :wink: ) and feels like a reasonable place to stop for a desktop monitor, given my reasoning above and the size.

Quote:
That's a biggie for me these days, watching the power consumption of the high-end cards. The PSU increases were getting ridiculous before NV added their mobile GF tech into their desktop parts.


I agree with you on the PSU strain side of things (I doubt it'll bother me on my next build [I always overbuild the PSU]). Overall consumption means little, usually 10-50 cents/month, but having to buy a new PSU sucks, not only for the cost, but because it's another variable you have to account for after having found something reliable. I think my next one will be along the lines of a PCP&C 510, so it should be sufficient.
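For the curious, that 10-50 cents figure is just back-of-the-envelope math like this (the wattage delta, daily hours, and electricity rate are all assumed numbers for illustration, not measurements):

    # Rough monthly electricity cost of a hungrier card vs. a modest one.
    extra_watts = 40        # assumed extra draw of the bigger card under load
    hours_per_day = 3       # assumed daily gaming time
    rate_per_kwh = 0.10     # assumed electricity price in $/kWh

    extra_kwh = extra_watts / 1000 * hours_per_day * 30   # ~3.6 kWh a month
    print(f"~${extra_kwh * rate_per_kwh:.2f}/month")      # ~$0.36 with these numbers

Even doubling any one of those assumptions keeps it well under a dollar a month, which is why the PSU purchase, not the power bill, is the real cost.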

It's not the money for me; it just shows a quality-engineered product. I want to see steps to stop the energy bleeding in all future products. I use a 550 watt Fortron, so no worries on PSU capability here.. I just feel like it's a part of the engineering that, as a consumer, I'm not willing to ignore in a purchase.
Not having energy-optimized parts like the GF7 series (in comparison to the energy-hungry GF6 series) does put quite a bit of extra strain on things while in SLI.

Quote:
I don't break the market down as much as some people do, like $50 increments. These are the boundaries that make sense to me and my purchasing habits. With SLI/7800GTX512 and equivalent ATI solutions, I think there needs to be a distinction between "high end" and "budget high end".
Or, a reasonable purchase, and an unreasonable purchase :wink:


Funny thing is, I think now is one of the first times in a while that both sides have solid 'overall' choices. However, heading into this future design of pipes versus shaders/ALUs, you're likely going to see cards that do well in one area (X1900 in F.E.A.R. / G71 in Quake4) while doing poorly in another. The video features are also nice, but very buyer-specific. I'd say the only poor choices now are overpriced cards, but in a few months someone may buy the wrong card for the games they play because the other company's old generation may outperform that current one. We'll see though; it could be completely different and favour one design over another. To guess at it, my money would be on the G71 simply because it's more traditional, and the apps that might benefit the alternative aren't really flooding the market yet.

I would agree, though I have nothing against ATI now that they too have SM3. Xfire wouldn't be my poison of choice though (compared to SLI).
I want to see their power consumption numbers though. Especially in Xfire compared to SLI (where it truly starts to matter).

Quote:
On THG articles, I think the forum members would be able to write these articles better than the current THG staff. Even if people worked from their current locations and one person did the testing (Ape on GPUs, Cleeve on CPUs, and so on).


Yep, Crashman does a damn fine job on his reviews, and to me he's an example of what you say; I just wish they'd give him more work. I hope to see an AIW X1900 review from him soon enough.
I was not aware. I will be looking into these articles.

Quote:
I see no reason to visit this site whatsoever other than to pop in the forums as I really like everyone besides 1 or 2 characters, and they disgust me enough that I'd rather just take the high road and be happy elsewhere.


I understand; that's the way with any place, the good and the bad, just a question of what the acceptable balance is. I tell you, with the recent influx of n00bs you might not recognize the place. But if I hadn't already been here so long I probably would've dropped off and focused elsewhere too.
Yep, I'd come back if I had my original acct back too. But that was stripped from me, so I've found refuge in others' arms :wink:
Both AT/OCP, FYI. I've made some very close friends at AT, but OCP can be more informative on the technical side of things (BIOS versions, etc.) in my experience.
We get a lot of B3D "outcasts" at AT, who were banned for being too "pro-NV".
The guy who runs B3D stepped into AT a few times, and people tried to be respectful, but he got torn up. He doesn't have all-encompassing power at AT :p  And no ATI engineers cheerleading.

Quote:
Well, this Kinney fellow was one of a kind and could make me laugh and fume, all at the same time.

8)
a b U Graphics card
January 25, 2006 2:17:07 AM

Quote:

Not a big fan of AVIVO from the investigation I've done on it perusing the Doom9 forums. Those guys really are experts in that area, and they disapprove for many reasons (that I won't go into, to stop from derailing the thread or starting a fisticuffs match), but they do admit it's a 'nice start'.


I respect the guys at D9, and I can see where they'd have criticism: for decoding it's the best non-dedicated option right now, while for encoding it still has a ways to go. But for the low end, anything that helps is usually a benefit. As for the D9 crowd, they, like me, have rigs built for their editing, so it's less impactful for them. So as an HTPC part it's a step in the right direction in my opinion; beyond that I doubt anyone would care about AVIVO.

Quote:
It's not the money for me; it just shows a quality-engineered product. I want to see steps to stop the energy bleeding in all future products.


I think neither company will make that effort; they are both looking for the top spot in other areas. Only VIA, with their Eden and integrated graphics solutions, seems to really care a lot about performance vs. power consumption, and even then it's for the low end of the market. I would expect that any developments in that area will come from the low-end and integrated market. Usually an efficient design simply means they can try to push it even further. Which do you think the two of them would choose, 700MHz at 55 watts or 800MHz at 100 watts? That's when your view on SLi becomes very attractive, because you hit that wall of diminishing returns.
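Those two hypothetical clock/wattage pairs make the diminishing returns easy to see; here's a quick sketch (the operating points are just the illustrative numbers from above, not real parts):

    # Performance-per-watt for the two hypothetical operating points.
    configs = {"conservative": (700, 55), "pushed": (800, 100)}  # (MHz, watts)
    for name, (mhz, watts) in configs.items():
        print(f"{name}: {mhz / watts:.1f} MHz per watt")
    # conservative: 12.7 MHz per watt
    # pushed: 8.0 MHz per watt

The pushed part buys roughly 14% more clock for about 82% more power, which is the wall of diminishing returns in a nutshell.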

Quote:
I would agree. Though I have nothign against ATI now that they too have SM3. Xfire wouldnt be my poison of choice though (compared to SLI).
I want to see their power consumption numbers though. Esp in Xfire compared to SLI (where it truley starts to matter).


Well, there are a few reviewers out there who test this. For example, in TechReport's X1900 review they have the GT and the XT in SLi/Xfire, plus the newer cards as well;
http://techreport.com/reviews/2006q1/radeon-x1900/index.x?pg=13

What will be interesting is seeing the move to the next-gen unified shader design, and whether there are power efficiencies to be had there. Increased power consumption means increased overall heat, and depending on how well that's dissipated, it can lead to increased temperatures, and it definitely demands better means of removing that extra heat. So there do need to be some developments on squeezing more performance out of these parts without resorting to exotic or gigantic cooling solutions.

Oh yeah, and funny you should mention the SM3 on ATi, because the more I see of its current primary implementation (HDR) on either ATi or nV hardware ([H]'s review simply reminded me of this and drove it home further), the more I feel it's still just a checkbox feature for both (as is AA with HDR [between the two I'd prefer better AA over HDR]). It seems like many developers have found a new toy and are still trying to find ways of using it as more than just an "Oooh, look at this.... now back to the game" type of feature. I think there's potential for it, I always have, but right now it still seems to be used like virtual glitter.
January 25, 2006 2:41:22 AM

AVIVO is nice.. it's a nice "freebie".. but it's by no means the professional's choice. It's not exactly the HTPC dream some thought it would be either. That said, I respect its addition to ATI cards.



I don't expect power consumption to become a primary priority. I mean smart changes like implementing mobile tech (like NV did with the GF7 desktop parts).
That just makes sense, doesn't slow things down, and is just wisely integrating your R&D. I want to see that. I buy a product after evaluating the whole, not just the top benches.

One thing I concede openly to ATI: they have EXR+AA.
:arrow: NV needs this in the G71.

But like your stance on power consumption, I find this nearly a moot point as well, since current cards are hardly going to push EXR+AA without Xfire or SLI.. even then..

So I think NV lucked out on that one. And I'd also say ATI got by without implementing SM3 till the X1Ks.
Both kinda squeezed by there (or Nvidia will squeeze by here with the G71, I predict).


I enjoyed the power consumption link with the X1900s. Nvidia does extremely well there; look at my 7800GT :) 
Can't wait to see what NV does once they're on 90nm at the high end like ATI, on top of their mobile tech. 8O !