3D Graphics Card Feature Article.

What did you think of the 3D Graphics Feature Article?

  • Good Article, Great Pics, wish I could get more.

    Votes: 6 24.0%
  • OK article, nothing special

    Votes: 8 32.0%
  • No opinion

    Votes: 1 4.0%
  • Ugh!

    Votes: 4 16.0%
  • Old Reviews were better, Lars / Borsti come back!

    Votes: 6 24.0%
  • Who is Lars? Who is this Tom guy for that matter?

    Votes: 0 0.0%

  • Total voters
    25
Surprisingly, I didn't notice this article until today:

http://www.tomshardware.com/2006/01/13/new_3d_graphics_card_features_in_2006/

I'd like to get the forum's opinions on it (especially from the veterans).

Also, check out my poll!


POLL, not pole, ya perv!


--
Dang, that was hard to make (very word-sensitive!). It gave me dozens of post errors, and after each one it posted a truncated poll. I had to delete about 12 copies of this. :x
 

cleeve

Illustrious
Wow... wow. Come on, guys. There is some notably bad info in there:

The article suggests that the first DirectX 8 cards were the GeForce 4 & Radeon 9600?
Nope. The Radeon 8500 and GeForce 3 were the first DirectX 8 cards.
The Radeon 9600 was a DirectX 9 card... the GeForce 4 was DirectX 8, though.


Mostly I take issue with the suggestion that you need an X1x00 or GeForce 7 to see decent shader effects. A lot of similar, if not identical, shader effects can be done in plain DirectX 9 SM 2.0, and don't require SM 3.0 on the newest X1x00 or GeForce 7 cards.

GeForce 6 and Radeon 9550 and up can do DirectX 9. Even the GeForce FX series can do many of those things. Check out titles like Doom 3 and Half-Life 2; there are lots of shader effects in there.

Any true DX9 card - i.e., Radeon 9550 and up; not sure about the GeForce FX series - can do HDR. It's not the SM 3.0 method, but it works just fine in Half-Life. Why no Half-Life HDR demo of Lost Coast?
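For anyone wondering what the non-SM 3.0 method actually amounts to: it's basically per-pixel tone-mapping math, well within SM 2.0's abilities. A rough sketch in plain C standing in for the pixel shader (the Reinhard-style operator is my own illustrative pick, not Valve's actual code):

    /* Sketch of SM 2.0-style HDR: compute luminance above 1.0, then
       compress it into the displayable [0,1] range per pixel. */
    #include <stdio.h>

    static float tonemap_reinhard(float hdr, float exposure)
    {
        float scaled = hdr * exposure;    /* apply scene exposure    */
        return scaled / (1.0f + scaled);  /* roll off the highlights */
    }

    int main(void)
    {
        /* Values well above 1.0 still land below 1.0 after mapping. */
        float samples[] = { 0.25f, 1.0f, 4.0f, 16.0f };
        for (int i = 0; i < 4; i++)
            printf("%6.2f -> %.3f\n", samples[i],
                   tonemap_reinhard(samples[i], 1.0f));
        return 0;
    }

Bright values get rendered above 1.0 and squeezed into displayable range at the end; no FP16 blending required, which is the whole point.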

Just seemed a little too pushy for the next gen cards. Or maybe I'm just hypersensitive, I dunno.
 

Flakes

Distinguished
Dec 30, 2005
1,868
0
19,790
Ugh... was my vote.

I didn't quite get the point of the article. I thought I might have missed something, but nope, I just read it again. Given how fast cards and DirectX versions are being made, wouldn't it have been better for them to wait until DirectX 10 and then list all the major changes? DirectX 9 has been around for a while now, and most of the people at Tom's already know the difference between them (8 & 9).

P.S.

How about an article comparing a 4400+ X2 overclocked to 4800+ X2 speeds against an actual 4800+ X2, to see if we actually get the expected performance?

And a couple of top-of-the-range Intel vs. AMD comparisons, because for some reason you always seem to go for similarly priced models. Yes, that shows us the best bang for the buck, but not which is more powerful. :)

Or, something you never see in the sound articles: how the thing actually sounds. Just because the numbers are better and it should sound better doesn't mean it does sound better... know what I mean?
 

Samsa

Distinguished
Dec 11, 2005
109
0
18,680
Man, not everybody 'at Tom's' knows as much as you about graphics cards. There are new people coming into the industry every day. Besides that, it's a good thing to take stock of what features you can expect from graphics cards today, so that people who don't have zotts of time to pore over forums and other websites can figure out what some of this stuff means.

Though the DX8 information may not be dead-on, it's kind of a moot point, considering no one is going to buy a DX8 card anymore.

Overall, despite a few shortcomings, I think this was a nice summary of current graphics card technology and trends in the industry.

Edit: And dude, this article has been on the front page for like a week. :lawl:

Edit 2: What did Lars and Borsti do better? I didn't read Tom's when they were around.
 
Wow... wow. Come on, guys. There is some notably bad info in there:

The article suggests that the first DirectX 8 cards were the GeForce 4 & Radeon 9600?
Nope. The Radeon 8500 and GeForce 3 were the first DirectX 8 cards.
The Radeon 9600 was a DirectX 9 card... the GeForce 4 was DirectX 8, though.

Yeah, that struck me as odd too. Also interchanging PS2.0 with DX8 and PS3.0 with DX9, and having the GF3/4 equated to the R9800, but the GF5 (aka FX) equated to the X800?



Any true DX9 card - i.e., Radeon 9550 and up; not sure about the GeForce FX series - can do HDR. It's not the SM 3.0 method, but it works just fine in Half-Life. Why no Half-Life HDR demo of Lost Coast?

Exactly! Maybe not OpenEXR as we've talked about, but that's not the only method of HDR. Heck, rthdribl has HDR in its name, and it was the first true DX9 app IMO. Funny, I read your 'R9550 and up' and thought about the R9500 being omitted, but you're talking performance, and my brain instantly went to digits.


Just seemed a little too pushy for the next gen cards. Or maybe I'm just hypersensitive, I dunno.

Also seemed somewhat vendor-specific as to what's a benefit (no mention of vertex shader advantages like geometry instancing) and which games to investigate (TWIMTBP?), but that may be my hype-sensitivity, similar to yours. I know you AGP guys are afraid of the next gen, luddites!


8)
 
Though the DX8 information may not be dead-on, it's kind of a moot point, considering no one is going to buy a DX8 card anymore.

Well, if they call the R9500-9800 and X800 series DX8, then many people will.
If the information about the differences is wrong, then people may also think they NEED to upgrade, when in fact their old card is fine.

A little edit for cleanliness; no need for the last part in reply to a jest. I left my sense of humour at home today, which is rare.
 

Samsa

Distinguished
Dec 11, 2005
109
0
18,680
Heh, I see your point. 8)

So... I guess I approve of the concept of the article... but the quality appears to be lacking.

Edit: I was just teasing you, no offense intended.
 

cleeve

Illustrious
...many ignore them altogether nowadays.

You know, it's funny that you mentioned that.

I came to this forum because a few years ago, I would go to Tom's homepage every day. Just the natural thing to do.

Now I rarely visit the homepage. In my opinion it seemed that they started spinning articles in an odd way.

Then they took the "high road" on some graphics card reviews by not reviewing them at all... for weeks after their competition did.

Yes, that's the high road, but this is a news site. Once news is old, it's no longer news. You used to be able to count on Tom's to have the first, or one of the first, unbiased reviews.

And bias... to this day, I do not understand why the 9700 PRO review called it the "pretender to the throne" when it beat the living snot out of its competitor, the GeForce 4 Ti. Maybe I just don't understand the reference, I dunno...

Regardless, over time I started relying on Xbit and the Inquirer for my news tidbits. Tom's still has good reference articles, especially the VGA charts, but it's just not the king of graphics news anymore, IMHO....

... and these quasi-accurate articles are not helping their rep.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
I must say I was highly disappointed in the article. However, I was impressed with your perseverance in making the poll. :p

Know that I have no idea what language the article was originally written in, so I apologize for any criticisms that might not apply to the actual article itself but rather to an improper translation; say, if the article was originally German and poorly translated into English, which, given some of the grammatical awkwardness I see, I am almost certain of. Still, I can't tell whether something is the way it is because the original article was like that, or whether it was an error in translation.

At any rate, the people above have already covered the point about confusing which cards supported which SM. Normally this can be forgiven for most people (having talked with a lot of people, a lot, about graphics cards, it's a common mistake), but I would've expected more out of an article on THG, which I've held as the gold standard for hardware (though I'm now having doubts).

If memory serves me correctly, Morrowind was not the first game to display shaders; Halo on the Xbox had done so, and if we even count OpenGL (though unclear, I interpreted the wording of the article to specifically mean shaders under D3D, not OGL), Quake III Arena was arguably the first game to implement pixel shaders. Of course, Microsoft held back a PC port of Halo until they had turned it into the first SM 2.0 (DX9) game.

Unfortunately, the article fails to immediately get more accurate; the comment on the development of shader models is incorrect as well. In 2002, nVidia only went to SM 1.3 (DX8.1), while ATi unveiled their first SM 2.0 cards, the Radeon 9700/9500 series on the R300. There was also no such thing as SM 2.1; the proper name, as far as I've been able to tell, was "SM 2.0 extended," though I see it alternately referred to as SM 2.1 or 2.0b. Again, this is something that would be fine for normal people, but this is supposed to be a professional article on a professional news site.

Further confusion arises when we get to the discussion of SM 3.0. Again, it would be no surprise if this were an ordinary person, even one with a fairly firm grasp on graphics. The first omission, which I feel likely can't have been the result of translation, is failing to differentiate between DX9 and DX9.0c; that's the difference between SM 2.0 and SM 3.0. Then, although it's not a direct statement, the placement implies that parallax mapping is an SM 3.0 effect, when it can quite clearly be implemented without drawbacks under SM 2.0; F.E.A.R. readily shows this, as does the upcoming Elder Scrolls IV: Oblivion.

Although not as significant as the others, it should be noted that the common misconception that "Bullet Time" first appeared in The Matrix is FALSE; The Matrix was merely the first appearance to truly popularize it. The first film to use it was the 1981 film Kill and Kill Again, (Wikipedia Link) and it appeared in other places as well before 1999. The first actual CG version debuted in a project known as The Campanile Movie, by Paul Debevec, and the work there was directly drawn upon in making the scenes in The Matrix.

Once we get past the screenshots and back into the article, the problems resume, unfortunately. Right off, my earlier suspicion was confirmed when they claimed that without SM 3.0, you cannot have HDR, parallax mapping, or transparent water. All three of those separate claims, as we all know, are very false. A number of SM 2.0 demos and games demonstrate HDR (including Masa's rthdribl and Debevec's RNL). As for parallax mapping, there's my comment about F.E.A.R.; the parallax mapping works perfectly fine with a Radeon X-series card. And lastly, the water comment is laughable; even if you only count water that's shaded, even Morrowind's water is transparent, as is that of other games, such as Far Cry, and all under SM 2.0. (Age of Empires 3 really cannot be considered a good demonstration of the capabilities of SM 3.0, given the inherent conflict of interest, as it was produced by the same company that developed SM 3.0.) Again, we also see what appear to be some odd confusions that seem to imply that SM 2.0 is part of DirectX 8 and SM 3.0 is part of DirectX 9.

However, at least in describing the effects of HDR and parallax mapping, the author is much more accurate, though they're slightly off on one point: normal mapping is the true replacement for bump mapping, as both alter the way an object is lit (bump mapping simply mimics varied elevation, while normal mapping actually covers the surface angle); parallax mapping simply changes the way the texture is drawn on a surface, and provides another tool to accomplish the same effect, but it is still best when combined with normal mapping.
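To make the parallax point concrete: the classic parallax-mapping step is nothing more than shifting the texture coordinate along the view direction by a height-map sample, which is trivially within SM 2.0's reach. A rough sketch in plain C (the names and constants are my own illustration, not code from any of the games mentioned):

    #include <stdio.h>

    typedef struct { float x, y; } vec2;
    typedef struct { float x, y, z; } vec3;

    /* uv: texture coordinate; view_ts: normalized view vector in tangent
       space; height: height-map sample in [0,1]. */
    static vec2 parallax_offset(vec2 uv, vec3 view_ts, float height,
                                float height_scale)
    {
        float h = (height - 0.5f) * height_scale;  /* recenter the height */
        vec2 shifted = { uv.x + view_ts.x * h,     /* slide the lookup    */
                         uv.y + view_ts.y * h };   /* along the view dir  */
        return shifted;
    }

    int main(void)
    {
        vec2 uv = { 0.50f, 0.50f };
        vec3 view = { 0.60f, 0.00f, 0.80f };       /* already normalized  */
        vec2 p = parallax_offset(uv, view, 0.9f, 0.04f);
        printf("(%.3f, %.3f) -> (%.3f, %.3f)\n", uv.x, uv.y, p.x, p.y);
        return 0;
    }

The lighting still comes from the normal map afterward; parallax only decides where on the texture you look.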

Of course, once we get to the page on F.E.A.R.'s detail levels, we encounter more problems. Equating the Radeon 9800 with the GeForce 3 and 4, and the Radeon X800 with the GeForce FX? Let's slap a "the way it's meant to be played" icon over all the images while we're at it, okay?

In the end, my conclusion is that the article was far less than professional; I'd like to believe that this was entirely due to lossy translation, but it is quite clear that many, if not the solid majority, of the major flaws are not the result of such.

Perhaps the biggest disappointment was a result of the English title, "New 3D Graphics Card Features in 2006." When I saw it the day it was posted, I eagerly opened it, hoping to see perhaps a glimpse of some features to come in upcoming games that had existed but weren't yet used, such as sub-surface scattering or radiosity. I was thoroughly disappointed to find it instead gives a flawed, and potentially slanted, review of the top shaders used in games in 2005. The title was very misleading; something like "All the Latest Shaders at Work" might've been better for it.

Overall, it seems that while I have considered THG to be the best source for techy data on PC hardware, I've been seeing a disturbing trend, with some of the articles seeming to slip in quality. My first complaint came from the 8th iteration of the VGA charts; I cannot consider it as fair as the previous versions, as while they used version 81.85 of ForceWare, which was quite fresh at the time of the article, they used the thoroughly outdated Catalyst 5.10. Most importantly, this was old enough not to include the major "hotfixes" that had resulted in massive performance increases for the Radeon X1k cards in OpenGL games such as Quake IV. If memory serves me correctly, ATi had Catalyst version 5.12 available around the same time as ForceWare 81.85, if not even before then.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
You know, it's funny that you mentioned that.

I came to this forum because a few years ago, I would go to Tom's homepage every day. Just the natural thing to do.

Now I rarely visit the homepage. In my opinion it seemed that they started spinning articles in an odd way.

Then they took the "high road" on some graphics card reviews by not reviewing them at all... for weeks after their competition did.

Yes, that's the high road, but this is a news site. Once news is old, it's no longer news. You used to be able to count on Tom's to have the first, or one of the first, unbiased reviews.

And bias... to this day, I do not understand why the 9700 PRO review called it the "pretender to the throne" when it beat the living snot out of its competitor, the GeForce 4 Ti. Maybe I just don't understand the reference, I dunno...

Regardless, over time I started relying on Xbit and the Inquirer for my news tidbits. Tom's still has good reference articles, especially the VGA charts, but it's just not the king of graphics news anymore, IMHO....

... and these quasi-accurate articles are not helping their rep.
Indeed, I must agree with you. As I mentioned above, it's a disturbing trend to see the quality of the articles go down, and also to see the site become less and less the best source for the most up-to-date information; I often see things omitted entirely. And the slant that seems to be appearing in some articles is quite easily the strongest source of the quality loss.

As for the 9700pro article, I took the title to refer to the GeForce 4 Ti, not the Radeon 9700pro; the way the opening spoke, that's the impression I was given. I would be seriously appalled to find that I was wrong here. Graphics-chip favoritism is even worse than die-hard soda favoritism; at least there are substantial enough differences between Coke and Pepsi to merit a strong preference.

GPU fandom is sad enough; I've perused a number of ATi's and nVidia's internal memos and presentations, so I say we let THEM be the fanboys/fangirls, and just draw amusement (and potentially really good deals on video cards) while they're at each other's throats. Business competition is supposed to be good for the customers as well as the media, so there's no reason to toss that away.
 

cleeve

Illustrious
As for the 9700pro article, I took the title to refer to the GeForce 4 Ti, not the Radeon 9700pro; the way the opening spoke, that's the impression I was given.

Yeah. I've read and reread the damn thing and I still can't figure out for sure exactly what is meant by that.

However, what is for certain is that the title of the article is: "ATi Radeon 9700 PRO - Pretender To The Throne"

It's probably a German translation glitch, but damned if it doesn't give the opposite impression it's supposed to.

http://www.tomshardware.com/2002/08/19/ati_radeon_9700_pro_/index.html
 

p05esto

Distinguished
Jun 11, 2001
876
1
18,980
I personally liked reading the article and go for anything 3D/graphics related.

But the headline of the article had me believing I would learn about features and whiz-bang stuff not yet released. What I saw was technology already in cards in 2005. I was hoping for glimpses of what the next generation ATi/nVidia cards have in mind for us.

Some other reporting on screen resolutions, where LCD gaming is going and how it's improving, and those sorts of things would have been nice.

But like I said, the article was well done with the screen shots and explanations. I was just looking for a little bit more future stuff.

A good article might be about what Vista means for us hardcore gamers. Good or bad? Do we want an OS hogging up all the graphics horsepower? I'm still on W2K because I despise the resource-hogging, kiddie look of XP. And even in W2K I have all such features disabled for optimal performance.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
Yeah. I've read and reread the damn thing and I still can't figure out for sure exactly what is meant by that.

However, what is for certain is that the title of the article is: "ATi Radeon 9700 PRO - Pretender To The Throne"

It's probably a German translation glitch, but damned if it doesn't give the opposite impression it's supposed to.

http://www.tomshardware.com/2002/08/19/ati_radeon_9700_pro_/index.html
Although I haven't even been a READER of THG for terribly long (only since late 2003), I can't say much for certain, but one thing that I've noticed is that even if 99% of their research is excellent, their translation often leaves something to be desired. I had forgotten that the title was phrased that way. :p
 
I must say I was highly disappointed in the article. However, I was impressed with your perseverance in making the poll. :p

I hope you weren't in the forum yesterday around 1 AM Eastern (11 PM local), because I must've taken up all the posts on the (display 10 posts) screen and about half of them on the larger views. I had like 12-15 going at once, all with slight variations, to get past whatever I thought was the culprit causing the 'error'. When I finally returned to the forum to look at it, I said out loud, Holy F-Uh-K! I spent the next 5 minutes manually deleting each one and confirming the deletion. I was worried someone would reply before I could delete them all. :lol:

Now you've spent more time analyzing the article than I have; nice job on that, BTW.

original article was like that, or whether it was an error in translation.

And that's the thing: Lars/Borsti is German as well, and his reviews were always solid. You might question some sidebar conclusions, but they showed the time and care taken. There was no rush on this article either; it wasn't some launch hardware. I may be being a little overly critical, but it seemed to do the opposite of what I would expect to be its intent: muddying the waters instead of clearing them in their DX8.1/DX9 glory. 8)

If memory serves me correctly, Morrowind was not the first game to display shaders; Halo on the Xbox had done so,

I think they may have been focusing on the PC, and Morrowind PC shipped as Morrowind Xbox did. I'm not sure about the programmable shaders in Halo; it may not have shipped as such with the Xbox title, but I couldn't say. But Morrowind on the PC was a testament to the difference between DX8 and DX8.1. The Xbox version was DX8 because of the GF3 engine. OpenGL makes it more of an issue due to the development of features sometimes preceding the support of hardware (it seems easier with the extensions). Maybe the first title to enable the features? I don't know.



nVidia only went to SM 1.3 (DX8.1)

DX8.0 :p
Remember, ATi had the DX8.1 part with the R8500; that's one of the main reasons I chose the AIW R8500 for Morrowind instead of the GeForces. After seeing the ExtendedPlay review on TechTV, I wanted all the bling, and never looked back.

while ATi unveiled their first SM 2.0 cards, the Radeon 9700/9500 series on the R300. There was also no such thing as SM 2.1; the proper name, as far as I've been able to tell, was "SM 2.0 extended," though I see it alternately referred to as SM 2.1 or 2.0b. Again, this is something that would be fine for normal people, but this is supposed to be a professional article on a professional news site.

Yep, I pointed that out in an e-mail to THG. I don't remember seeing PS2.1 in MSDN's D3D info either. :wink:

Further confusion arises when we get to the discussion of SM 3.0. Again, it would be no surprise if this were an ordinary person, even one with a fairly firm grasp on graphics. The first omission, which I feel likely can't have been the result of translation, is failing to differentiate between DX9 and DX9.0c; that's the difference between SM 2.0 and SM 3.0. Then, although it's not a direct statement, the placement implies that parallax mapping is an SM 3.0 effect, when it can quite clearly be implemented without drawbacks under SM 2.0; F.E.A.R. readily shows this, as does the upcoming Elder Scrolls IV: Oblivion.

Exactly, this almost seems like a chapter taken from the Cg workbook.

All three of those separate claims, as we all know, are very false.

Exactly!

even if you only count water that's shaded, even Morrowind's water is transparent,

Yes it is, and on the FX series even more so :lol:
Morrowind's water is transparent using PS1.4.

However, at least in describing the effects of HDR and parallax mapping, the author is much more accurate, though they're slightly off on one point: normal mapping is the true replacement for bump mapping, as both alter the way an object is lit (bump mapping simply mimics varied elevation, while normal mapping actually covers the surface angle); parallax mapping simply changes the way the texture is drawn on a surface, and provides another tool to accomplish the same effect, but it is still best when combined with normal mapping.

And then there's displacement mapping, or, as ATi and nV do now, virtual displacement mapping. Getting rid of the 'virtual' would be nice, but there seems to be little focus on, or rush toward, that point yet.

The only issue I had with the lighting section was saying that specular lighting itself was a function of HDR, which isn't correct either; and to compound it, it was once again said that only SM3.0 could do it.

Of course, once we get to the page on F.E.A.R.'s detail levels, we encounter more problems. Equating the Radeon 9800 with the GeForce 3 and 4, and the Radeon X800 with the GeForce FX? Let's slap a "the way it's meant to be played" icon over all the images while we're at it, okay?

And that was the uneasy feeling I got, but only in that last section. Before that I just thought, 'Oooh, sloppy,' but with that final page, OI, it just seemed rather 'directed'.

Perhaps the biggest disappointment was a result of the English title, "New 3D Graphics Card Features in 2006." When I saw it the day it was posted, I eagerly opened it, hoping to see perhaps a glimpse of some features to come in upcoming games that had existed but weren't yet used, such as sub-surface scattering or radiosity.

I felt the same way. A lot of it was sort of a rehash; kind of a 'been there, done that on B3D' feel. I was thinking it would be in preparation for the launch of 3DMark, especially since I only saw it appear yesterday, while we're all waiting for 3DMark to be downloadable (still waiting, at work).

Overall, it seems that while I have considered THG to be the best source for techy data on PC hardware, I've been seeing a disturbing trend, with some of the articles seeming to slip in quality.

That's why that second to last option was there.

My first complaint came from the 8th iteration of the VGA charts; I cannot consider it as fair as the previous versions, as while they used version 81.85 of ForceWare, which was quite fresh at the time of the article, they used the thoroughly outdated Catalyst 5.10.

Yep, ATi has these drivers available early on request, but using the nV betas once they become official and not making the effort to at least use the current ATi ones just seems unbalanced. It's annoying when discussing cards, too, because nothing's current, but it is the only thing new people to the forum ever seem to use to defend their statements, despite improvements from both sides.

Most importantly, this was old enough not to include the major "hotfixes" that had resulted in massive performance increases for the Radeon X1k cards in OpenGL games such as Quake IV. If memory serves me correctly, ATi had Catalyst version 5.12 available around the same time as ForceWare 81.85, if not even before then.

Especially if you are in the industry. Driver improvements like that deserve to be added. It's like doing a hardware comparison the day before the GF7800 launch or the day before the X1800 launch, saying the competition has nothing to respond with, and then not updating the review in light of the new reality. I hate when that happens, regardless of the product, because the next refresh is months away and until then it's the high-water mark for the site.

Anywhoo, nice breakdown/analysis; thanks for that, it was interesting seeing your thoughts. Also nice to know, with comments like Cleeve's and yours, that I wasn't just being hypercritical or something.
 
GPU fandom is sad enough; I've perused a number of ATi's and nVidia's internal memos and presentations, so I say we let THEM be the fanboys/fangirls, and just draw amusement (and potentially really good deals on video cards) while they're at each other's throats. Business competition is supposed to be good for the customers as well as the media, so there's no reason to toss that away.

And that's the thing: people turn to the articles and to the members of the forum here to be objective and to help them make decisions about where to spend their hard-earned money. I'd hate to have someone think that buying an FX would get them anywhere near X800-level performance.
 

Samsa

Distinguished
Dec 11, 2005
109
0
18,680
As for the 9700pro article, I took the title to refer to the GeForce 4 Ti, not the Radeon 9700pro; the way the opening spoke, that's the impression I was given.

Yeah. I've read and reread the damn thing and I still can't figure out for sure exactly what is meant by that.

However, what is for certain is that the title of the article is: "ATi Radeon 9700 PRO - Pretender To The Throne"

It's probably a German translation glitch, but damned if it doesn't give the opposite impression it's supposed to.

http://www.tomshardware.com/2002/08/19/ati_radeon_9700_pro_/index.html

It's an improper translation.

Here's the German version:
http://www2.tomshardware.de/graphic/20020819/index.html

In German the title is "Anspruch auf dem Thron"

Anspruch was mistranslated. The correct translation should be 'claim', not 'pretender'. It's possible that someone confused their expressions when translating.

So it should've been called, "Claim to the Throne".

I knew that German degree would come in handy some day. :D

I'll take a look at the German version of this other article as well.
 

Flakes

Distinguished
Dec 30, 2005
1,868
0
19,790
Man, not everybody 'at Tom's' knows as much as you about graphics cards. There are new people coming into the industry every day. Besides that, it's a good thing to take stock of what features you can expect from graphics cards today, so that people who don't have zotts of time to pore over forums and other websites can figure out what some of this stuff means.

Let me restate myself; I meant this:

What I saw was technology already in cards in 2005. I was hoping for glimpses of what the next generation ATi/nVidia cards have in mind for us.

OK? But still, it would have been nice for them to have waited until the next big thing (DirectX 10) and then looked back; then there would be a nice list of old technology and what we could look forward to.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
...Because such a large response deserves a response of its own... :D
I hope you weren't in the forum yesterday around 1 AM Eastern (11 PM local), because I must've taken up all the posts on the (display 10 posts) screen and about half of them on the larger views. I had like 12-15 going at once, all with slight variations, to get past whatever I thought was the culprit causing the 'error'. When I finally returned to the forum to look at it, I said out loud, Holy F-Uh-K! I spent the next 5 minutes manually deleting each one and confirming the deletion. I was worried someone would reply before I could delete them all. :lol:

Now you've spent more time analyzing the article than I have; nice job on that, BTW.
Well, luckily for you, I tend to be consistently offline before midnight (Eastern Standard Time). I'll take your word that it looked impressive, though.

As for the post, it was just the natural result of the combination of my compulsive reading (note the READING, not SKIMMING) and the manner in which I write posts; it surprises me that, writing the way I do, I'm nearing the 10,000-post mark for total posts across the Internet.

And that's the thing: Lars/Borsti is German as well, and his reviews were always solid. You might question some sidebar conclusions, but they showed the time and care taken. There was no rush on this article either; it wasn't some launch hardware. I may be being a little overly critical, but it seemed to do the opposite of what I would expect to be its intent: muddying the waters instead of clearing them in their DX8.1/DX9 glory. 8)
Sadly, I wasn't really a very solid reader during that time. However, I've seen some of their reviews, and I do agree that they were much better. And yes, it clearly didn't succeed at what was definitely its goal: to explain all these graphics tricks that have become literal buzzwords all on their own.

I think they may have been focusing on the PC, and Morrowind PC shipped as Morrowind Xbox did. I'm not sure about the programmable shaders in Halo; it may not have shipped as such with the Xbox title, but I couldn't say. But Morrowind on the PC was a testament to the difference between DX8 and DX8.1. The Xbox version was DX8 because of the GF3 engine. OpenGL makes it more of an issue due to the development of features sometimes preceding the support of hardware (it seems easier with the extensions). Maybe the first title to enable the features? I don't know.
Hence why I noted the "on Xbox." The original version used limited shaders; it, too, was effectively limited to cube-mapped water, if memory serves me correctly. I think they may have even used bump-mapping rather than normal-mapping... which would be pure ugh.

However, to be honest, as much as I've been into Morrowind, I never knew that there was a DX 8.1 version of the shaders as well. I never had any SM 1.x cards, only DX7 and earlier, and DX9 and later cards, oddly enough. (I went from a Radeon 7000, to a GeForce FX 5200, to my current Radeon X800XT; leaps and bounds, it has been)

DX8.0 :p

Remember ATi had the DX8.1 part with the R8500, one of the main reasons I chose the AIW R8500 for Morrowind instead of the Geforces. After seeing the ExtendedPlay review on TechTV, I wanted all the bling, never looked back for that.
SM 1.3 was part of DX 8.0? I thought that it was SM 1.0, 1.1, and 1.2 that were DX8.0, supported by the GeForce 3 and 3 Ti, while the GeForce 4 Ti was a DX 8.1 part and supported up to SM 1.3. And, of course, the Radeon 8500 was DX 8.1 and supported up to SM 1.4.

Oh, and after *I* watched some TechTV, I wanted to avoid the channel at all costs, and never looked back after that. :p (I spend very little time watching TV as it is; I'd rather not use it inefficiently getting stuff I could get online)

Yep, I pointed that out in an e-mail to THG. I don't remember seeing PS2.1 in MSDN's D3D info either. :wink:
Yes, I do distinctly recall it simply being labeled as "Shader Model 2.0 Extended." I normally consider it okay to shorten it, but as part of a professional article... *shakes head slowly*

Exactly, this almost seems like a chapter taken from the Cg workbook.
Which part? :?

Yes it is, and on the FX series even more so :lol:
Morrowind's water is transparent using PS1.4.
As I said, I never experienced Morrowind's shaders at less than "full," and I never even knew there was more than one setting. I don't personally have the Xbox version (or an Xbox, for that matter) so I wasn't able to look at the water closely enough to tell anything more than it was shaded.

And then there's displacement mapping, or, as ATi and nV do now, virtual displacement mapping. Getting rid of the 'virtual' would be nice, but there seems to be little focus on, or rush toward, that point yet.
Personally, I still don't see the point of even real displacement mapping; normally, it seems like another way of doing something that's already been done: increasing the polygon count of a model. If I understand it correctly, it does allow for real-time manipulation of the surface's geometry through the shaders, but I've yet to see an application actually do that... and I may be wrong there. I guess part of my opinion on displacement mapping is likely due to me not understanding it as well as I should.

The only issue I had with the lighting section was saying that specular lighting itself was a function of HDR, which isn't correct either; and to compound it, it was once again said that only SM3.0 could do it.
If memory serves me correctly, isn't Phong specular shading possible with any card supporting T&L? It allowed for hardware rendering of at least Gouraud shading, so it would reasonably also allow for Phong shading.
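For reference, the specular term in question is just a clamped dot product raised to a power, exactly the kind of thing the fixed-function pipeline can evaluate per vertex and then Gouraud-interpolate. A sketch of the math in plain C (my own illustration, nothing from the article):

    #include <math.h>
    #include <stdio.h>

    /* r_dot_v: dot of the reflected light vector and the view vector;
       shininess: the material's specular exponent n. */
    static float phong_specular(float r_dot_v, float shininess)
    {
        float d = r_dot_v > 0.0f ? r_dot_v : 0.0f;  /* kill backfacing hits */
        return powf(d, shininess);                  /* (R.V)^n              */
    }

    int main(void)
    {
        /* Same geometry, higher exponent -> tighter highlight. */
        printf("n=8:  %.4f\n", phong_specular(0.95f, 8.0f));
        printf("n=64: %.4f\n", phong_specular(0.95f, 64.0f));
        return 0;
    }

Nothing in there needs HDR, let alone SM 3.0.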

And that was the uneasy feeling I got, but only in that last section. Before that I just thought, 'Oooh, sloppy,' but with that final page, OI, it just seemed rather 'directed'.
Indeed, that was easily the most unsettling part of the entire article.

I felt the same way. A lot of it was sort of a rehash; kind of a 'been there, done that on B3D' feel. I was thinking it would be in preparation for the launch of 3DMark, especially since I only saw it appear yesterday, while we're all waiting for 3DMark to be downloadable (still waiting, at work).
Yes, B3D is a very good site, and I find myself going there more often all the time. THG needs to do more if they wish their graphics section to remain even viable in the face of B3D, or else we'll be seeing something closer to a media monopoly here. (AnandTech, in my opinion, just isn't big enough to cover everything important, and most of the other sites, like The Inquirer, are in the "touch with a 10-foot stick" category)

To be honest, though, I hadn't thought about 3DMark06, but had I remembered the launch date at that point, I would've certainly hoped to see something about that, only to be disappointed yet more.

That's why that second to last option was there.
Alas, I hadn't been paying attention here long enough to select that; I thought about it, but went with simply "Ugh!"

Yep, ATi has these drivers available early on request, but using the nV betas once they become official and not making the effort to at least use the current ATi ones just seems unbalanced. It's annoying when discussing cards, too, because nothing's current, but it is the only thing new people to the forum ever seem to use to defend their statements, despite improvements from both sides.
Yes, the way I see it, a benchmark loses all relevance if it doesn't reflect the cards as actual users can be presumed to be using them. And that would include using the latest drivers (or even Omega drivers), as hopefully that's been drilled into their heads.

Especially if you are in the industry. Driver improvements like that deserve to be added. It's like doing a hardware comparison the day before the GF7800 launch or the day before the X1800 launch, saying the competition has nothing to respond with, and then not updating the review in light of the new reality. I hate when that happens, regardless of the product, because the next refresh is months away and until then it's the high-water mark for the site.

Anywhoo, nice breakdown/analysis; thanks for that, it was interesting seeing your thoughts. Also nice to know, with comments like Cleeve's and yours, that I wasn't just being hypercritical or something.
Indeed, it's one of my main pet peeves when it comes to the technology world. I personally love nothing more than to see upheavals in the graphics war, and hence cementing the status quo like that really disappoints me.

At any rate, thank you, and you're certainly welcome for the review of the review. :p
 

pauldh

Illustrious
Tom's still does articles? ;) JK

I hadn't seen this one, but wow... not a job well done. The "GF3, GF4, R9800..." and "GF5, X800" bit was inexcusable. I like the concept of the article, but it drastically needs to be corrected. It looks like ole Kinney wrote it.

Anyway, you guys pretty much covered it well, and I agree with all the points made. I'd like to vote "Ugh! & bring back Lars".
 

cleeve

Illustrious
Hell, Lars can stay in retirement.

All they have to do is post this thread as an article, and they're in business...

Hell, they could pick the best thread out of here every month and post it. Would make for some real good reading.
 
...Because such a large response deserves a response of its own... :D

Yeah, it's amazing; good replies seem to end up growing geometrically. I'll try to be brief :mrgreen:

However, to be honest, as much as I've been into Morrowind, I never knew that there was a DX 8.1 version of the shaders as well. I never had any SM 1.x cards, only DX7 and earlier, and DX9 and later cards, oddly enough. (I went from a Radeon 7000, to a GeForce FX 5200, to my current Radeon X800XT; leaps and bounds, it has been)

Some might consider the FX series the top of the DX8.1 cards (DX9? Ummm...) :tongue:


SM 1.3 was part of DX 8.0? I thought that it was SM 1.0, 1.1, and 1.2 that were DX8.0, supported by the GeForce 3 and 3 Ti, while the GeForce 4 Ti was a DX 8.1 part and supported up to SM 1.3. And, of course, the Radeon 8500 was DX 8.1 and supported up to SM 1.4.

Yeah, DX8.0 went up to PS1.3, and DX8.1 added PS1.4 support. Hence the GF3 and GF4 are only DX8.0.

Oh, and after *I* watched some TechTV, I wanted to avoid the channel at all costs, and never looked back after that. :p (I spend very little time watching TV as it is; I'd rather not use it inefficiently getting stuff I could get online)

The OLD TechTV was great; the NEW TechTV SUCKS, VERY BAD!

Yes it is, and on the FX series even more so :lol:
Morrowind's water is transparent using PS1.4.

As I said, I never experienced Morrowind's shaders at less than "full," and I never even knew there was more than one setting.

Funny thing is, if you experienced it on your FX5200 the way I did on my FX Go5200, you experienced the FX series bug of completely transparent water. It looked like air with a transition line and a texture on top, completely different from all other cards (no opacity, just reflection).
It was finally fixed in ForceWare 66.xx, so if you only played it after that driver series you may never have seen this;
http://www.ixbt.com/video/itogi-video/bugs0904/gffx5200-es3mw-bug1-1.jpg

Water looks like this on all other cards;
http://www.ixbt.com/video/itogi-video/gal0904/r9600-morrowind-1.jpg

Personally, I still don't see the point of even real displacement mapping;

Well, it's true that the effects are rarely greatly different on current hardware, but the demo by Matrox was nice, and with displacement mapping you should be able to do more with less code. And that ends up being the rub: will the performance difference or coding difference be enough to justify it? I think eventually we'll go that way anyway, because the VPU performance penalty will drop far enough that any benefit to programs will be worth it; but then again, it may end up like n-patches and not go as far as we'd hoped. Displacement maps should get rid of some of the nasty issues with things like TruForm in Morrowind, where things were TOO bulgy. Anywhoo, we'll see.
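And to show how little code the 'real' thing needs: true displacement mapping is just moving each vertex along its normal by a height-map sample, as opposed to faking it in the pixel shader. A minimal sketch in plain C with illustrative names (the real thing would run per vertex on the VPU):

    #include <stdio.h>

    typedef struct { float x, y, z; } vec3;

    /* pos: vertex position; n: unit normal; height: height-map sample
       in [0,1]; scale: world-space displacement strength. */
    static vec3 displace_vertex(vec3 pos, vec3 n, float height, float scale)
    {
        vec3 moved = { pos.x + n.x * height * scale,
                       pos.y + n.y * height * scale,
                       pos.z + n.z * height * scale };
        return moved;
    }

    int main(void)
    {
        vec3 v = { 1.0f, 0.0f, 0.0f };
        vec3 n = { 0.0f, 1.0f, 0.0f };
        vec3 d = displace_vertex(v, n, 0.75f, 0.2f);
        printf("(%.2f, %.2f, %.2f)\n", d.x, d.y, d.z);
        return 0;
    }

Since the displaced vertices are real geometry, the silhouette actually changes, which is the one thing the 'virtual' version can't do.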

If memory serves me correctly, isn't Phong specular shading possible with any card supporting T&L? It allowed for hardware rendering of at least Gouraud shading, so it would reasonably also allow for Phong shading.

Now for that I'd actually have to check. Not enough time at work to check, but it sounds right; I just don't want to say yes and then find out it was the VS generation instead. Hmm, checking my facts; something the authors/editors might like to get acquainted with. :oops:

Yes, B3D is a very good site...

Yep, B3D is among the best; the reviews are a little dry and don't compare across architectures/IHVs much (other than power consumption, it seems), but it's solid stuff. AnandTech I don't read unless someone directs me to a specific article, and the InQ is nice for the 50/50 rumour, but you need to stay aware of what it is you're reading. Others would be Xbit, The Tech Report, Digit-Life, 3DCenter.de, and [H]ard|OCP, to name a few.

At any rate, thank you, and you're certainly welcome for the review of the review. :p

Always nice to see others expressing well-thought-out posts.

Anywhoo, back to work. Feeling kinda sick, but going skiing tomorrow.
 

rampage

Distinguished
Jan 28, 2005
137
0
18,680
Tom's still does articles? ;) JK

I hadn't seen this one, but wow... not a job well done. The "GF3, GF4, R9800..." and "GF5, X800" bit was inexcusable. I like the concept of the article, but it drastically needs to be corrected. It looks like ole Kinney wrote it.

What's that supposed to mean? :?: