I must say I was highly disappointed in the article. However, I was impressed with your perseverance in making the poll.
Know that I have no idea what language the article was originally written in, so I apologize for any criticisms that might not apply to the actual article itself, but rather to an improper translation; given some of the grammatical awkwardness I see, I am almost certain it was translated, perhaps from German. Still, I can't tell whether something is the way it is because the original article was like that, or whether it was an error in translation.
At any rate, the people above have already covered the point about confusing which cards supported which shader models. Normally this can be forgiven (having talked with a lot of people, a lot, about graphics cards, it's a common mistake), but I would've expected more from an article on THG, which I've held as the gold standard for hardware coverage. (I'm now having doubts.)
If memory serves me correctly, Morrowind was not the first game to display shaders; Halo on the Xbox had already done so, and if we count OpenGL (though it's unclear, I interpreted the article's wording as specifically referring to shaders under D3D, not OGL), Quake III Arena was arguably the first game to implement pixel shaders. Of course, Microsoft held back the PC port of Halo until they had turned it into the first SM 2.0 (DX9) game.
Unfortunately, the article does not immediately get more accurate; the comment on the development of shader models is incorrect as well. In 2002, nVidia only went up to SM 1.3 (DX8.1), while ATi unveiled its first SM 2.0 cards, the Radeon 9700/9500 series on the R300. There was also no such thing as SM 2.1; the proper name, as far as I've been able to tell, was "SM 2.0 extended," though I see it alternately referred to as SM 2.1 or 2.0b. Again, this would be fine coming from an ordinary person, but this is supposed to be a professional article on a professional news site.
Further confusion arises when we get to the discussion of SM 3.0. Again, it would be no surprise for an ordinary person, even one with a fairly firm grasp on graphics, to make these mistakes. The first omission, which I feel can't have been the result of translation, is failing to differentiate between DX 9 and DX 9.0c; that's the difference between SM 2.0 and SM 3.0. Then, although it's not a direct statement, the placement implies that parallax mapping is an SM 3.0 effect, when it can quite clearly be implemented without drawbacks under SM 2.0; F.E.A.R. readily shows this, as will the upcoming Elder Scrolls IV: Oblivion.
Although not as significant as the others, it should be noted that the common belief that "Bullet Time" first appeared in The Matrix is FALSE; The Matrix was merely the appearance that truly popularized it. The first film to use the technique was the 1981 film Kill and Kill Again, and it appeared in other places as well before 1999. The first actual CG version debuted in a project known as The Campanile Movie, by Paul Debevec, and that work was directly drawn upon in making the corresponding scenes in The Matrix.
Once we get past the screenshots and back into the article, the problems unfortunately resume. Right off, my earlier suspicion was confirmed when they claimed that without SM 3.0, you cannot have HDR, parallax mapping, or transparent water. All three of those separate claims, as we all know, are very false. A number of SM 2.0 demos and games demonstrate HDR (including Masa's RTHDRIBL and Debevec's RNL). As for parallax mapping, there's my comment about F.E.A.R.; its parallax mapping works perfectly fine with a Radeon X-series card. And lastly, the water comment is laughable; even if you only count water that's shaded, even Morrowind's water is transparent, as is that of other games such as Far Cry, all under SM 2.0. (Age of Empires 3 really cannot be considered a good demonstration of the capabilities of SM 3.0, given the inherent conflict of interest: it was produced by the same company that developed SM 3.0.) Again, we also see some odd confusion that seems to imply that SM 2.0 is part of DirectX 8 and SM 3.0 is part of DirectX 9.
However, at least in describing the effects of HDR and parallax mapping, the author is much more accurate, though slightly off on one point: normal mapping is the true replacement for bump mapping, as both alter the way an object is lit (bump mapping simply mimics varied elevation, while normal mapping actually covers the surface angle). Parallax mapping, by contrast, changes where the texture is drawn on a surface; it provides another tool toward the same overall effect, but is still best when combined with normal mapping.
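To make the distinction concrete, here is a minimal sketch (mine, not from the article) of the standard per-texel math: a normal map feeds into the lighting equation, while parallax mapping only shifts the texture coordinate based on the view direction and a height sample. Function names and the 0.04 scale constant are illustrative assumptions.

```python
# Sketch of the distinction above: a normal map alters *lighting*,
# while parallax mapping alters *where the texture is sampled*.

def normal_map_lighting(normal, light_dir):
    """Lambertian diffuse term using a per-texel normal from a normal map.

    `normal` and `light_dir` are unit vectors in tangent space,
    given as (x, y, z) tuples.
    """
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, n_dot_l)  # clamp: texels facing away get no light

def parallax_offset(uv, view_dir, height, scale=0.04):
    """Classic parallax mapping: shift the texture coordinate along the
    view direction, in proportion to the sampled height value.

    `view_dir` is a unit vector in tangent space (z points out of the
    surface); `height` is the height-map sample in [0, 1]; `scale` is an
    artist-tuned constant (0.04 is a typical value, assumed here).
    """
    u, v = uv
    vx, vy, vz = view_dir
    offset = height * scale
    return (u + vx / vz * offset, v + vy / vz * offset)

# A flat normal (0, 0, 1) lit head-on gives full diffuse brightness:
print(normal_map_lighting((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 1.0
# A head-on view (straight down the surface normal) shifts the lookup
# not at all; only oblique views displace the texture coordinate:
print(parallax_offset((0.5, 0.5), (0.0, 0.0, 1.0), 1.0))  # (0.5, 0.5)
```

Both effects are easily within SM 2.0's instruction budget, which is the commenter's point: neither requires SM 3.0.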
Of course, once we get to the page on F.E.A.R.'s detail levels, we encounter more problems. Equating the Radeon 9800 with the GeForce 3 and 4, and the Radeon X800 with the GeForce FX? Let's slap a "The Way It's Meant to Be Played" icon over all the images while we're at it, okay?
In the end, my conclusion is that the article was far less than professional; I'd like to believe that this was entirely due to lossy translation, but it is quite clear that many, if not the solid majority, of the major flaws are not the result of such.
Perhaps the biggest disappointment came from the English title: "New 3D Graphics Card Features in 2006." When I saw it the day it was posted, I eagerly opened it, hoping for a glimpse of features to come in upcoming games, ones that already existed but weren't yet used, such as sub-surface scattering or radiosity. I was thoroughly disappointed to find instead a flawed, and potentially slanted, review of the top shaders used in games in 2005. The title was very misleading; something like "All the latest shaders at work" might've suited it better.
Overall, while I have long considered THG the best source for technical data on PC hardware, I've been seeing a disturbing trend, with some articles seeming to slip in quality. My first complaint came with the 8th iteration of the VGA charts; I cannot consider it as fair as the previous versions: while they used version 81.85 of ForceWare, which was quite fresh at the time of the article, they paired it with the thoroughly outdated Catalyst 5.10. Most importantly, that version was old enough not to include the major "hotfixes" that had produced massive performance increases for the Radeon X1k cards in OpenGL games such as Quake IV. If memory serves me correctly, ATi had Catalyst 5.12 available around the same time as ForceWare 81.85, if not before.