We just learned that the vertex data of the kettle carries not only each vertex's position, but also its color and the coordinates of the texture(s) that will later be applied to it. However, lighting still needs to be applied to each vertex to make this 3D object visible. These much more complex calculations are done by the lighting engine, and a lot has to be taken into consideration: the number and type of each light (global, uniform, directional, spot), as well as the way each one is applied (ambient, diffuse or specular), before the vertex receives its lighting data.
This per-vertex lighting process only comes into play if the game actually uses 'vertex lighting' rather than today's more common 'light mapping', which applies light to a triangle in the form of a light texture.
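To make the ambient, diffuse and specular terms a little more concrete, here is a minimal Python sketch of a per-vertex lighting calculation for a single directional light, in the style of the Blinn-Phong model. All names and constants here are illustrative; this is not the actual hardware implementation, just the kind of arithmetic the lighting engine performs for every vertex and every light.

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def light_vertex(normal, view_dir, light_dir, ambient=0.1, shininess=32):
    """Lighting intensity for one vertex and one directional light:
    ambient + diffuse (Lambert) + specular (Blinn half-vector)."""
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    # diffuse: how directly the surface faces the light
    diffuse = max(dot(n, l), 0.0)
    # specular: highlight via the half-vector between light and viewer
    h = normalize(tuple(a + b for a, b in zip(l, v)))
    specular = max(dot(n, h), 0.0) ** shininess if diffuse > 0 else 0.0
    return min(ambient + diffuse + specular, 1.0)
```

A fixed-function lighting engine evaluates terms like these once per vertex and per light; the resulting colors are then interpolated across each triangle during rasterization.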
The Vertex Shader
What I just described is the normal work of the familiar T&L unit as found in GeForce256, GeForce2 and ATi's Radeon. GeForce3 also contains this so-called 'hardwired T&L', both to remain compatible with DirectX 7 and to save execution time in case a game does not require the services of the 'Vertex Shader'.
We now know that each vertex carries a considerable amount of information: its coordinates, weight, normal, color, texture coordinates, fog and point size data. The 'vertex shader' allows this information to be altered by small programs. These programs are executed by the vertex shader itself and therefore do not require any computing power from the CPU. While the classic T&L engine restricted the game developer's influence to what happens before the 3D pipeline reaches the transform stage, he can now do whatever he likes with the 3D objects of his game, without any CPU performance penalty.
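The model is easy to picture: a vertex with the attributes listed above goes in, a small program transforms the position and tweaks the other attributes, and a new vertex comes out. The following Python sketch illustrates that idea only; the real programs run on the GPU in the shader's own instruction set, and the attribute names here are purely illustrative.

```python
# A vertex as it might travel through the pipeline (names illustrative).
vertex = {
    "position": (1.0, 2.0, 3.0, 1.0),   # homogeneous coordinates
    "normal":   (0.0, 1.0, 0.0),
    "color":    (1.0, 0.0, 0.0, 1.0),
    "texcoord": (0.5, 0.5),
    "fog":      0.0,
    "point_size": 1.0,
}

def transform(matrix, pos):
    """Multiply a 4x4 matrix (given as rows) with a homogeneous position."""
    return tuple(sum(m * p for m, p in zip(row, pos)) for row in matrix)

def vertex_program(v, model_view_projection):
    """A tiny 'vertex shader': transform the position and tint the color.
    Every output is derived only from the input vertex and constants,
    mirroring the stream-in/stream-out model of the hardware unit."""
    out = dict(v)
    out["position"] = transform(model_view_projection, v["position"])
    # darken the vertex color by half -- an arbitrary per-vertex effect
    out["color"] = tuple(c * 0.5 for c in v["color"])
    return out
```

The important point is that the program's only inputs are the vertex itself and a set of constants, which is exactly what allows the hardware to run it for every vertex without involving the CPU.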
The opportunities are huge. Such a program could alter the coordinates of a vertex, and thus the shape of the object, e.g. for realistic movements, motion blur, blending, interpolation (morphing) or deformation. It could change the color values or the texture coordinates, and thus the surface appearance of the object, for things like color effects, reflections, bump map setup (e.g. for Blinn bump mapping) or other projected textures. The lighting can also be influenced almost to the developer's heart's content. There are restrictions, however. The possibilities of the vertex shader are not as 'infinite' as NVIDIA's managers wanted to make us believe when they baptized it the 'nfiniteFX Engine'. A vertex shader program can be no longer than 128 instructions, and its execution time is, of course, another limiting factor. Let's have a quick look at the details.