Is This Stuff All That New? History Repeats Itself
With this top-down view of the P10 VPU in hand, you can also appreciate the historical perspective we now have on the past developments in workstation graphics that got us here. The P10 VPU, and probably most of what is yet to come in support of OGL 2.0 and DX9, owes much to work that took place ten years ago.
In the early 90s, Silicon Graphics (SGI) and Evans & Sutherland (E&S) dominated the 3D graphics industry with, respectively, the RealityEngine and Freedom series of graphics subsystems. Both architectures brought high levels of parallelism to pixel processing using multi-plane graphics subsystems. One difference from the past: what the P10 integrates onto a single chip today was a set of boards and ASICs back then.
The RealityEngine had eight geometry engines and up to 320 pixel processors, among other things. I think it came on three big circuit boards. The Freedom series used a DSP (digital signal processor) farm, where each DSP was a separate processing unit that could work on its own set of vertices. Each processed vertex was then passed on to a set of parallel pixel processors, and the results were composited for delivery as a final image.
There are a couple of sources of information on these old boys that you might like to explore for more depth, though most are dated:
Technical Overview of RealityEngine in Visual Simulation gives enough of a hardware overview to show the parallels with what is going on today.
In addition, around the mid-90s, researchers at the University of North Carolina delivered a new chip architecture called PixelFlow. PixelFlow used an array of parallel processors to handle pixels from various subdivisions of the screen. Call it tiling.
Tiling was, and is, a difficult process to program for. Software has to sort the graphics primitives by screen region, assign each region to a set of pipes, and composite the final image at the end of the pipeline.
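The sorting step above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual implementation: the tile size, triangle representation, and `bin_triangles` helper are all assumptions made for the example. Each triangle is assigned to every screen tile its bounding box touches, which is the binning work a tiled renderer must do before the per-tile pipes can rasterize and the results can be composited.

```python
# Hypothetical sketch of tile binning for a tiled renderer.
# TILE size and the triangle format (a list of three (x, y) vertices)
# are illustrative assumptions, not taken from any real hardware.

TILE = 32  # pixels per tile side (assumed)

def bin_triangles(triangles, width, height):
    """Assign each triangle to every tile its bounding box overlaps."""
    cols = (width + TILE - 1) // TILE
    rows = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(rows) for tx in range(cols)}
    for tri in triangles:
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        # Clamp the bounding box to the screen.
        x0, x1 = max(0, min(xs)), min(width - 1, max(xs))
        y0, y1 = max(0, min(ys)), min(height - 1, max(ys))
        # A triangle spanning several tiles lands in all of their bins,
        # which is exactly why compositing is needed at the end.
        for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                bins[(tx, ty)].append(tri)
    return bins
```

Note how a single large triangle shows up in multiple bins: that duplication is the cost of tiling, and it is one reason the scheme is hard to program for.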
There's an old Byte article that explains the architecture simply and succinctly.
These technologies were revolutionary for their time, insofar as they brought very high-end 3D image processing to the workstation market. Sure, these workstations cost tens of thousands of dollars, but they owed their existence to graphics-display research from the early 80s for military and flight simulators. When the GeForce3 came out, it brought some of the principles we first saw in the workstation market of the early 90s into play on the gaming desktops of this decade.
Another interesting aspect of the historical perspective is that SGI and E&S competed with proprietary systems, although SGI stole the show by developing an API to help developers use its hardware, an API that would eventually become OpenGL. Depending on who you believe, Nvidia used its relationship with Microsoft on Xbox to help define DX8, which actually laid the groundwork for programmable architectures.
Now, today's announcement of the P10 is an evolutionary step that puts the spotlight on OGL 2.0. The P10's designers don't want to spend transistors on a fixed-function T&L engine for DX7. So we now have a drive to make sure the P10 is the first hardware built for OGL 2.0.
So, what does Creative do in all this? If Creative really gets behind the P10 and swings back into graphics, will the company push OGL 2.0 to offset the influence of ATi and Nvidia on DirectX? Or will Creative use OGL 2.0 to create its own developer support base? Bear in mind that most games are going to target DirectX 8.1 and Xbox-level features at the base level. Surely Creative is going to want some differentiation, and it already has a strong core of game-developer support and influence from the audio side of things. It's worth thinking about.