The multiprocessing story is a bit like the old question of which came first, the chicken or the egg. If a large number of multiprocessor or multi-core systems were already deployed, software developers would have a cakewalk creating new applications and usage models. Since that is not the case today, why should a small or medium-sized software company spend a lot of energy figuring out how to further exploit the potential of multiprocessor or multi-core environments?
We already mentioned certain services that can be considered essential today, such as anti-virus and firewall programs that help ensure your system's integrity. An average desktop computer usually runs no fewer than five to ten non-Windows services: graphics card utilities, taskbar icons for video or audio software, monitoring utilities, network accessories, or services for storage and communication devices. Each taskbar icon in the lower right corner of your Windows screen represents a service that consumes some memory and CPU time. Given the processing performance available today, these services should, in the long term, run without us ever noticing their impact on the system.
But that does not yet answer the question of what to do with the additional resources a dual-core processor provides. Here's an example: I remember Intel talking about speech recognition when the first Gigahertz Pentium III was released. That capability is still not widely available today; at least I haven't found a way to enable it in Windows XP yet. How about controlling your computer by talking to it? How about restricting access to your voice, so that only you can control it? Or chatting with somebody by having your computer convert your voice to text and read your counterpart's answer aloud, giving you the freedom to walk around with a Bluetooth headset?
Let's think about gaming: have you ever played a game whose artificial intelligence was good enough to match a human player? I doubt it. More probabilistic evaluation, complex strategies with fallback plans, learning ability, and risk assessment could help make computers worthy opponents. Educational software could benefit equally.
The upcoming version of Windows, Longhorn, is another example of what computers should be capable of. The operating system needs to take care of the intelligent management, organization, and display of data, going beyond the limits of hierarchical organization. For example, if I receive an email or create some other content, I want my system to be aware of the key information it contains, in order to simplify my life. I don't want to spend one second deciding whether to store an audio file in a folder named after the artist or after the kind of music it contains.
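The idea of metadata-driven organization can be sketched in a few lines of Python. The tag dictionaries below are hypothetical stand-ins for metadata a real system would extract from the files themselves (for instance, ID3 tags in MP3s); the point is that the folder layout becomes a view computed from the data, not a decision the user has to make up front.

```python
# Hypothetical music library: file names mapped to metadata tags that a
# real system would read from the files themselves (e.g. ID3 tags).
LIBRARY = {
    "so_what.mp3":       {"artist": "Miles Davis", "genre": "Jazz"},
    "autobahn.mp3":      {"artist": "Kraftwerk",   "genre": "Electronic"},
    "blue_in_green.mp3": {"artist": "Miles Davis", "genre": "Jazz"},
}

def organize(library, key="artist"):
    """Map each file to a virtual folder path derived from one of its tags."""
    return {name: f"{tags[key]}/{name}" for name, tags in library.items()}
```

Calling `organize(LIBRARY)` projects the same library into an artist hierarchy, while `organize(LIBRARY, key="genre")` projects it into a genre hierarchy; neither view requires moving a single file.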
As you can see, this is heading in a very specific direction. We need computers to become more intelligent, in order to make our interactions with each other more efficient. We also want them to be capable of managing the digital lifestyle that everybody is promoting. And for many, this is not just a vision, but an urgent necessity. Take a moment and check how many MP3 files, documents, charts, presentations, digital images, or other content you're storing on your hard drive. I bet it's already enough to scare you away from that much-needed reorganization...
- Here Comes The King: Athlon 64 X2 Reviewed