
Software and CPU: The Future

Last response: in CPUs
January 25, 2007 1:35:23 PM

I admit that compared to many here I am as smart as a lima bean in soup when it comes to computers. But I wanted some kind of internal roadmap for my own gauging of the future. First I'll start off with my assumptions of what I think I know. Windows NT and XP can support multiple CPUs. I am pretty sure that XP can, anyway. If I am confused, it is likely in the difference between cores and physical CPUs/sockets. My current assumption is that while little software is coded for multiple cores, XP still makes use of them by allowing better multitasking. So I can be encoding a movie with Pinnacle and surfing the web while virus checking at the same time. And Vista is supposed to offer better support for multi-core CPUs.

CPUs: Up until the C2Ds, it had been the megahertz game. AMD was always clocked lower, and several months ago it seemed clear to me that the limit had been reached; in other words, the point of diminishing returns. Instead, the concept appears to be analogous to this: rather than try to bulk up one man to lift 900 pounds, let's just get four rather fit guys to do it.

Of course, the market (OEMs, software writers, etc.) always caters to the common denominator. Though XP could work with multiple cores, no one demanded it a year ago and no one wrote for it.

So the question is: in terms of the future, say 2 years out, are we roughly seeing the clock numbers that represent the upper limits, with only more cores (working more efficiently instead) being added, or are we going to see dual cores that can reach into the 6 GHz range? Would that even be desirable?

With respect to Vista (aside from serious DRM issues) and XP, what is the logical reason (besides DirectX 10) to switch from XP to Vista? And lastly, what is the logical reason to get more than a two-core CPU? Will 4 cores allow my scenario above (Pinnacle/virus checking/web surfing) to go any faster than a dual core?


January 25, 2007 3:54:24 PM

From my point of view, we, as an industry, are going to need better programming tools to help take advantage of the massive parallelism on the horizon. Today's languages leave most of the parallelism to the programmer, and the operating systems do most of the work per process (or program, if you prefer).

There are a million programming languages out there already but very few automate parallel programming. We will likely see more extensions to existing languages at first, but I'm sure someone out there will come out with something entirely new that will catch on. The trick will be simplifying the synchronization code that would have to be in place to support parallel programming.

One alternative to new programming languages would be application frameworks. If the framework is inherently parallel, then software developed with it would be too. So, let's say Microsoft rebuilt the .NET framework to assume parallelism for certain tasks, such as sorting a list of items. When a programmer uses that sort function, a new thread would automatically be launched, and the synchronization mechanisms would be automated as well. This approach has a lot of potential but will also require more common sense from programmers (such as not launching 100,000 threads to make an application more "efficient" - that many threads would be a LOT of overhead with no benefit).
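As a sketch of that framework idea (illustrative Python rather than .NET; note that in CPython the GIL means threads will not actually speed up CPU-bound sorting, so a real framework would dispatch to native threads or processes), the point is that the caller just calls the sort, and the splitting, thread launching, and synchronization are hidden inside:

```python
from concurrent.futures import ThreadPoolExecutor
import heapq

def parallel_sort(items, workers=4):
    """Framework-style sort: split the list, sort the chunks on
    worker threads, then merge. The caller never sees a thread."""
    if len(items) < 2:
        return list(items)
    size = max(1, len(items) // workers)
    chunks = [items[i:i + size] for i in range(0, len(items), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map blocks until every chunk is sorted -- that is the
        # synchronization the framework automates for the programmer
        sorted_chunks = list(pool.map(sorted, chunks))
    return list(heapq.merge(*sorted_chunks))
```

A programmer would call `parallel_sort(my_list)` exactly the way they call a plain sort today, which is the whole appeal of the framework approach.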
January 25, 2007 4:15:28 PM

2 years out, we will see 4-8 core processors running at the same or maybe slightly higher clock rates than now (if the rumors about Intel's 45nm technology come true). All CPU-intensive applications will be parallel and will scale with the number of cores.

In your scenario the only CPU-intensive task is video encoding, which indeed will be faster with 4 cores.
January 25, 2007 4:19:26 PM

Quote:
...So, I can be encoding a movie with Pinnacle and surfing the web while virus checking at the same time. ...
Will 4 cores allow my scenario above (Pinnacle/Virus Checking/Web surfing) to go any faster than a dual core?


No, 4 cores don't matter for that scenario of use, but 2 are better than 1. Also, it's not always popular here in the CPU forum to point this out, but for your scenario, with the virus check, and in many other situations, you can get a bigger boost by spending an extra $40 on the hard drive (for a nice Seagate 7200.10) or $100 more (for the cheap Raptor) than by putting an extra $100 into the CPU, provided it is at least a dual core of any kind.
January 25, 2007 4:20:48 PM

Quote:
All CPU-intensive applications will be parallel and will scale with the number of cores.


How exactly can you make this claim? There is nothing currently in C++, C#, Java, Delphi, Perl, or VB (or any other "common" programming language) that is inherently parallel. All parallelism must be explicitly designed by the programmer, and I would be VERY surprised if it were commonplace even 2 to 3 years from now.
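For illustration, here is what "explicitly designed by the programmer" means in practice: a minimal Python sketch where every piece of the parallelism (thread creation, locking, joining) is written by hand, because the language automates none of it:

```python
import threading

def count_in_parallel(n_threads=4, per_thread=10_000):
    """Four threads increment a shared counter. The lock and the
    joins are hand-written; nothing here is automatic."""
    counter = 0
    lock = threading.Lock()

    def work():
        nonlocal counter
        for _ in range(per_thread):
            with lock:        # without this, updates are lost to races
                counter += 1

    threads = [threading.Thread(target=work) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()              # explicit synchronization point
    return counter
```

Forget the lock and the count silently comes out wrong under load; forget the joins and you read the counter before the threads finish. That hand-rolled synchronization is exactly the burden being discussed here.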
January 25, 2007 4:25:47 PM

Quote:
All CPU-intensive applications will be parallel and will scale with the number of cores.


How exactly can you make this claim? There is nothing currently in C++, C#, Java, Delphi, Perl, or VB (or any other "common" programming language) that is inherently parallel. All parallelism must be explicitly designed by the programmer, and I would be VERY surprised if it were commonplace even 2 to 3 years from now.

Indeed it must be explicitly designed by the programmer. But since CPU-intensive applications usually compete on performance, vendors will make sure their apps scale with the number of cores. There are very few tasks that cannot be made parallel in principle.

Of course, with the current tools, writing parallel code is a pain (say, a junior dev can write single-threaded code, but you need a senior dev to write multi-threaded code). But that's just an engineering problem :D 

Edit - on the server side, we've had multi-threaded code for many years now, so there are plenty of experienced devs who can do it, too.
January 25, 2007 4:39:06 PM

Most server apps should be easy to make parallel because of their nature. If nothing else, each "connection" to the app could be its own thread or process.
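That thread-per-connection pattern, sketched in Python (a hypothetical echo server, not any particular product's design):

```python
import socket
import threading

def handle(conn):
    """Each connection runs on its own thread: read one request,
    reply, and close -- connections never block each other."""
    data = conn.recv(1024)
    conn.sendall(data.upper())
    conn.close()

def serve(listener, n_connections):
    """Accept loop: spawn a worker thread per accepted connection."""
    for _ in range(n_connections):
        conn, _addr = listener.accept()
        threading.Thread(target=handle, args=(conn,)).start()
```

On a multi-core machine the OS scheduler spreads those threads across the cores with no extra effort from the programmer, which is a big part of why server code went parallel so much earlier than desktop code.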

On the desktop side, though, things are a whole lot tougher. Most frameworks do not support multi-threaded access to the UI code (neither Microsoft's MFC nor Borland's VCL, for instance), and most desktop apps spend most of their time waiting on the user anyway. Things like background spreadsheet recalculation in Excel and background spell checking in Word have been around for a long time and are perfect examples of how to make things more parallel-friendly.
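The machinery behind features like background spell checking is typically a worker thread fed by a queue; a generic Python sketch (the names here are illustrative, not Word's actual design):

```python
import queue
import threading

tasks = queue.Queue()
results = []

def background_worker():
    """Runs off the UI thread: pull a text snippet, 'check' it,
    record the result. A None task is the shutdown signal."""
    while True:
        text = tasks.get()
        if text is None:
            break
        results.append(text.split())  # stand-in for real spell checking
        tasks.task_done()

worker = threading.Thread(target=background_worker, daemon=True)
worker.start()

# The "UI" thread stays responsive: it only enqueues work and moves on.
tasks.put("teh quick brown fox")
tasks.put(None)
worker.join()
```

The UI thread never blocks on the checking itself, so the app stays snappy even on one core, and the worker can land on a second core when one is available.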

Background applications really are where most people will benefit from more cores. Having real-time virus, malware, firewall, encryption and other processing tasks going at once will benefit greatly from more cores. A fully encrypted OS (including memory and swap file) would be easily possible with a 4 or 8 core computer. But I doubt there will be any foreground application in the near future (besides games) that will take advantage of that available parallelism for most users.
January 25, 2007 4:45:03 PM

If by "foreground" applications you mean interactive apps/tasks then yes, there's no point to make them parallel as they are not limited by CPU - most of the time, they idle around waiting for user input.

Most current background apps are not CPU-limited either - you can run an encrypted file system, a virus scanner and Azureus in the background using just 1 core and still have spare CPU cycles.
January 25, 2007 4:52:21 PM

Quote:
.... A fully encrypted OS (including memory and swap file) would be easily possible with a 4 or 8 core computer. ....
This is interesting. Would it take 4 cores for sure? I mean, do you have a pretty good handle on the load?
January 25, 2007 4:54:41 PM

Quote:
This is interesting. Would it take 4 cores for sure? I mean, do you have a pretty good handle on the load?


What's the point of encrypting RAM though? Encrypting the filesystem and the swap file can be done with current hardware, you don't need 4 cores to do that :D 
January 25, 2007 5:13:08 PM

Quote:
This is interesting. Would it take 4 cores for sure? I mean, do you have a pretty good handle on the load?

Quote:
What's the point of encrypting RAM though? Encrypting the filesystem and the swap file can be done with current hardware, you don't need 4 cores to do that :D 

There are lots of good reasons to encrypt RAM. Malware applications frequently pull information from memory that would be inaccessible if it were encrypted.

As far as NEED goes, no you do not need 4 cores to make it all work, but the point would be to make it all work in such a way that the user doesn't even notice (so near zero hit to performance, even while gaming). That was my point for 4 cores being a realistic minimum if you were going to encrypt EVERYTHING.
January 25, 2007 5:17:02 PM

Quote:

There are lots of good reasons to encrypt RAM. Malware applications frequently pull information from memory that would be inaccessible if it were encrypted.


How would the OS tell the difference between a malware app accessing RAM and a legitimate app accessing RAM? Surely, if it could tell, it would just disable the malware app outright...
January 25, 2007 5:18:14 PM

I'll be quite happy when popular applications start coming out that really do much better with 4 cores.

Actually, there's opportunity there, for a genius programmer.
January 25, 2007 5:23:51 PM

Quote:

There are lots of good reasons to encrypt RAM. Malware applications frequently pull information from memory that would be inaccessible if it were encrypted.


How would the OS tell the difference between a malware app accessing RAM and a legitimate app accessing RAM? Surely, if it could tell, it would just disable the malware app outright...

I think you're missing the point. If done right, the OS wouldn't have to care, so long as it makes it "impossible" for one process to appear like it is another process. One current approach for malware is to allocate a bunch of memory after the target application has been closed, then "sniff" that memory for a known footprint and grab the stuff close by that it wants, like a password or something else. If it tried that when the memory is encrypted then all it would get is encrypted garbage.
January 25, 2007 5:24:38 PM

This is already done with programs like Entrust. 4 cores are not needed, though they help, depending on the encryption scheme.
January 25, 2007 5:30:27 PM

Quote:

I think you're missing the point. If done right, the OS wouldn't have to care, so long as it makes it "impossible" for one process to appear like it is another process. One current approach for malware is to allocate a bunch of memory after the target application has been closed, then "sniff" that memory for a known footprint and grab the stuff close by that it wants, like a password or something else. If it tried that when the memory is encrypted then all it would get is encrypted garbage.


Well, to prevent that, all you need is to zero out memory upon allocation/deallocation; no need for encryption.
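For what it's worth, zeroing on release looks roughly like this; a Python sketch using ctypes (in C it would be a memset before free, and a careful implementation also has to keep the compiler from optimizing the wipe away):

```python
import ctypes

def wipe(buf):
    """Overwrite a buffer with zeros before its memory is released,
    so a later allocation cannot sniff the old contents."""
    ctypes.memset(buf, 0, ctypes.sizeof(buf))

secret = ctypes.create_string_buffer(b"hunter2")  # hypothetical password
# ... use the secret for authentication ...
wipe(secret)  # nothing recoverable remains in the buffer
```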
January 25, 2007 5:49:39 PM

Quote:
Well, to prevent that, all you need is to zero out memory upon allocation/deallocation; no need for encryption.


True to a point, except if the malware can gain access to the memory while the application is running, which it could do if it were pretending to be a device driver (ring 0 access, so to speak). If the OS could uniquely identify every process and thread running (which it should be able to do), then it could encrypt based partly on that and prevent any outside process or thread from accessing memory it does not have the right to access. None of it is really this simple, but malware does currently use tricks like those I mentioned to get information that the OS could relatively easily protect.
January 25, 2007 6:25:20 PM

So there is a distinction here that I think needs to be made, since I am a lame-O:

multi-cores in an environment where an individual program can make use of multi-cores

and

multi-cores in an environment where an OS can make use of multi-cores.

The latter we already have. Can it be improved upon significantly enough to justify Vista's expense?

Secondly, it would appear (to me, anyhow) that the SAFEST thing to do is to buy a fast quad-core CPU and wait for the apps to catch up.

I mean, you are a programmer and you have to decide whether to get "speed" by expecting the user to buy faster single-core CPUs, or to get speed by taking advantage of multi-core environments.
January 25, 2007 6:33:10 PM

Quote:

True to a point, except if the malware can gain access to the memory while the application is running, which it could do if it were pretending to be a device driver (ring 0 access, so to speak). If the OS could uniquely identify every process and thread running (which it should be able to do), then it could encrypt based partly on that and prevent any outside process or thread from accessing memory it does not have the right to access. None of it is really this simple, but malware does currently use tricks like those I mentioned to get information that the OS could relatively easily protect.


Um, the OS already outright blocks a process that tries to access the memory of a different process (I think the memory is simply not mapped, i.e. there's no way to address the memory of a different process).

The only exception is the debugger API, which lets the debugger read the memory of the program being debugged - but once again, the OS can't distinguish between a legitimate debugger and a malware program trying to spy, so encryption won't help here either.

Legitimate device drivers need to access the memory, so the OS will have to decrypt it on the fly, even if the "driver" is in fact spyware or a trojan.
January 25, 2007 6:38:04 PM

Quote:

Of the latter we have that. Can it be improved upon significantly to justify Vista's expense?


The short answer is no; Windows XP already does a fine job. The main advantages of Vista are improved security (supposedly), a smoother UI, and better games (which don't yet exist).

Quote:

Secondly, it would appear that (to me anyhow) that the SAFEST thing to do it to buy a fast quadcore CPU and wait for the apps to catch up.


I would recommend buying a CPU with good price/performance right here, today, without worrying about the future too much. The apps you use will run fine with a dual core, and you can upgrade later if you need to.
January 25, 2007 7:40:33 PM

Quote:
Um, the OS already outright blocks a process that tries to access the memory of a different process (I think the memory is simply not mapped, i.e. there's no way to address the memory of a different process).

The only exception is the debugger API, which lets the debugger read the memory of the program being debugged - but once again, the OS can't distinguish between a legitimate debugger and a malware program trying to spy, so encryption won't help here either.

Legitimate device drivers need to access the memory, so the OS will have to decrypt it on the fly, even if the "driver" is in fact spyware or a trojan.


Not exactly; one process can attach itself to another process. In fact, you can execute a thread in the memory space of another process in Windows, which is especially easy if you already have admin rights, but possible without them. And since that lower 2 GB of physical RAM is actually shared between applications, it is possible to read the contents of memory from a formerly resident program. Windows does not clear this memory by default.

An interesting trick I've seen used is to attach to a process that has higher rights, such as a virus scanner, and then use that to gain even higher rights, until you have the rights you need to do what you want to do. All it takes is one trojan-type program and a known weakness. It is surprisingly easy with rootkit technology (and you can pick up books that tell you how to make a rootkit).

You have a good point about the device driver issue. Once something gets that level of control, there is probably nothing reasonable the OS can do unless all drivers operate from "user" memory instead of "kernel" memory. Not even Windows Vista is that strict (and it would be very slow if it were). My main point is that anything done at the OS level should help more than relying on an application programmer to do it.