
CPU power vs programming...

Tags:
  • CPUs
  • Performance
  • Programming

Do you feel today's software is failing to unleash the full potential of new processors? Are we all being fooled by hardware/software vendors?

Total: 43 votes

  • Yes: 85 %
  • No: 16 %
July 8, 2006 11:43:01 AM

Just taking a look at 2000's games and today's, one doesn't notice the increase in detail or overall performance that the transition from a P3 700 MHz to a P4 3.0 GHz should give. I used to play Porsche Unleashed (2000) on a P2 400, but you can't dream of doing the same with NFS Underground 2, though there's not much added.
AutoCAD 2004 recommends 128 MB of RAM; that ramps up to 512 for 2005 and 1 GB for 2006. HOLY SH*T, 8X the RAM, while there can be no more than a 2% difference between '04 and '06. Programmers used to do a lot of optimisation to squeeze performance out of less powerful CPUs, but now, with powerful platforms, they just don't, and it's no surprise how fast they (and we all) hit the ceiling of the newest technology.


July 8, 2006 12:23:36 PM

I don't know if you've used ACAD, but even if the recommended memory for 2004 is 128M (I don't know what the recommended amount is), that amount of memory would be much too painful to use in any capacity other than viewing drawings, for example. Heck, 128M is too painful for just the OS alone, let alone for any demanding application.

I'm also not sure how you're coming up with only a 2% difference in options between 04 and 06, but I have to assume it's higher. From what I've seen just by looking at the menus (I don't use AutoCAD, but I support it), it looks like more than 2 percent.
July 8, 2006 1:17:43 PM

Programming is hard, even for quite bright people. That means creating software is expensive: you've got a lot of highly skilled man-hours to pay for. The extra effort and cost of producing highly optimised code makes no sense financially if cheap, powerful hardware is readily available.

Really, the challenge for a programmer is not to write code for a machine to read; it's to write code which people can read. Easy-to-maintain code is less expensive, and that's where the main effort goes. It may not be perfectly optimised, but it doesn't need to be. It just has to be good enough.
July 8, 2006 1:24:47 PM

I'll answer the question about memory usage first. As you probably know, programs typically get more and more bloated with each new revision, i.e. additional features/eye candy; this in turn adds to both the memory footprint of the binary when loaded and the number of cycles it uses to display eye-candy widgets.

Generally the new features require more internal data storage to represent the canvas state/objects (graphical). This all adds to memory bloat; just look at Vista.

Next, taking advantage of newer processors. Well, from what you're saying, this isn't really a processor bottleneck; it's more down to the graphics card(s) used and the people designing the textures.

I don't personally think software is falling behind hardware overall; however, there is always a period of catch-up to be expected.

In the context of multiple cores, adding multithreading is not a simple task programmatically. Often the program structure has to be rewritten from scratch to create a viable framework. Even when working on new programs, multithreading is difficult to implement efficiently.

You have to deal with a lot of issues in a multithreaded environment:
(1) Synchronised access to global resources (which can be challenging at the best of times).
(2) Re-entrant and non-re-entrant functions (do these functions modify global state/data?).
(3) Managing thread state (waits/resumes/termination).
(4) Thread communication (event objects [win32], pipes?).
(5) Unforeseen deadlocks/race conditions (these are nasty to debug).

Existing games will never take full advantage of multiple cores unless they are redesigned from scratch. I don't blame companies that choose not to use multithreading; it can add a lot of extra development and testing time to a project, and often adds additional bugs to release versions.
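
To make (1) and (5) concrete, here's a minimal C++ sketch of the classic lost-update race and the lock that prevents it (my own toy example, using modern std::thread for brevity rather than raw win32 threads):

    #include <iostream>
    #include <mutex>
    #include <thread>
    #include <vector>

    // Shared global state: issue (1), access must be synchronised.
    long counter = 0;
    std::mutex counter_mutex;

    void add_many(int iterations) {
        for (int i = 0; i < iterations; ++i) {
            // Without this lock, two threads can read the same old value,
            // both increment it, and one update is silently lost:
            // issue (5), a race that only shows up under unlucky timing.
            std::lock_guard<std::mutex> lock(counter_mutex);
            ++counter;
        }
    }

    int main() {
        std::vector<std::thread> workers;
        for (int t = 0; t < 4; ++t)
            workers.emplace_back(add_many, 100000);
        for (auto& w : workers)
            w.join();  // issue (3): wait for clean termination
        std::cout << counter << "\n";  // 400000 with the lock,
                                       // unpredictable without it
        return 0;
    }

Comment out the lock_guard line and the printed total changes from run to run, which is exactly why these bugs are so nasty to reproduce and debug.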
July 8, 2006 1:27:46 PM

Quote:
I don't know if you've used ACAD, but even if the recommended memory for 2004 is 128M (I don't know what the recommended amount is), that amount of memory would be much too painful to use in any capacity other than viewing drawings, for example. Heck, 128M is too painful for just the OS alone, let alone for any demanding application.

I'm also not sure how you're coming up with only a 2% difference in options between 04 and 06, but I have to assume it's higher. From what I've seen just by looking at the menus (I don't use AutoCAD, but I support it), it looks like more than 2 percent.


Well, I use it, and as for the menus, they only look better but bring nothing new, and many architects/engineers I know think the same; 2006 only takes longer to boot and makes your graphics heavier. In everyday (2D) use, the difference is less than 2%.
While you can work fine with 2004 on a 500 MHz machine, 2006 doesn't even install there.
One more thing I have noticed about it: it's 2006 and they still haven't put in a stupid option to draw the quote (height) sign. You still have to waste time drawing it or importing it when you're doing a cutoff. No man, I think programming is becoming less intelligent every day. :cry:
July 8, 2006 1:30:37 PM

I think the point he is trying to make is that once upon a time programs had to be scrutinised down to the nth degree so that they could run on a 486 running Windows 3.1; nowadays, even though we all have 2 GHz-plus CPUs and roughly 1 GB of RAM installed, things do not seem quicker.

I believe the problem is with overheads. Take Windows XP: the first thing I do after a clean install is turn off everything that makes it look nicer (read: themes), and I also turn off any unwanted services. Now translate this to AutoCAD (note: I do not know AutoCAD that well, I'm a Pro/ENGINEER man myself); I'm sure there are little things running in the background to make things (apparently) a little easier. I personally prefer quick response times over things being slightly done for me.

Using Office as an example, it still takes me 10 or so seconds to load the application. Why is this, you ask? Hard drives. While CPUs are infinitely more powerful than in the times of good old 3.1, hard drives, though faster, still take an age to seek and to transfer data. Note I run Office off a RAID 0 array, so speed is partially increased. An idea I liked but would never risk trying (both for price and stability) is the PCI-based RAM hard drive that was brought out about a year ago and reviewed here at THG (can't remember the manufacturer). I did play around with a RAM disk for a while, but for some reason it was limited to 32 MB, so I could only really run Doom off it, and the fact that it wiped on a restart really p***** me off, but hey, there is nothing that can be done about that.

Anyway, I've gone off on a tangent. Going back to Office: that F****G Clippy thing is the bane of all things quick. Designed to make things easier by letting you ask him a question in his pop-up window, he only really served to slow things down for those of us who know what we are doing. Clippy is the representation of all things bad about Microsoft (not saying that MS is bad, in fact I like Windows, but some things do annoy me).

Now, looking forward to Vista, I have mixed emotions:
1.) With integrated DirectX 10, will this really speed up daily use of the system, or just serve to make things look pretty/easier to use?
2.) Will the clock panel and everything associated with it (not sure if that is still going to be included, forgive my ignorance) only serve to increase CPU usage?
3.) How will the new file system serve to speed things up?
4.) Will the new file system even be included? (Last I heard was no!)

Personally, I would still run everything off DOS if it allowed 5.1 surround sound, 32-bit colour, resolutions of 1280x1024, and graphics acceleration that let me play Far Cry just like I do now (only without Windows overheads).

However, it is these very overheads that allow us to connect to the internet, network with mates, and watch them die in high-res glory. So can we expect programmers to show us that our CPUs are a lot faster than they once were? My answer is that they already are, but not fully in the way that everyone would like. Yes, they could tidy things up a little, but by making programs do things that you wouldn't necessarily notice. Auto spell-check in Word is a good example: you know, the little red line that appears under your lettering when you've misspelled something. That was not there in 3.1, nor do I think it was there in 95, but when processors became able to cope with the extra load they threw it in for us (thanks, by the way, Bill).


Anyway, hope you enjoyed my little rant. This pissed me off for a while too, until I started thinking about it. What could be the solution, with all the different variations of software out there nowadays, is to release non-pleb versions. Windows Vista Non-Pleb Edition would be great, its symbol being Clippy on a noose :twisted: :twisted: :twisted:
lol
July 8, 2006 1:34:44 PM

Most of it's down to Windows and its GUI eye-candy mentality (led by M$, of course), which adds more and more bloat. Just look at the number of Windows services running on a default install :)  M$ tried to hide this obscene number with wrapper processes called services.exe and svchost.exe.

Try installing Gentoo Linux with X and Fluxbox. Base memory usage is about 30-40 MB :D 
July 8, 2006 1:53:25 PM

Quote:

Now, looking forward to Vista, I have mixed emotions:
1.) With integrated DirectX 10, will this really speed up daily use of the system, or just serve to make things look pretty/easier to use?
2.) Will the clock panel and everything associated with it (not sure if that is still going to be included, forgive my ignorance) only serve to increase CPU usage?
3.) How will the new file system serve to speed things up?
4.) Will the new file system even be included? (Last I heard was no!)


(1) The DirectX 10 spec only defines the behaviour of graphics drivers with respect to Windows, not the implementation; ergo, graphics card hardware will not likely change because of it.

The only thing that will change is the Windows Vista software driver. M$ wants you to think that graphics cards are specifically designed for Windows so they can have more control over users and the industry, but the fact is, graphics cards are designed to work independently of the OS. It's only the OS-specific driver that changes, and that's exactly what will change in Vista, not the hardware.

I wouldn't expect to see much in the way of performance increases. If anything, you will probably see a decrease because of the extra abstraction layers.

(2) Haven't got a clue (mostly a Linux user :D )
(3) There is no new file system; it was shelved due to project time constraints. Even if they had implemented it, it would probably be slower, because (according to speculation) it had to store more information about how files are related, which obviously requires more processing at the basic level.
(4) No. The only thing you get in Vista is a bloated 3D GUI, which requires a high-end graphics card just to run. That means integrated chipsets will not work well.
July 8, 2006 1:57:30 PM

OK, software requires more memory as it is updated, but 8X the memory in 2 years is a little scary. My word processor starts more slowly than my 3D software just because it loads all the possible libraries on startup; I bet they could come up with something more elegant.
July 8, 2006 2:08:37 PM

Quote:
OK, software requires more memory as it is updated, but 8X the memory in 2 years is a little scary. My word processor starts more slowly than my 3D software just because it loads all the possible libraries on startup; I bet they could come up with something more elegant.


Personally, I use GNU nano as a basic editor, and OpenOffice for fancier writing. I agree that office software is generally bloated and loads slowly, even some of the open-source stuff.

As soon as you eliminate the GUI, you eliminate a lot of the bloat. Try an ncurses-based editor like nano; it's lightning fast :) 
Anonymous
July 8, 2006 2:13:40 PM

My most beloved example is that of the PlayStation; they managed to squeeze every last drop out of it in the end. Imagine what you could do with a modern puter...
July 8, 2006 2:14:43 PM

I'm still not sure where you're getting 8x. I looked at the recommended requirements for CAD: it's 256 for 2004 and 512 for 2006. I think both numbers are low and should be a gig for both, 2 GB on XP, because as soon as you try to open a drawing of any decent size it's going to slow down. But as minimum requirements, accounting for one's threshold for pain, I can see half a gig.

I also question working fine at 500 MHz with 04. I don't know anyone who would want to struggle with that, and I haven't installed anything higher than 2000 on anything less than 1 GHz, simply because the cost of the man-hours to use it is simply not justified.

This is probably all beside the point, but I guess I'm just not understanding the rant.
July 8, 2006 2:15:41 PM

Quote:
My most beloved example is that of the PlayStation; they managed to squeeze every last drop out of it in the end. Imagine what you could do with a modern puter...


Your imagination is realised in the form of GNU/Linux :) 
Anonymous
July 8, 2006 2:18:55 PM

Quote:
My most beloved example is that of the PlayStation; they managed to squeeze every last drop out of it in the end. Imagine what you could do with a modern puter...


Your imagination is realised in the form of GNU/Linux :) 

I wish I could play games on a Linux system; this is the only thing that holds me back from deleting windoze...
July 8, 2006 2:26:12 PM

Quote:

I wish I could play games on a Linux system; this is the only thing that holds me back from deleting windoze...


You can play some games on it: Doom 3, Quake 3/4, Enemy Territory. But I have to agree, games are the only thing that keeps me running a dual boot with Windows.
July 8, 2006 2:45:07 PM

It's all about the money.

It takes longer to write optimised code and to use low-level programming languages. More work-hours mean a higher cost to develop an application.

Meanwhile the hardware manufacturers want to make money, so this is just a cartel between the hardware and software industries.
July 8, 2006 2:58:40 PM

I had a 1 MHz Commodore 64 in the early 80's... The 3D shoot-'em-up games on it got pretty amazing... Because they only had a 1 MHz CPU to deal with, they learned how to be good programmers... Now anyone today can call themselves a programmer, but really know nothing about programming for speed and efficiency...

Imagine if those same programmers had the 5 GHz(-like) CPUs and outrageous GPUs we have today... Kind of like when the game DOOM came out for PCs: it could do what it did because of great programming...
July 8, 2006 3:03:04 PM

Quote:
My most beloved example is that of the PlayStation; they managed to squeeze every last drop out of it in the end. Imagine what you could do with a modern puter...


Your imagination is realised in the form of GNU/Linux :) 

I wish I could play games on a Linux system; this is the only thing that holds me back from deleting windoze...

Well, all the PS3 games are written with Linux, so when those games are available you may be able to delete windoze, as I have...
July 8, 2006 3:17:26 PM

Quote:
I'm still not sure where you're getting 8x. I looked at the recommended requirements for CAD: it's 256 for 2004 and 512 for 2006. I think both numbers are low and should be a gig for both, 2 GB on XP, because as soon as you try to open a drawing of any decent size it's going to slow down. But as minimum requirements, accounting for one's threshold for pain, I can see half a gig.

I also question working fine at 500 MHz with 04. I don't know anyone who would want to struggle with that, and I haven't installed anything higher than 2000 on anything less than 1 GHz, simply because the cost of the man-hours to use it is simply not justified.

This is probably all beside the point, but I guess I'm just not understanding the rant.


I remember 1 GB for 2006, but who cares; I told you before, they practically eat your RAM and still haven't provided a "Quotes" option in their "Dimensions" menu. If you work with it for a long time, you will notice a lot of these inflexibility issues (unintelligent hatches, multilines, etc.). In my opinion, it could be a far more productive tool than it is.
July 8, 2006 10:19:07 PM

Quote:
Well, all the PS3 games are written with Linux


:roll: It uses a custom OS.
July 8, 2006 11:26:37 PM

Quote:
Well, all the PS3 games are written with Linux


:roll: It uses a custom OS.

A custom Linux OS. The way Sony makes it sound, it is closer to real Linux than, say, Mac OS X... Since their custom PS3 OS runs all Linux software, I would think it would be no big deal to go the other way and have Linux run PS3 games. If Linux needs some kind of emulator, I expect to see one released very quickly...
July 8, 2006 11:29:23 PM

I'm not even going to bother. You can live in your own little world.
July 8, 2006 11:36:20 PM

Quote:
I'm not even going to bother. You can live in your own little world.


I suppose you want me to google 'linux sony ps3' for you and get you some links? I don't follow the PS3, but I read http://www.linuxtoday.com/ every day...
July 8, 2006 11:44:16 PM

Quote:
I'm not even going to bother. You can live in your own little world.


google:
http://www.google.com/search?hl=en&q=linux+sony+ps3&btn...

here is a wikipedia link for you....
http://en.wikipedia.org/wiki/PlayStation_3
It has been confirmed that Linux will be pre-installed on the PS3 hard drive. Sony hopes that with its wide variety of features, PS3 will supplant the PC in the home.[21] In addition, Sony hopes that the presence of Linux in every PS3 will encourage independent content creation such as homebrew games.

PS3 to ship with Linux, Sony confirms
http://www.linuxdevices.com/news/NS2370343858.html

Get your facts straight before you start putting other people down... Educating you is like pulling teeth...
Anonymous
July 9, 2006 10:15:53 AM

Quote:
My most beloved example is that of the PlayStation; they managed to squeeze every last drop out of it in the end. Imagine what you could do with a modern puter...


Your imagination is realised in the form of GNU/Linux :) 

I wish I could play games on a Linux system; this is the only thing that holds me back from deleting windoze...

Well, all the PS3 games are written with Linux, so when those games are available you may be able to delete windoze, as I have...

You can't compare console games to PC games...
July 9, 2006 11:50:24 AM

Whatever operating system you use to develop applications, you generally still face the same issues, especially if you're using the same hardware architecture.
Using more RAM can be a way to speed up an application, and this doesn't have to have anything to do with lazy programming. If you keep your data set in RAM, then performance can be significantly improved. When you look at how cheap RAM is compared to CPU power, this makes a lot of sense. If an application can be made to run significantly faster with an extra 512 MB footprint, then that seems a good deal to me. You can't buy much extra CPU power for the cost of the RAM, ~£25.
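
As a toy illustration of that trade-off (my own sketch, not code from any shipping product; the file name and format are made up): read the data set into an in-memory index once, and every later lookup is a hash probe instead of a disk seek.

    #include <fstream>
    #include <iostream>
    #include <string>
    #include <unordered_map>

    // Load a hypothetical "title<TAB>path" catalogue file into RAM.
    // Memory cost is proportional to the file size, but every later
    // lookup becomes an O(1) hash probe with no disk I/O at all.
    std::unordered_map<std::string, std::string>
    load_catalogue(const std::string& filename) {
        std::unordered_map<std::string, std::string> index;
        std::ifstream in(filename);
        std::string line;
        while (std::getline(in, line)) {
            std::string::size_type tab = line.find('\t');
            if (tab != std::string::npos)
                index[line.substr(0, tab)] = line.substr(tab + 1);
        }
        return index;
    }

    int main() {
        auto index = load_catalogue("tracks.tsv");  // pay the RAM cost once
        auto hit = index.find("Some Song Title");
        if (hit != index.end())
            std::cout << hit->second << "\n";  // served from RAM, no seek
        return 0;
    }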

I wrote my own Windows MP3 player as I couldn’t find one that could handle my very large music collection without slowing to a crawl. It was actually embarrassing to see how some of the big name media players were totally humbled by a large music collection.
I made the decision to optimize it for speed and not to worry about the RAM requirements. In the end it doesn’t even consume much RAM especially compared to iTunes, which I subsequently discovered can handle large music collections in an elegant way. The irony is that iTunes wasn’t even initially a native Windows application; kudos to Apple.

I was sad to see this develop into another MS vs Linux rant :(  Software fanboys are just as sad as the hardware variety to me. I really don't care what OS you like or dislike; try and keep on topic, people. If only.....
July 9, 2006 12:24:08 PM

There wasn't a single piece of home-user, non-open-source software that supported HT (not even the pretty physics-heavy HL2). Enough for an answer? =)
Of course, enterprise software already supported multithreading years ago, but that kind of software isn't for us :) 
July 9, 2006 12:58:07 PM

Do you feel today's software is failing to unleash the full potential of new processors? Are we all being fooled by hardware/software vendors?

Because people make bad software in a bid to piss us off? That doesn't happen. Stop being stupid.
July 9, 2006 3:16:57 PM

Quote:
Just taking a look at 2000's games and today's, one doesn't notice the increase in detail or overall performance that the transition from a P3 700 MHz to a P4 3.0 GHz should give. I used to play Porsche Unleashed (2000) on a P2 400, but you can't dream of doing the same with NFS Underground 2, though there's not much added.
AutoCAD 2004 recommends 128 MB of RAM; that ramps up to 512 for 2005 and 1 GB for 2006. HOLY SH*T, 8X the RAM, while there can be no more than a 2% difference between '04 and '06. Programmers used to do a lot of optimisation to squeeze performance out of less powerful CPUs, but now, with powerful platforms, they just don't, and it's no surprise how fast they (and we all) hit the ceiling of the newest technology.


If you write code for a game, it pays to be efficient; if you are writing a biz app, then it pays to be on time and to use a programming language that is just right for the task and has good support, tools, and a development community.

I've seen companies port their pure C/C++ code into the .NET C#/C++ world, and it suffered badly. Why? Because they haven't figured out, nor do they care about, the inner workings of the language or the technology their software will be running on. Developers generally figure that by the time they could fix something, a new, much more powerful processor will have come out which will help with performance.

I think that is the reason a 2003 edition of product XYZ lists the memory minimum at 256 MB while the 2006 version lists it at 1024 MB: not because they added code that is so weird and complex, but because they know their existing code sucks and needs more RAM to run fast enough to justify the upgrade. lol
July 9, 2006 4:30:13 PM

Quote:
Do you feel today's software is failing to unleash the full potential of new processors? Are we all being fooled by hardware/software vendors?

Because people make bad software in a bid to piss us off? That doesn't happen. Stop being stupid.


This crappy world is all about people in a bid to piss off other people; if you haven't understood this yet, then I can't help you.
July 9, 2006 4:38:24 PM

Quote:
Do you feel today's software is failing to unleash the full potential of new processors? Are we all being fooled by hardware/software vendors?

Because people make bad software in a bid to piss us off? That doesn't happen. Stop being stupid.


This crappy world is all about people in a bid to piss off other people; if you haven't understood this yet, then I can't help you.

Are you being serious? You really think companies think of ways to make your day harder?
July 9, 2006 4:52:23 PM

Companies think of ways to make their own day easier, and that often conflicts with our interests; we want performance and they want money, but we are not always given performance worth the money we pay. Most companies come up with yearly versions of their software that require better hardware for a tiny fraction of improvement. That's the concept I wanted people to talk about.
July 9, 2006 5:03:35 PM

Quote:
Companies think of ways to make their own day easier, and that often conflicts with our interests; we want performance and they want money, but we are not always given performance worth the money we pay. Most companies come up with yearly versions of their software that require better hardware for a tiny fraction of improvement. That's the concept I wanted people to talk about.


I agree that corners are cut a lot of the time to meet deadlines and get products on the market, but no company is going to make bad software deliberately. Also, people take for granted how sprawling and ever-changing the PC market is. Code is constantly out of date and in need of improvement, but that is the nature of the PC.
July 9, 2006 5:13:51 PM

Companies don't do that deliberately, of course, but their products' performance really has been degenerating. Taking the AutoCAD example again: I go mad that there still isn't a "Quotes" function in the "Dimensions" menu. It would take them at most a day of coding, to be cautious, and still they haven't done it. :evil: 
July 9, 2006 5:38:02 PM

So what do you guys suggest? Start programming things in assembly again?
July 9, 2006 6:20:38 PM

Quote:
Programming is hard, even for quite bright people. That means creating software is expensive: you've got a lot of highly skilled man-hours to pay for. The extra effort and cost of producing highly optimised code makes no sense financially if cheap, powerful hardware is readily available.


1. Those who say it is hard should quit and get a job as a gardener. I am sick of those lame excuses.
2. There is no need for extra effort, just for a paradigm shift. They need to write with optimization in mind, not optimize already (poorly) written code.

Quote:
Really, the challenge for a programmer is not to write code for a machine to read; it's to write code which people can read. Easy-to-maintain code is less expensive, and that's where the main effort goes. It may not be perfectly optimised, but it doesn't need to be. It just has to be good enough.


1. Writing readable programs is one thing, but we are going to extremes, as if Average Joe were going to read the code and not other programmers. IMO, if you cannot understand a program written by a fellow programmer, then you are not a programmer but an Average Joe.

2. Code which is "human readable" is often not easier to maintain. The opposite is also true: optimized code doesn't have to be hard to maintain.

Quote:
I don't personally think software is falling behind hardware overall; however, there is always a period of catch-up to be expected.


Well, I do think so, and I believe I have some proof:

- The Pentium 4 was introduced back in 2000, 6 years and counting, and still many programs do not use SSE, not to mention SSE2 instructions.

- There have been numerous updates to the Windows XP kernel since the Prescott core was introduced, and still it doesn't use MONITOR/MWAIT for thread synchronization on those CPUs.

- Take any average game or application and disassemble it; the chances that you will end up looking at a legacy code mix which penalizes the latest CPU architectures are in the range of 95% and up.

- Aside from some multimedia advances (video and audio recording, editing, etc.), the PC is still most often used as a typewriter.

- Take a good look at Vista and tell me: how does it change the way you interact with the computer?

You still use mouse and keyboard, you still have a Start Menu, you are still clicking on icons, running Setups, opening, resizing and closing windows, organizing files using folders and partitions, etc. You were able to do all that with Windows 95 on a Pentium 1 with 32 MB of RAM, weren't you?

Now take a look at the hardware requirements for Vista itself: 15 GB of disk space, 2 GB of RAM, and a DX9-compatible video card. For what?!? To be able to organize files and launch applications! It is ridiculous.

There is yet another worrying trend to consider: instead of forcing programmers to thread existing applications to enable functional and data parallelism, hardware vendors are offering them an easy way out in the form of Reverse HyperThreading and its equivalents.

I can conclude a few things from observing the current software situation:

1. Programmers nowadays write "readable" code, but they almost never come back to read (and improve) it.

2. Optimization is still a taboo. Many programmers and their CEOs do not understand that, applied correctly, it adds considerably neither to the schedule nor to the cost.

3. Everyone can be a programmer nowadays with the advent of .NET, Visual Basic, C#, Java, etc. Being provided with so much abstraction, programmers often neglect the basic principles of programming, which are:

- Always have target platform specifics in mind
- Do not use brute force
- Do not waste resources

One of the most common errors I have seen is accessing arrays in the wrong order, as the sketch below shows. It happens because the "programmer" doesn't know shit about memory organization.
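
Here's a generic C++ sketch of that mistake (my own example, not taken from any particular application). C and C++ store 2-D arrays row-major, so the inner loop should walk along a row; swap the loops and every access strides a whole row ahead, and the cache can do nothing for you.

    #include <chrono>
    #include <iostream>

    const int N = 4096;
    static float grid[N][N];  // row-major: grid[row][col] and
                              // grid[row][col+1] are adjacent in memory

    int main() {
        using clk = std::chrono::steady_clock;

        auto t0 = clk::now();
        for (int row = 0; row < N; ++row)        // cache-friendly order:
            for (int col = 0; col < N; ++col)    // consecutive addresses
                grid[row][col] += 1.0f;
        auto t1 = clk::now();

        for (int col = 0; col < N; ++col)        // wrong order: each access
            for (int row = 0; row < N; ++row)    // jumps N floats ahead and
                grid[row][col] += 1.0f;          // keeps missing the cache
        auto t2 = clk::now();

        std::cout << "row-major:    "
                  << std::chrono::duration<double>(t1 - t0).count() << " s\n"
                  << "column-major: "
                  << std::chrono::duration<double>(t2 - t1).count() << " s\n";
        return 0;
    }

Both loops do exactly the same arithmetic; on typical hardware the second one runs several times slower, purely because of memory access order.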

4. Code reuse is retro. Everyone is talking about it, and it seems like nobody uses it, because applications get bigger and bigger when they should be getting smaller instead. Just check all those DLLs and EXEs; I bet each one of them has its own strcpy() and strlen().

5. You get much more for the money you invest in hardware than for the same amount of money invested in software.

Quote:
So what do you guys suggest? Start programming things in assembly again?


I wouldn't go that far. I use assembler, but more often it is enough just to write code so that the compiler can read it, and in most cases it will be able to optimize it nicely. Assembler is there for fine-tuning when you already have healthy legs to stand on, or for tasks where the compiler can't cope with the data flow you need.
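
For instance (my own generic sketch): a plain loop like this, with no branches or aliasing tricks, is exactly the kind of code a vectorizing compiler can map onto SSE by itself, four floats per iteration, with no assembler in sight.

    #include <cstddef>

    // Written so the compiler can "read" it: a simple induction variable,
    // no early exits, straightforward indexed accesses. A vectorizing
    // compiler (e.g. gcc at -O3, or icc) can emit SSE code for this,
    // inserting its own runtime check that dst and src don't overlap.
    void scale_and_add(float* dst, const float* src, float k, std::size_t n) {
        for (std::size_t i = 0; i < n; ++i)
            dst[i] += k * src[i];
    }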

What I would suggest instead is making knowledge of the underlying hardware, assembler, and even the compiler a requirement for a programmer.
July 9, 2006 6:52:44 PM

Quote:
Do you feel today's software is failing to unleash the full potential of new processors? Are we all being fooled by hardware/software vendors?

Because people make bad software in a bid to piss us off? That doesn't happen. Stop being stupid.


This crappy world is all about people in a bid to piss off other people; if you haven't understood this yet, then I can't help you.

Are you being serious? You really think companies think of ways to make your day harder?

I think Microsoft unintentionally figures out ways to make my day harder. Do they set out to do it? No. Do they do it? Yes.

Ha. You love Bill really. You know you would hug him if you had the chance. :D 
July 9, 2006 7:29:18 PM

If only to have a chance to pocket his wallet... :wink:
July 9, 2006 8:52:27 PM

Quote:
The Pentium 4 was introduced back in 2000, 6 years and counting, and still many programs do not use SSE, not to mention SSE2 instructions.


SSE was actually introduced with the P3, and Intel has supported the various versions of SSE with software libraries for Windows, and more recently for Linux, I believe. I developed software that used the libraries from the beginning, when Intel gave away the Signal Processing software suite; I believe they charge for it now. The beauty of the library is that you don't call a particular version of SSE from within your program: it knows what version of SSE is available and maps calls to routines optimized for that version. I wrote software that used SSE, and when SSE2 came out I got the benefit of it with a negligible amount of coding. Intel makes a large number of optimized software suites that utilise SSE 1-4 and even MMX :) 
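
The dispatch idea is roughly this; a generic sketch of the technique, not Intel's actual library API (the function names are mine, and __builtin_cpu_supports is a GCC/Clang builtin): pick the best routine once, based on what the CPU reports, and route every later call through a pointer.

    #include <cstddef>

    // Portable fallback that runs on any x86.
    static void add_plain(float* dst, const float* src, std::size_t n) {
        for (std::size_t i = 0; i < n; ++i)
            dst[i] += src[i];
    }

    // Stand-ins for hand-optimized variants; in a real library these
    // would be separate SSE and SSE2 implementations.
    static void add_sse(float* dst, const float* src, std::size_t n)  { add_plain(dst, src, n); }
    static void add_sse2(float* dst, const float* src, std::size_t n) { add_plain(dst, src, n); }

    using add_fn = void (*)(float*, const float*, std::size_t);

    static add_fn pick_add() {
        __builtin_cpu_init();  // populate the CPUID feature flags
        if (__builtin_cpu_supports("sse2")) return add_sse2;
        if (__builtin_cpu_supports("sse"))  return add_sse;
        return add_plain;
    }

    // Callers just use add_arrays(); the CPU check runs once, on the
    // first call, and supporting a newer instruction set later means
    // one new branch in pick_add(), not changes all over the program.
    void add_arrays(float* dst, const float* src, std::size_t n) {
        static const add_fn impl = pick_add();
        impl(dst, src, n);
    }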
July 9, 2006 8:53:53 PM

Quote:
So what do you guys suggest? Start programming things in assembly again?


I'm talking about MORE INTELLIGENT PROGRAMMING; just that will do. We always have greater CPUs on one side, but software in many fields is not keeping up, so I guess C++, Perl, Python, etc. will still do :wink: