Why is software 7-8 years behind CPU technology?

June 11, 2007 11:19:26 PM

This is kind of a re-hash of a reply that I made in another topic, but I thought that it might serve better as its own thread, to see what comes out of it. Sorry if you've read this twice, but I think that I went a little off-topic in the last thread.
I really don't know what to make of the whole state of software right now. As far as CPUs go, I don't understand the real technical details, like why they are having problems with clock speeds on smaller dies or whatever. It seems to me that quad-core/single-die CPUs would not bring any significant increases to the table if you still are running an OS that doesn't even utilize 2 cores fully, let alone 4. I know that AMD has their lab in Europe that is supposed to be working on software/hardware integration, but if Micro$oft isn't involved or helping, what's the upside? I think that it's just such a waste of effort and money to market and release a product that can't be utilized to its capacity because the software won't utilize it for another 5 years or more.
Now, I know some will say: "But the software will come." Bullshit! HOW LONG HAVE WE WAITED FOR 64-BIT APPLICATIONS/GAMES TO GO MAINSTREAM?! Micro-fraud has been saying "64-bit will be the standard 2 years from now" for over 5 YEARS! You have more potential locked away in the hardware that you have now than will come out "new" in the next 2 years. I can guarantee that everyone who reads this has a 64-bit processor, but maybe 5 of you use Win64. Why? Because it's not supported on the back end. How long ago did they release the 64-bit CPUs? Hell, I bought my Athlon64 2000+ back in 2000. Why don't they force 64-bit computing down everyone's throats like they did Crapta on new computers? Hell, why did they even bother making a 32-bit version of it? Oh, yeah... backwards compatibility with old products and software from when I owned a Gravis Ultrasound and Doom was the newest thing. Why not force your "partners" to make 64-bit games, drivers, and applications available? I don't know anything, but I bet that someone will tell me why...
Seems to me that all this forward hardware development is great, and AMD has some great things in the works (IF it works and WHEN it works are 2 totally separate arguments). I think that instead of shrinking the dies and making the same products smaller, they should work on software integration and enable us to use the true potential of the chips that we have now.
Oh, wait... that would kill their sales. Never mind.

Postscript: I do understand the economics of this, with software makers wanting their products to be available to the largest number of potential users. What troubles me is that no one seems to even care about the potential that's there for 64-bit computing when EVERYONE who has bought a computer in the last 5 years has a 64-bit-capable rig, yet no one (read: software companies) wants to make the jump. This is SAD... :cry:
June 11, 2007 11:30:58 PM

Linux has supported 64-bit for years... Can't see why M$ and Co. can't get their finger out if the GNU crowd can pull it off for nothing... :lol:
June 11, 2007 11:44:55 PM

With the way the amount of RAM needed by PCs is increasing, I think the MS prediction of 2 years now holds true. More and more people are buying 4 GB of RAM for their Vista rigs; there are even 2x2 GB RAM kits. I'm sure you know that you can't use all 4 GB of RAM with a 32-bit OS, so eventually end users are going to be forced into 64-bit by their requirements.
Now, the enthusiast crowd is going to move into this area of needing 64-bit before the mainstream (mum + dad, office users), so software companies may ignore us somewhat for a while, as the mainstream is where the money's at.

Basically the transition is going to be slow and painful, but it is going to happen soon.
June 11, 2007 11:52:45 PM

Try Linux. Just a thought. Very true indeed, not many apps are 64-bit, let alone multithreaded.
June 11, 2007 11:58:57 PM

Quote:
Try Linux. Just a thought. Very true indeed, not many apps are 64-bit, let alone multithreaded.


Not really a solution; Linux has the exact same problem as 64-bit: lack of software/support.

And just so I don't get minced: by software I mean programs like full-budget 3D games, industry-standard software (CATIA/AutoCAD; not sure if these have Linux counterparts) and things like that. I've used Linux (Ubuntu) before and am well aware of the abundance of software that you can get for free :D
June 12, 2007 12:00:42 AM

Quote:
With the way the amount of RAM needed by PCs is increasing, I think the MS prediction of 2 years now holds true. More and more people are buying 4 GB of RAM for their Vista rigs; there are even 2x2 GB RAM kits. I'm sure you know that you can't use all 4 GB of RAM with a 32-bit OS, so eventually end users are going to be forced into 64-bit by their requirements.
Now, the enthusiast crowd is going to move into this area of needing 64-bit before the mainstream (mum + dad, office users), so software companies may ignore us somewhat for a while, as the mainstream is where the money's at.

Basically the transition is going to be slow and painful, but it is going to happen soon.


You've hit the nail on the head. There hasn't been a good reason. CPU performance has been good enough for desktop users, and RAM usage hasn't hit the limit until recently. And 64-bit Linux usage is mainly in the server space, since there's no 64-bit Flash support, meaning a lot of websites don't work.

Now, though, we're starting to hit the limit, with 4 GB machines all over the place.

I've been using 64-bit Linux on the servers I administer for several years now, and it's a must for me, but until 64-bit builds of some of the critical packages I use for consumer stuff (as opposed to server stuff), like Flash, exist, I'm not switching, even if it's faster.
June 12, 2007 12:01:48 AM

Quote:
How long ago did they release the 64-bit CPUs? Hell, I bought my Athlon64 2000+ back in 2000.


I'm sure other people saw this, and I might very well be wrong, but for consumers I think they only had 32-bit Athlon XPs/Durons on Socket 462 back then, plus the P3 and P4... the move to Socket 754 a few years later brought 64-bit/32-bit capability... but even with 64-bit-capable CPUs, the average user wasn't using anywhere near 4 GB of memory... it was more like 512 MB - 1 GB (the average enthusiast still only has ~2 GB)... so the main reason for 64-bit wasn't quite needed, for most people... it still isn't quite needed, but it's getting there as software becomes more bloated (read: Vista).

There are performance improvements going 64-bit, no doubt... and with Vista 64 being out, and actually supported (unlike XP 64), we're already seeing software being released... but the timeline isn't anywhere near 7-8 years; it's closer to half that... so only 4 or so years ago most people didn't have 64-bit S754 CPUs. A few did, I'm sure, the early adopters... but the software industry isn't going to change overnight for a few early adopters either.
June 12, 2007 12:44:02 AM

The real question would be: why go 64-bit? There really isn't even a need today for the average consumer. Without the consumer there is no money in it. With no money there is no software. M$ is going there because their next OS will be crippled without it (speculation).

In most applications there wouldn't be huge gains in performance in the first place. Most consumers are using PCs for the Internet, email and office-type stuff, and you don't need much RAM for that. I would almost challenge you by asking why developers can't write code that doesn't use so much RAM in the first place. Think about it: there are printer drivers that are 500 MB downloads; that's the equivalent of hundreds of thousands of pages of plain text.

I used to have a C64 back in the day, with 64 KB of RAM, and it could do quite a bit with that RAM. Word processing, check; gaming, check; and I'm sure, if they had been around in the mainstream, email and surfing too.

My first PC was a 386SX-25 with 4 MB of RAM and a 42 MB hard drive. Today it's an E6600 with 4 GB of RAM and around 1 TB of disk space. It's true that it's faster and can do more, but it's not what I would have expected considering the computing power.

64-bit has its purpose, for sure, and it's used in server environments where vast amounts of data are managed. In consumer applications it's mostly a crutch for sloppy code running on a bloated OS. Don't worry, though, it's just around the corner now; just hope that we're not all underwhelmed...
June 12, 2007 12:46:27 AM

I think it has a lot to do with the original concept of the PC architecture: a lower-cost alternative to the Motorola and SPARC processors. Those were running 64-bit, multithreaded applications years before others even considered it. The x86 architecture (in my opinion) was trying to compete by offering more raw power through MHz, and eventually GHz, at a lower cost. Motorola and SPARC were and are powerful processors, but had lower frequencies. Granted, server applications came online in 64-bit well before desktops, but it would be a big loss for software companies not to offer legacy support. If people can't find what they need for what they have, they will either pirate the software or use open source. I would love to see a large shift in software, but until the masses speak up, we will have to deal with what makes the most money for these companies, and unfortunately that is 32-bit software.
June 12, 2007 12:56:08 AM

OldGoat said:
Quote:
I can guarantee that everyone who reads this has a 64-bit processor, but maybe 5 of you use Win64. Why?


That's absolutely not true. Why would someone need a 64-bit processor + a few gigs of RAM to read this? The machine I'm using right now only has 128 MB of RAM, a 600 MHz 32-bit Intel Celeron processor, and a 2.5 GB hard drive. It is certainly capable of viewing this thread.
June 12, 2007 1:08:58 AM

You are blaming Microsoft for the low quality or complete lack of drivers for Windows 64 and/or Vista 64. You do realize that these drivers are not Microsoft's problem, but need to be written and tested by the companies who make the hardware? For example, the 8800 GTX still has problems on Vista, especially with SLI, but Vista has been available to nVidia's programmers for a year or so. Blame nVidia, not Microsoft. By the way, Microsoft has finally noticed the problem, and driver certification for Vista 32 now requires that the drivers for Vista 64 also pass the tests. That should soon make Vista 64 just as reliable as Vista 32; at least, that's what MSFT is trying to achieve.

Multithreaded programming exists, but it requires very good programmers and lots of testing. It's hard to achieve this in an open-source OS simply because good programmers and testers often want to be paid a lot. BTW, that also explains why software is not ready for multicore CPUs yet: most programmers don't have the brainpower and skills to write big, correct multithreaded programs. Easy things, yes, sure, I wrote a couple myself, but when it comes to something massive you won't find enough skilled people. Companies don't push for it too much either, because it would increase the cost of development and the risk of delays.
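
Just to make the "needs very good programmers" point concrete, here is a minimal sketch of my own (not taken from any real product) of the classic trap: two threads bumping a shared counter. Drop the mutex and updates silently get lost; keep it and the code is correct but slower, and that trade-off has to be reasoned about across millions of lines in a big application.

Code:
#include <pthread.h>
#include <stdio.h>

#define ITERATIONS 1000000

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

/* Each thread increments the shared counter. Without the mutex, the
   read-modify-write of counter++ races with the other thread and
   some increments are lost. */
static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITERATIONS; i++) {
        pthread_mutex_lock(&lock);
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    /* Correct answer is 2000000; remove the locking and it usually isn't. */
    printf("counter = %ld\n", counter);
    return 0;
}

(Build with something like gcc -pthread on Linux.)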
June 12, 2007 2:02:57 AM

I think the OP hasn't taken into account that the symptom he's looking for an answer to is really one problem with two threads to it.
First of all, why hasn't support taken off for the 64-bit OS, and second, why hasn't support or good use been implemented for dual core?
From my way of thinking, software is always going to lag behind hardware. The hardware needs to be there for the programmers to take advantage of. Once it is, they are only going to start to develop in a big way when demand is there. So once the hardware is there, an OS has to be written for it; there's a couple of years gone. Once that is in place, the multitude of companies and programmers can go away and spend 6 months learning it before starting on some sellable software. So that covers the 64-bit OS, with a fair chunk of his time frame eaten into.
Next, dual core, which will pretty much add a fair amount of dev time to any software due to its complexity.
So yeah, that pretty much covers his 5-year wait.
Oh yes, as an afterthought: how long would it take the bigger companies to develop software that allows your average Joe programmer to write software for dual-core utilization without having to delve into the complexities of load balancing, etc.?
Well, those are my views and thoughts. May be wrong, may be right; just my opinion.
June 12, 2007 2:04:58 AM

This is my first post, but I'm a gamer to the core. Something that has always bothered me is the usage of Oblivion as a benchmark. I highly suspect that maybe this software might not have been given much thought as to what hardware was currently available. (It was the first game that choked my machine: AMD 4000+, X850 XT.) I think programming is sloppier than most people are willing to admit. I can't wait until there's a viable alternative to MS, as it's the king of bloatware.
June 12, 2007 2:48:04 AM

I think you've got something confused. Vista can support four cores; it's four sockets that's the problem.

"HOW LONG HAVE WE WAITED FOR 64-BIT APPLICATIONS/GAMES TO GO MAINSTREAM?"

If this is really such a problem, why not buy a Nintendo 64 and be happy? We don't have those kinds of games because there's not an overwhelming need for them. Programming and consumer desire are part of the equation, but the truth is, we really just don't need it yet. Lots of applications out there throw 128-bit code at a processor, and the only thing that happens is that it's broken down into two 64-bit packets instead (I think).

Dual core is also nice, but the need just isn't there, especially when most games fall onto the graphics processor for their needs rather than the central processing unit. A game that utilizes x number of cores: who cares, if it's buggy and the graphics are choppy?
June 12, 2007 2:59:17 AM

Apple's new OS runs 64-bit + 32-bit natively at the same time, or something or other :D
June 12, 2007 4:13:31 AM

Quote:
OldGoat said:
I can guarantee that everyone who reads this has a 64-bit processor, but maybe 5 of you use Win64. Why?


That's absolutely not true. Why would someone need a 64-bit processor + a few gigs of RAM to read this? The machine I'm using right now only has 128 MB of RAM, a 600 MHz 32-bit Intel Celeron processor, and a 2.5 GB hard drive. It is certainly capable of viewing this thread.



Perhaps he means that because 64-bit processors have been out for years now (desktop, that is), there's a pretty damn good chance that you have a 64-bit processor? Just a thought. Not so sure he meant that you NEEDED it to view the thread...
June 12, 2007 10:48:25 AM

Think of the millions of lines of code in any mature software product out today.
Photoshop, 3ds Max, game engines, etc., etc.
Millions of dollars have gone into writing that code, going back years. That is an investment most software companies can't afford to throw away and rewrite for multithreading and 64-bit. If it ain't broke, don't fix it.

In terms of software efficiency: it takes time to write very tight, highly optimized code. Time is money, so it's much easier to use a huge library of code even if you are only using one function, rather than unpick it just to extract what you need. And you never know what dependencies there are on other functions within the library.
June 12, 2007 11:53:46 AM

Quote:
OldGoat said:
I can guarantee that everyone who reads this has a 64-bit processor, but maybe 5 of you use Win64. Why?


That's absolutely not true. Why would someone need a 64-bit processor + a few gigs of RAM to read this? The machine I'm using right now only has 128 MB of RAM, a 600 MHz 32-bit Intel Celeron processor, and a 2.5 GB hard drive. It is certainly capable of viewing this thread.

You need to reread my post, cowboy; I do not believe I said that...

For most people, older systems are just fine, because they just use them for general surfing of the Internet, email, and writing letters. The OP is wondering why software has not forced the market to go 64-bit.
I have five computers at my house, and all but one of them still has a PIII, K6 or lower processor. My laptop has a 1.8 GHz Turion 64. So I fully understand the potential that is still left in older computers.
June 12, 2007 1:00:25 PM

Simply because it's harder to write the software!
June 12, 2007 1:03:15 PM

Reading this thread almost makes me cry. Simple misinformed guessing and ranting.

64-bit belongs to the server space. Thanks to massive advertisement, everyone believes it's the real thing. While 64-bit is indeed an improvement over 32-bit, the advantage for everyday computing isn't that big.
Microsoft caters to both home and professional users, and in the professional space 64-bit is a requirement. Servers need a lot of memory; they need every microsecond of speed that can be squeezed out of a CPU, and everyone who has had to program a simple multiplication in 16-bit assembler code will understand that concept.
AMD advertised their 64-bit advantage over Intel's failing Itanium concept, and every village idiot sucked it up like Kate Moss does cocaine.
Now, why is 64-bit showing up on the desktop? Well, since Microsoft killed the Win98/WinMe line of its OSes, the only thing left was the heirs of NT, which is, as the really bright ones might have guessed (it's okay if you checked Wikipedia), their workstation/server OS. And now we get 64-bit in software for the same reason we got it in hardware a gazillion years ago: no, not because AMD and Intel wanted everyone to be able to put their phone number into a single register, but because it's a byproduct of the server space.

64-bit wasn't developed because games run faster on it, or because 64-bit makes your e-penis twice as long as 32-bit.
June 12, 2007 1:19:58 PM

For most applications I don't think 64-bit is really necessary... Well, maybe if you have Windows hogging up a couple of GB of RAM just to surf the net, yes. :D
32-bit has been more than enough for any requirement I have thrown at it. HDD speed has usually been much more of a bottleneck. Hmmm... maybe if I can get 320 GB of RAM set up as a virtual disk... drool drool
June 12, 2007 1:34:53 PM

If you want applications that use all of the features your hardware has, run an open-source OS like Linux or one of the BSDs and compile it, and all of your programs (which must also be open source), from scratch. You can then take advantage of all of the features of your hardware, whether it be 64-bit, SSE3, or something else.

I use Gentoo Linux on my desktop and it's an excellent distribution. It's not for new Linux users, though. FreeBSD is similar to Gentoo, and you can also compile apps through its ports package manager. But if you're new to Linux/BSD, I'd suggest using a binary distribution that's straightforward, like Ubuntu Linux, and then recompiling packages using apt-build. But even if you do not compile a single line of code yourself, almost all Linux and BSD distributions have 64-bit versions in which 99.99% of the applications are also 64-bit. The only apps that are not 64-bit are the proprietary ones, like Adobe Reader and Flash Player. There are tens of thousands of 64-bit Linux and BSD applications out there, and they ship right alongside the 32-bit ones.
June 12, 2007 1:40:45 PM

Simple answer: there is, at this time, no need for the change. There certainly wasn't a need back when 64-bit desktop processors first became common, and there probably won't be a need for another two or three years from this point.

The purpose of 64-bit processing is not to create faster computers, but to allow a solution to the 32-bit memory pointer problem: with only 2^32 possible pointer values, you have only about 4.3 billion possible memory addresses. That wasn't a problem back when the x86 architecture came out and 16 MB of memory was a hefty amount, but with many of today's motherboards supporting 8 gigabytes or more, it will eventually come up.

But it hasn't. Not yet, at least. BioShock on Vista may look to be a memory hog, but most estimates put the sweet spot at less than 3 gigabytes of RAM.

As more and more programs are created with the need to map more than 2-3.8 gigabytes of memory, we'll see 64-bit software come into play, but right now making any program 64-bit when it does not meet those requirements would only result in needlessly swollen pointers and driver compatibility issues (in Windows XP or Vista 64-bit; Mac OS X 64-bit instead ends up with a slower kernel).
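
To put a rough number on the "swollen pointers" point, here's a small sketch of my own (assuming a typical GCC setup, where a 32-bit ILP32 build uses 4-byte pointers and a 64-bit LP64 build uses 8-byte ones):

Code:
#include <stdio.h>

/* A toy linked-list node. On a typical 32-bit build it is 8 bytes; on a
   64-bit (LP64) build it grows to 16 bytes (8-byte pointer plus padding),
   even though it never holds more data. Meanwhile, 2^32 addresses cap a
   32-bit process at 4 GiB, minus whatever the OS reserves for itself --
   hence the ~2-3.8 GB usable figure above. */
struct node {
    int value;
    struct node *next;
};

int main(void)
{
    printf("sizeof(void *)      = %zu bytes\n", sizeof(void *));
    printf("sizeof(struct node) = %zu bytes\n", sizeof(struct node));
    return 0;
}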

Why was this sort of thing being marketed back then, if there weren't going to be any 64-bit applications even five years later? Market saturation helps when we do eventually need to switch. It's probably been cheaper for AMD and Intel to deal with this architecture now rather than having to keep both a pure 32-bit and a 32/64-bit architecture moving at the same time.
June 12, 2007 2:00:47 PM

Quote:

From my way of thinking, software is always going to lag behind hardware. The hardware needs to be there for the programmers to take advantage of.


Excellent point. How the heck am I going to write programs that use multiple cores if I only have a single-core CPU? Same goes for DirectX support in games, and so on. By the time you can buy the latest and greatest hardware, some programmers have had it for weeks; others won't even get it for weeks. You have to allow a year or so for them to do the work. I admit, though, that 7 or 8 years is ridiculous :roll:

Microsoft and Sun have already done a lot of work in their frameworks (.NET, J2EE) to make it easier for developers to use multithreading. Expect some results one of these days :p
June 12, 2007 2:16:27 PM

Not old enough to remember the 16-bit/32-bit transition in hardware and software, I see? The Intel 386 was introduced in 1986. Yet 32-bit software did not take off until well after 1995, with Windows NT introduced in 1993 and Windows 95 in 1995. :roll: Only with Windows 2000 did 32-bit software become generally used.

I have a feeling that it might take even longer with the 32-bit/64-bit transition before the majority uses 64-bit software in Windows.
June 12, 2007 2:37:56 PM

I don't have a 64 bit system.
June 12, 2007 2:43:01 PM

Yep... even today people use 8-to-10-year-old computers that weren't the bleeding edge even back then...
Multicore has been available for some years. Of all my friends, about one in ten has a multicore processor now... And they are still selling single-core processors; I am not sure, but I think that some low-end processors (the mass-market machines) are even 32-bit now. So it will take something like 10 years before there are enough multicore and 64-bit computers that it's worth making programs only for them...

Einheriear above is quite right!
June 12, 2007 2:54:28 PM

The simple way to look at it: we are all in the business of making money. You can't sell a proc unless it is better than the old one (64-bit was oversold, in my opinion).

Then on the other side, you don't want to spend more than you have to. Why would software devs rewrite, or write, software to support a minority of users?

Yes, 64-bit is the way to go, but until devs are forced to program for it, or it becomes economically viable, they won't, simple as that. Intel and AMD will always bring out something that sounds good (even if you don't need it now) to make you buy the product.

I look after many business systems, and none of the ones used for normal office work needs more than 1 GB of RAM on XP, and the processing power is more than enough. The office environment, as far as I am aware, is where most of the money is for general apps. If companies don't need it, why would they pay for R&D to give it to general home users?

The only people that need all this power are gamers and users of high-powered apps, ohhh, and the poor souls that have got Vista and want the eye candy. This equates to only a small percentage.
June 12, 2007 2:56:24 PM

Here's a thought on your OP.

How much is forward software progress affected by software piracy?

Does piracy cause resources to go towards copy protection instead of covering the full spectrum of CPU capabilities?

Do software programmers expect to be paid for their work to support their families?

What are the long-term effects of getting software for free?

So how is software piracy affecting today's and future development?

How is a software development company that invests millions in development that's stolen right out the door dealing with those kinds of losses? How do they pay their employees when their product doesn't return a profit, how does it affect us, and will it affect us in the long run?

I'm not talking open source here; I'm talking software development for profit, to be able to stay in business. The more the software is stolen, the more possibility that we have no software, period.

So we'd all better not just learn Linux but become open-source software developers ourselves; that's probably going to end up being our future, 'cause software piracy will not stop until there's nothing left to be stolen.

I think I just had a brainfart! 8O

No offence, but just something to think about.
June 12, 2007 2:57:09 PM

You can write a program that uses multiple threads on a single-processor system, so the development process does not require a multi-core CPU in this case.

Quote:

From my way of thinking, software is always going to lag behind hardware. The hardware needs to be there for the programmers to take advantage of.


Excellent point. How the heck am I going to write programs that use multiple cores if I only have a single-core CPU? Same goes for DirectX support in games, and so on. By the time you can buy the latest and greatest hardware, some programmers have had it for weeks; others won't even get it for weeks. You have to allow a year or so for them to do the work. I admit, though, that 7 or 8 years is ridiculous :roll:

Microsoft and Sun have already done a lot of work in their frameworks (.NET, J2EE) to make it easier for developers to use multithreading. Expect some results one of these days :p
June 12, 2007 3:08:39 PM

What I know is that hardware is based on bits (1 and 0), but software is based on logic. It means the programmer must think about the question and the two possible answers at the same time.

For example, take a small program that converts temperatures, written in BASIC or Pascal. The programmer has to ask the user for input, must know the formula to convert that input into what the user wants, and must check whether the input is valid (and the answer too), or vice versa. So, in summary, the programmer must know what to do if the answer is "yes" and what to do if the answer is "no".
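
For instance, even a tiny Celsius-to-Fahrenheit converter (sketched here in C rather than BASIC or Pascal, purely as an illustration) already has to handle both the "yes" and the "no" path:

Code:
#include <stdio.h>

/* Convert Celsius to Fahrenheit, checking that the input is a number.
   The formula the programmer must know: F = C * 9/5 + 32. */
int main(void)
{
    double celsius;

    printf("Enter a temperature in Celsius: ");
    if (scanf("%lf", &celsius) != 1) {
        /* the "no" branch: the input was not a valid number */
        printf("That is not a number.\n");
        return 1;
    }
    /* the "yes" branch: valid input, apply the formula */
    printf("%.1f C = %.1f F\n", celsius, celsius * 9.0 / 5.0 + 32.0);
    return 0;
}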

A long time ago, this small program, running on a 286 computer, could take about 7 seconds from the time you hit "run" until you received the answer (let's say it like that).
But with current technology, this small program can run in 0.0001 seconds.

So, try to imagine an operating system such as Microsoft's; let's take Windows 3.1. There are hundreds of files, millions of possibilities, millions of yeses and nos.
On a 286, loading the OS could take about 2 minutes.

As hardware advanced, the 386 and 486 appeared with higher CPU clocks, with the result that loading the OS took less time. People were happy, while Microsoft kept advancing with its programmers, creating Windows 95 with expanded capabilities and facilities. And so on, and so on, until now.

Intel found that advancing the hardware (processors) was good for its business, with the result that everything can run "faster", or more efficiently, than before.
Meanwhile, people demand more and more facilities and software, which Microsoft must deliver if "the world" is to keep using its product, and it is not an easy task (bugs, patches and updates exist all the time) to get perfect software.

Well, that's my opinion. I hope it doesn't make you more confused. If there's something wrong, please correct me.
June 12, 2007 3:15:41 PM

Quote:
... <--ellipse, and 0 <-elipse. relationship?



Please edit to

... = ellipsis, 0 = ellipse

Thank-you.

Q
June 12, 2007 3:35:09 PM

Quote:
Reading this thread almost makes me cry. Simple misinformed guessing and ranting.

64-bit belongs to the server space. Thanks to massive advertisement, everyone believes it's the real thing. While 64-bit is indeed an improvement over 32-bit, the advantage for everyday computing isn't that big.
Microsoft caters to both home and professional users, and in the professional space 64-bit is a requirement. Servers need a lot of memory; they need every microsecond of speed that can be squeezed out of a CPU, and everyone who has had to program a simple multiplication in 16-bit assembler code will understand that concept.
AMD advertised their 64-bit advantage over Intel's failing Itanium concept, and every village idiot sucked it up like Kate Moss does cocaine.
Now, why is 64-bit showing up on the desktop? Well, since Microsoft killed the Win98/WinMe line of its OSes, the only thing left was the heirs of NT, which is, as the really bright ones might have guessed (it's okay if you checked Wikipedia), their workstation/server OS. And now we get 64-bit in software for the same reason we got it in hardware a gazillion years ago: no, not because AMD and Intel wanted everyone to be able to put their phone number into a single register, but because it's a byproduct of the server space.

64-bit wasn't developed because games run faster on it, or because 64-bit makes your e-penis twice as long as 32-bit.


You're also a bit misinformed about why MS is moving to 64-bit and why they have a single kernel now for both home and professional users.

Microsoft got rid of the 9x kernel for one very good reason: development. Having to write things that are compatible with two different kernels is a royal PITA. Ever since MS made Win95 and WinNT 3.5, they've been moving towards a merged kernel; XP was the finalization of that merge, as each OS moved closer and closer to the other. Microsoft is trying to move everyone to one platform to save money for everyone, not just themselves.
June 12, 2007 3:39:48 PM

It's a simple answer as to why software is lagging so far behind the current hardware.

Hardware is cheap now, which has caused a large number of software writers to become exceedingly lazy in their work.

Gone are the days when you had very few hardware resources to work with and were forced to write small, precise, and therefore (usually) very quickly executing code.
June 12, 2007 4:02:54 PM

Quote:
Try Linux...


And use it for what? Browsing and music/movies? In that case, what's the point? And does it make you feel better that you use the hardware at its maximum potential but are only able to do basic things?
June 12, 2007 4:09:50 PM

Quote:
M$ is going there because their next OS will be crippled without it (speculation).


I don't think that's speculation... I'm pretty sure Microsoft is releasing one more OS as a server platform that is 32-bit... after that, I think they've announced that all future operating systems will be 64-bit.
June 12, 2007 4:24:08 PM

OK, I built an x64 platform at the very end of 2006. So far I've had XP x86 (for about a week), then I went to XP x64 Corporate (for about 2 months), and now I'm on the latest Ubuntu x64 version... I'm still learning Linux, my first time, but I love it... this is honestly a great OS... and my downloads are all faster, and my 64-bit Firefox browser blows my expectations out of the water... literally, I click on a page and my connection is actually slower than the browser (I have a 3 Mb/s cable connection with a decent ping), so that's amazing! And as far as support goes, there's more than a good handful of apps that you can get for Ubuntu x64 without searching for hours on the Internet... I love Ubuntu, maybe it's just an opinion... but it's not really... because it really is smoother and faster, and doesn't eat 1 GB RAM sticks for breakfast... haha.

Anyway, point being, I think that if you want real x64 support right now, you're going to have to get the BUN (Ubuntu). It really solved a ton of problems over Windows x86 and Windows x64... and like I said, it requires less system than Shista...
So really... just go get Ubuntu. Nothing is pure x64 yet... but wait till Dell starts offering it as an OS... a MAJOR MAINSTREAM RETAILER... you'll get your x64 support soon after.
June 12, 2007 4:35:00 PM

I just wanted to speak up for the poor old code jockeys, of which I used to be one.

I would have been among the first generation to have the ability, in the compilers, to write multithreaded code, and it was not that easy. Although the actual code structure was OK for smaller apps, it could get way out of hand in larger stuff that ran to millions of lines. Also, in those days, if you ran it on a single-core processor the apps could be really slow, but maybe they've fixed that now. Basically, without going into details, there are loads more effects to consider when coding like this; it means you have to be much more aware of the system as a whole when writing.

On the flip side, we want to write decent, optimised code, but the boss always says you've got to get it done in 2 days...
June 12, 2007 4:42:28 PM

That's why Ubuntu is great... open source means that half the people working on it have no deadlines...
June 12, 2007 5:15:30 PM

Hey

I'll tell you why I'd switch

116,000+ 16/32-bit viruses have no effect on an x64 system.

Or use a Mac or Linux, easier.

The only reason I'm on XP most of the time is because I've got a shite wireless card that doesn't have Linux drivers, and because it's USB it doesn't work with ndiswrapper.

:( 

VIRUS RESET :D 
June 12, 2007 5:31:15 PM

The main thing is: software is not efficient. There's a lot of code bloat. I think programs that are 80 MB compiled with all their binaries could easily be reduced to ~30 MB, if only the programmers had the required skill to actually find ways not only to 'make it work', but to use less RAM too.

Maybe it's only a hunch, but I think European companies produce better programs, since they have demoscene competitions. You people tell me what you think.
June 12, 2007 6:20:17 PM

Quote:
Why not force your "partners" to make 64-bit games, drivers, and applications available? I don't know anything, but I bet that someone will tell me why...


If I'm not mistaken, didn't Microsoft start requiring hardware manufacturers to produce 64-bit drivers as well as 32-bit drivers if they want them digitally signed?

That's about as much as MS can do to force 64-bit. *Shrugs*
June 12, 2007 7:07:56 PM

only addressing threading/multicore

Quote:
It seems to me that quad-core/single-die CPUs would not bring any significant increases to the table if you still are running an OS that doesn't even utilize 2 cores fully, let alone 4.


The quote above seems to place multiple-core support on the OS, when every current OS is capable of supporting as many cores as you can throw at it (yes, M$ limits cores/CPUs based on the license you purchased: XP Home does not support 2 sockets and XP Pro does not support 4, but this is an intentional cripple). Open up Task Manager and look at how many threads are running vs. the number of processes (hint: only one process in my list of 45 running is single-threaded). Most software developed these days uses multiple threads for various tasks; they just don't consume many resources.

Almost all professional applications (graphics/imaging, CAD, CFD, rendering, encoding, compiling) have used multiple threads for many years, but the hardware to support it was not common in the mainstream. When multicore was first introduced I thought, wow, the mainstream is finally catching up; I had been there for years on a dual-proc workstation that I used for CAD and CFD, which could peg both CPUs at 100% for 12+ hours at a time. In this sense the hardware is what was lagging behind the software.

I'm not a gamer, but I think most of the complaints about "can't use that many cores" come from the gaming segment, which is a small drop in the bucket compared to the money spent on hardware for professional use, where all the resources are put to use regularly.

It will take time for the game developers to catch up to where the pro apps have been for years.
June 12, 2007 7:34:47 PM

Wow... didn't expect this kind of response.
I think that somewhere, the point was lost. I was trying to say that we were all marketed and sold this idea of "64-bit" computing being the next big thing over 6 years ago or more. Now the hardware is here, but there's no support for it. Kind of like the multi-core issue: 4 cores aren't even utilized by any software that I can think of. Yeah, it's kind of a gaming issue, but at the same time, if the hardware manufacturers are saying that it's the next big thing, wouldn't that make them full of shit again?
Maybe I'm just a simpleton, but doesn't word length have anything to do with it? I mean, it seems to me (now, I'm just an electronics tech, not a computer tech) that if the software can only move 32 bits across the registers at a time, the output would also be limited to 32 bits, thereby negating the fact that you have a 64-bit processor. Is it that the processor breaks up the instructions, or can it force 2 instructions at a time across at 64 bits? I'm under the assumption (I know what they say about assuming...) that if you are running a 32-bit app on a 64-bit processor, you're only using half the registers and the processor is crippled. Wouldn't it be more efficient (not to mention increase processing speed) to use the entire 64 bits? How does this work?
June 12, 2007 8:09:52 PM

I make no claims of knowledge about how 64-bit CPUs execute 32-bit code, but I think I have read before that at least a portion of the core goes unused. I believe it does have to do with the registers, but I am not certain.

I agree with your point that 64-bit has been marketed for a very long time; many users here have retired several rigs that were 64-bit capable without ever running 64-bit software on them. Like another poster said, the 16-to-32-bit transition took 10+ years.

As for threading, I was not trying to say you were wrong, just pointing out that the OS (M$, Linux, etc.) is ready for multiple threads, multiple cores, multiple CPUs. That is the primary job of the OS: managing resources and processes, not providing all the bloat that it does today. I was also pointing out that many applications today are already taking advantage of the latest hardware, games being the major exception.
June 12, 2007 8:23:36 PM

Have you by any chance considered switching to decaf? Seriously, this level of hate about something so "trivial" isn't healthy...

Quote:

Postscript: I do understand the economics of this, with software makers wanting their products to be available to the largest number of potential users. What troubles me is that no one seems to even care about the potential that's there for 64-bit computing when EVERYONE who has bought a computer in the last 5 years has a 64-bit-capable rig, yet no one (read: software companies) wants to make the jump. This is SAD... :cry:


There's your problem. If you have a 64-bit CPU and a 64-bit game and a 32-bit OS, then the best you can do is 32-bit operation. I'm glad MS didn't force 64-bit down our throats, as it is only just now ready for prime time. As has already been mentioned, software people aren't going to spend $25+ million on a game that only 5% of the newest computers can run. Look at Ageia: why would someone make a game that requires that card if so few people have bought it? The hardware companies will continue to develop better and better chips, and the software companies will continue to develop software that takes economical advantage of those chips.

BTW, if the software can use 2 cores/CPUs, then it should have no problem with 4. Once you are multithread-aware, you should be able to use however many "thread processors" are available. I do know some games can use 2 cores but don't show improvement with four; I am assuming that for some reason they did something so that the game will only use 2 cores.
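
Here is a minimal sketch of what "use however many thread processors are available" can look like in practice; it's my own illustration using POSIX threads and the Linux/glibc sysconf(_SC_NPROCESSORS_ONLN) call, not code from any particular game or engine. The same binary spawns 2 workers on a dual-core box and 4 on a quad-core, with no code change.

Code:
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

/* In a real program each worker would pull chunks of work from a queue;
   here it just reports which worker it is. */
static void *worker(void *arg)
{
    long id = (long)(intptr_t)arg;
    printf("worker %ld running\n", id);
    return NULL;
}

int main(void)
{
    /* Ask the OS how many cores are online and start one worker per core. */
    long n = sysconf(_SC_NPROCESSORS_ONLN);
    if (n < 1)
        n = 1;

    pthread_t *threads = malloc((size_t)n * sizeof(*threads));
    if (threads == NULL)
        return 1;

    for (long i = 0; i < n; i++)
        pthread_create(&threads[i], NULL, worker, (void *)(intptr_t)i);
    for (long i = 0; i < n; i++)
        pthread_join(threads[i], NULL);

    printf("ran %ld workers on %ld cores\n", n, n);
    free(threads);
    return 0;
}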
June 12, 2007 9:42:14 PM

IBM's main OS, OS/400, has been 64-bit for many years, and I believe that it is currently on 128 or 192 bits. :roll:

Fact: Windows is shit
Proof: on the upgrade from 48 to 64 bits, all programs and necessary data were converted :!: from 48 to 64, and likewise for all upgrades. Impressive.

However, much custom-written software (as stated by CodeJunkie) already maxes out 32-bit hardware easily, and some of it benefits from the increased word size (bit-ness, if you like).
June 12, 2007 11:58:31 PM

Actually, I don't drink caffeine. It's just a pet peeve of mine in the computer arena. One of only a few... but this is one of my main ones.
You said it best, BUT most people have 64-bit CPUs. If you've bought a new computer in the last 2-3 years, you have a 64-bit rig. Considering that the average user tanks a computer and buys a new one every 4-5 years, most people should have the hardware necessary to run 64-bit software. Hell, I bet most computers have had 64-bit CPUs for more than 1-2 years now.
I make that 4-5 years comment because most of the computers that I work on are so clogged up with crap after 4-5 years, and technology is increasing the performance-per-dollar ratio exponentially by the year, that my "average" users are switching to faster rigs just because they can buy a Gateway or eMachines for $300-400 (X2 3800+, 512 MB, 80 GB HD) and have a pretty fast computer for what they do, when it would cost them that much or more just to have their old computer upgraded/cleaned out.
With entry-level computers becoming more of a commodity, computer techs are having a hell of a time (if they're honest).
EDIT: To retort to your statement that it's trivial, I argue that it's not. It's deceptive. No one has ever answered my question about word length: does software magically spread 2 words across the bit path to utilize all 64 bits that a processor has, or are we only working with 32 bits on a 64-bit chip? Like I said before, I'm not an engineer, but it seems as though if you can double your word length, you should be able to double your processing output.
June 13, 2007 1:00:52 AM

Quote:

EDIT: To retort to your statement that it's trivial, I argue that it's not. It's deceptive. No one has ever answered my question about word length: does software magically spread 2 words across the bit path to utilize all 64 bits that a processor has, or are we only working with 32 bits on a 64-bit chip? Like I said before, I'm not an engineer, but it seems as though if you can double your word length, you should be able to double your processing output.


Yes, most of us are working with 32 bits on a 64-bit chip. The thing is, moving to 64-bit Windows doesn't make things faster. Indeed, I haven't seen anything that suggests x86-64 is faster than x86-32. What I have seen are driver issues when moving to x86-64. I hear Vista 64 is better than XP 64, but I don't run either to test.

Yes, most rigs are able to run 64-bit code. Most rigs, however, don't even begin to use the power that they are capable of. Increasing clock speed or adding extra cores to the chip will increase performance. The biggest benefit so far to x86-64 is the extra registers and the ability to address more memory. And even then, it will be a while before most people need 4 GB+.
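
For anyone still wondering about the word-length question above, here's a small sketch of my own (assuming a Linux/GCC setup where the same file can be built with -m32 or -m64). The source doesn't change, but the integer and pointer widths do; a 32-bit binary running on a 64-bit chip simply never touches the wider registers or the eight extra general-purpose registers that x86-64 adds, which is why it isn't magically twice as fast.

Code:
#include <stdio.h>

/* Build twice and compare the output:
     gcc -m32 widths.c -o widths32   (ILP32: int/long/pointer = 4/4/4 bytes)
     gcc -m64 widths.c -o widths64   (LP64:  int/long/pointer = 4/8/8 bytes)
   The 32-bit binary is limited to a 4 GiB address space and the narrower
   register file, even on a 64-bit CPU. */
int main(void)
{
    printf("sizeof(int)    = %zu\n", sizeof(int));
    printf("sizeof(long)   = %zu\n", sizeof(long));
    printf("sizeof(void *) = %zu\n", sizeof(void *));
    return 0;
}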