Software vs. Hardware

I don't know if the CPU forum is the right place for this thread.

I'm interested in your opinions about today's software. In my opinion, with the dramatic development of CPU architectures and hardware in general, programmers are becoming lazy, or perhaps unprepared, when it comes to optimizing their code to take full advantage of the processor power.

What's the catch?
  1. Sure, and it's always hard to find full optimization in the software you use; software companies can't catch up that fast with new architectures and instruction sets, since that would cut out all the previous processors that lack them.
    Take, for example, the 3D modelling tool I use, Blender: the official release is only optimized for MMX, and it's almost two times slower than the SSE3/64-bit optimized build.
  2. I personally think they are lazy about updating their code for newer hardware. That's for sure.

    But at the same time, I know it's much harder to do than to say, "OK, let's do it!" They all have to keep the software running on older hardware, so simply updating it is not nearly enough.

    What I'd personally like to see more of is applications that detect your CPU and install the code optimized for it. All that's required is a different install procedure, but in the end you get code optimized for your particular CPU. It could be as simple as detecting which instruction sets your CPU supports and installing the corresponding code. Say you have a P3 at 1.0 GHz; it would only install the MMX and SSE code. A smaller, more efficient install would result. (A rough sketch of the detection part appears at the end of this post.)

    From there, they could build modular applications in which hardware support could be added much more easily over time. The core of the application could be updated for bigger changes, such as going from 32-bit to 64-bit Windows and CPUs. It would be much easier to stay on top of things.

    I know part of this is already done. Let me know where I'm wrong. I'm no software engineer, so I might be totally off. Still, wishful thinking never hurt... I hope... :)
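
    Here's a minimal sketch of just the detection part, in C++ for Windows/MSVC using the __cpuid intrinsic from <intrin.h> (the feature bits come from CPUID leaf 1; an installer could use the result to decide which binaries to copy):

        #include <intrin.h>
        #include <cstdio>

        int main() {
            int regs[4];               // EAX, EBX, ECX, EDX after CPUID
            __cpuid(regs, 1);          // leaf 1: processor feature flags

            bool mmx  = (regs[3] >> 23) & 1;   // EDX bit 23
            bool sse  = (regs[3] >> 25) & 1;   // EDX bit 25
            bool sse2 = (regs[3] >> 26) & 1;   // EDX bit 26
            bool sse3 = (regs[2] >>  0) & 1;   // ECX bit 0

            // A P3 would report only MMX and SSE, so an installer could
            // skip the SSE2/SSE3 code paths entirely.
            std::printf("MMX:%d SSE:%d SSE2:%d SSE3:%d\n", mmx, sse, sse2, sse3);
            return 0;
        }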
  3. What you describe is a very good idea. I don't have anything at hand showing that it really works. Can you pass me a link or an article where this topic is discussed, along with programs of that kind?
  4. You can do that with open-source software. You download the source and compile it on the machine you are going to use it on; then it's optimized for that architecture.
    That's the premise behind Gentoo Linux: rather than offering prebuilt binary packages like many other distributions, you install everything from source. It's a bit more work, but it offers the speediest environment you can have.
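
    As a rough illustration (the file name and flags here are just examples), the underlying trick is passing architecture-specific options to the compiler, e.g. building a C++ source file tuned for a Pentium 3:

        g++ -O2 -march=pentium3 main.cpp -o app

    Gentoo's Portage essentially applies your chosen compiler flags to every package it builds from source.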
  5. Quote:
    I don't know if the CPU forum is the right place for this thread.

    I'm interested in your opinions about today's software. In my opinion, with the dramatic development of CPU architectures and hardware in general, programmers are becoming lazy, or perhaps unprepared, when it comes to optimizing their code to take full advantage of the processor power.

    What's the catch?


    What's the question? :?
  6. Quote:
    I don't know if the CPU forum is the right place for this thread.

    I'm interested in your opinions about today's software. In my opinion, with the dramatic development of CPU architectures and hardware in general, programmers are becoming lazy, or perhaps unprepared, when it comes to optimizing their code to take full advantage of the processor power.

    What's the catch?


    What's the question? :?

    Maybe it'll sound dumb, but why invest in state-of-the-art hardware when the only benefit is for things like encoding? As far as I know, there are very few applications that take advantage of a modern CPU. Correct me if I'm wrong.
  7. Quote:
    It could be as simple as detecting which instruction sets your CPU supports and installing the corresponding code. Say you have a P3 at 1.0 GHz; it would only install the MMX and SSE code. A smaller, more efficient install would result.

    It actually happens; 3DS MAX gives you the option to activate SSE optimizations while rendering. This is one of the happier cases, because in many others it is much easier said than done. Optimization affects almost all of the code, so it would be like building four or five versions of the same app (MMX-optimized, SSE, SSE2, SSE3, x64, etc.) in an industry where time is money. Think about it: you'd rather optimize your app for the minimum CPU it targets and up, say SSE, and then think about the next version instead of starting another optimized build. Even though rendering is very important, I'd still use Blender even if there were no optimized version, because I like the overall feel. (I currently use the SSE3/x64-optimized build.)
  8. Sup cuz.
    I'd say a fair amount of games and applications take advantage of the capabilities of a modern CPU.
    Someone correct me if I'm wrong.
    DaSick.
  9. As m25 has said:
    Quote:
    you'd rather optimize your app for the minimum CPU it targets and up, say SSE, and then think about the next version instead of starting another optimized build


    The majority of software has to be written for the mainstream market. I believe that the majority of computers sold are still single-core Celerons and Semprons with integrated graphics. I would be surprised if dual-core market penetration is over 30%.

    Publishers are not going to invest time & money into developing for systems the market doesn't have, so there probably won't be many programs (aside from high-end games and professional apps) that will take full advantage of your CPU or GPU.

    Programs may start coming out with specific optimizations for multicore chips, but as far as being built specifically for multicore goes, that is probably a couple of years down the road. (At least for the mainstream.)
  10. Which is why there are more PC games than Mac games, and also on account of the PowerPC architecture, right?
  11. Quote:
    Which is why there are more PC games than Mac games, and also on account of the PowerPC architecture, right?


    Right on the first count - Why spend the resources on a platform with little market share.

    Right on the second count - Why spend the resources learning a new architecture for a platform with no market share.

    To be fair to Apple - their sales are increasing. Now developers need to find out whether people are going to be using OS X on them, or whether people are buying Apples for the "cool" factor and primarily running Windows on them.

    I can see the latter. I think Apple may eventually become a boutique, upscale Windows PC vendor.

    Who knows? - Apple may release OS X (or whatever version) to run on any x86 PC. They could become the next Microsoft.
  12. I agree with you.

    So, do you mean that the CPU industry will pause for a while after introducing quad cores, bearing in mind that dual cores already come at a very affordable price?

    If software stays at this level, would that mean less power dissipation, since programs aren't making full use of dual or quad cores?

    Can this multicore evolution be the answer to the heat problem over the next couple of years?

    Correct me!
  13. Quote:
    I agree with you.

    So, do you mean that the CPU industry will pause for a while after introducing quad cores, bearing in mind that dual cores already come at a very affordable price?

    It may well slow down. More cores mean more die space, which means a smaller fabrication process. Intel is currently at 65nm, and at this size four cores are probably all that will reasonably fit on a chip. Intel also uses a separate memory controller, so I think if they go with more than four cores, it may saturate the FSB and negate the advantage of the extra cores.
    When Intel moves to a 45nm process, they may implement an integrated memory controller (IMC). With an IMC and a 45nm process, you may see eight cores on a chip (probably around 2008). As they move to 32nm, they may have space for even more cores.

    Four- or eight-core chips will probably be for the novelty or enthusiast markets for quite some time. Dual core will probably be the standard for years. (In the desktop market, that is; in the server market, the more cores, the merrier.)


    Quote:
    If software stays at this level, would that mean less power dissipation, since programs aren't making full use of dual or quad cores?

    Not really, if Windows comes up to speed on multicore. Ideally Windows will have better support and will run all its background threads more efficiently across the cores.

    Quote:
    Can this multicore evolution be the answer to the heat problem over the next couple of years?

    Correct me!

    Intel's new line of chips is much more efficient in both power and heat than the previous generation, and AMD has its EE line of lower-power chips. Energy efficiency is becoming a priority in the basic design.
    If multithreading is used correctly, programs will be able to run on a slower (cooler) chip with more cores instead of a faster chip with fewer cores.
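
    As a back-of-the-envelope sketch of what "used correctly" means (Windows C++ with CreateThread; the Work function and data are hypothetical examples), splitting a job across two cores looks like this:

        #include <windows.h>

        // Hypothetical worker: each thread processes one slice of an array.
        struct Slice { float* data; int begin; int end; };

        DWORD WINAPI Work(LPVOID p) {
            Slice* s = (Slice*)p;
            for (int i = s->begin; i < s->end; ++i)
                s->data[i] *= 2.0f;            // stand-in for real work
            return 0;
        }

        int main() {
            float data[1000] = {0};
            Slice a = { data,   0,  500 };
            Slice b = { data, 500, 1000 };

            // Two slower cores working in parallel can match one faster
            // core on this job, at lower clocks and less heat.
            HANDLE t[2];
            t[0] = CreateThread(0, 0, Work, &a, 0, 0);
            t[1] = CreateThread(0, 0, Work, &b, 0, 0);
            WaitForMultipleObjects(2, t, TRUE, INFINITE);

            CloseHandle(t[0]);
            CloseHandle(t[1]);
            return 0;
        }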
  14. The problem is that we buy new hardware and then load it with all our old, unoptimized software. Some of today's new software applications are tuned for today's hardware.
  15. I think, as stated in many other threads and reviews, that optimized software will show up about a year after Vista is officially released. Maybe that will speed things up in the software industry.
  16. Quote:
    The problem is that we buy new hardware and then load it with all our old, unoptimized software


    True.
  17. I agree. That's why I recommend that one should build/buy a computer based upon the tasks it will be expected to perform.
  18. Quote:
    I think, as stated in many other threads and reviews, that optimized software will show up about a year after Vista is officially released. Maybe that will speed things up in the software industry.


    I hope so.

    I think you may see some development before then. People who have multicore now spent more money on it than those who bought a Celeron or Sempron (or didn't buy at all and still have an old single-core system).

    I think developers will see that dualcore users will spend the money, and therefore, the developers will start optimizing their software for it. They will, however, have to keep it "dumbed down" for the legacy machines.

    So, in terms of "optimized for dual core" software - I think we'll see that soon.
    In terms of "made for dual core" software - I think it will be a while.
  19. Nice point.

    But how do you stay resistant to such aggressive marketing, where cores, FSBs, HTT, HT, DDR2/3/4... are around every corner? :)
  20. Quote:
    Nice point.

    But how do you stay resistant to such aggressive marketing, where cores, FSBs, HTT, HT, DDR2/3/4... are around every corner? :)


    :lol:
    Well, the nice (and most frustrating) thing about technology is that there's always something better coming. :wink:
  21. Quote:
    Nice point.

    But how do you stay resistant to such aggressive marketing, where cores, FSBs, HTT, HT, DDR2/3/4... are around every corner? :)


    :lol:
    Well, the nice (and most frustrating) thing about technology is that there's always something better coming. :wink:

    I know that you, I, and others who are serious about this know what we want, but we are 10% of the market, tops, I think.

    Others are just spending cash, but what the heck, they don't go to TGF! :wink:
  22. Like how, when I finally got an X6800, it gets outclassed and made obsolete by the new Kentsfield core. And within three or four months my new card will be outdated? :cry:
    The only way to stay on the cutting edge is to stay ahead of the cutting edge. Or overclock.
    And that is expensive (and troublesome) as all hell.
  23. Quote:
    Like how, when I finally got an X6800, it gets outclassed and made obsolete by the new Kentsfield core. And within three or four months my new card will be outdated? :cry:
    The only way to stay on the cutting edge is to stay ahead of the cutting edge. Or overclock.
    And that is expensive (and troublesome) as all hell.


    Yeah, when Kentsfield comes out your whole system will be out of date.

    I'll tell you what I'll do:
    I'll trade you two Commodore 64 systems for your current one.
    This way you'll have 2 out of date systems, and it will only cost you one. Therefore, you'll double the value. 8O


    :P
  24. I resist the urge to strangle you. Somewhat. :evil:
    At least the motherboard will support the C2Q Kentsfield.
  25. It seems as though, for the most part, if an application is popular enough and shows signs of standing strong in the market, then it will most likely be optimized for whatever architecture is popular at the moment.
  26. Actually, you can do optimizations based on the CPU in use; it's just more work.

    One way of doing it is to detect the CPU at app startup and lazy-load DLLs as needed using LoadLibrary. Of course, this is the Windows way. I'm pretty sure you could do this on Linux, etc. as well, but I'm unfamiliar with coding for those platforms.

    You can make a wrapper class that loads on demand, for example. It's really quite simple to do; I just write a program to generate the header for me.

    The class contains a function pointer set to null in the constructor.
    When you call the class's function, it checks whether the pointer is null, loads the DLL if required, and then calls the function through the pointer.

    So: detect the CPU, and when a function is first required (i.e. the pointer is still null), check the CPU's capabilities and load the appropriate DLL file.
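
    Here's a minimal sketch of that wrapper in C++ (the DLL names and the "Transform" function are hypothetical examples; real code would add error handling for a failed LoadLibrary/GetProcAddress):

        #include <windows.h>

        typedef void (*TransformFn)(float* data, int count);

        class LazyTransform {
            TransformFn fn;                     // set to null in the ctor
        public:
            LazyTransform() : fn(0) {}

            void operator()(float* data, int count) {
                if (!fn) {                      // first call: load on demand
                    HMODULE mod = LoadLibraryA(PickDll());
                    fn = (TransformFn)GetProcAddress(mod, "Transform");
                }
                fn(data, count);                // call through the pointer
            }

        private:
            static const char* PickDll() {
                // Placeholder: real code would check CPUID feature bits
                // here and fall back to "transform_mmx.dll" on older CPUs.
                return "transform_sse3.dll";
            }
        };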