DuckTape

Distinguished
Jan 16, 2002
Hello folks.

Does anyone know anything about denormalization (denormalisation) and the "CPU spikes" that can occur when running audio applications on a computer, apparently observed especially on Pentium 4 systems?

Does anyone know how any resulting problems can be avoided?

I saw these two links regarding the subject posted by someone at the forums at audioforums.com...

http://phonophunk.phreakin.com/p4denormal.html
http://www.digitalfishphones.com/main.php?item=2&subItem=6

Thanks much!
DuckTape
 

slvr_phoenix

Splendid
Dec 31, 2007
<A HREF="http://phonophunk.phreakin.com/p4denormal.html" target="_new">Clickable 1</A>
<A HREF="http://www.digitalfishphones.com/main.php?item=2&subItem=6" target="_new">Clickable 2</A>

Does anyone know anything about denormalization (denormalisation) and the "CPU spikes" that can occur when running audio applications on a computer, apparently observed especially on Pentium 4 systems?

Does anyone know how any resulting problems can be avoided?
I'd never actually heard of it before, but after reading just the first link it all sounded pretty straightforward. So I went to Google to dig up more on it from a programmer's perspective and came up with <A HREF="http://www.cs.berkeley.edu/~aiken/cs264/lectures/kahan.ps" target="_new">this</A>. It's a .ps file, though, so you need a reader for it.

Anywho, it sounds like a simple case of bad programming. The x87 FPU uses the highest accuracy no matter what accuracy is actually needed, which is nice if you ask me. That much I knew. The problem is that the algorithms being used must be causing a lot of underflows even at the highest accuracy level. All of those underflows send the exception handlers haywire as they try to keep up, and WHAM, 100% CPU usage as the algorithm just keeps plugging away, generating exceptions from underflows.
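Just to make the mechanism concrete, here's a minimal sketch (hypothetical, not taken from any of the audio packages being discussed) of how a denormal shows up in audio code: a feedback coefficient keeps multiplying a decaying value once the input goes silent, and after a few thousand samples the value lands in the denormal range, where every operation hits the FPU's slow path.

<pre>
// Minimal illustration: a one-pole feedback value decaying toward zero
// during digital silence.
#include <cmath>
#include <cstdio>

int main() {
    float state = 1.0f;         // last non-zero sample before the silence
    const float coeff = 0.99f;  // feedback coefficient < 1.0

    for (int n = 0; n < 20000; ++n) {
        state *= coeff;         // input is pure digital silence (0.0)
        if (std::fpclassify(state) == FP_SUBNORMAL) {
            std::printf("state went denormal at sample %d: %g\n", n, state);
            break;
        }
    }
    return 0;
}
</pre>

At a 44.1kHz sample rate that's well under a fifth of a second of silence before the FPU is stuck crunching denormals on every sample.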

The questions are, why are these algorithms:
1) <i>NOT</i> checking for an underflow themselves?
2) Forcing the FPU to calculate that kind of accuracy in the first place when the algorithms don't even <i>need</i> that level of accuracy?

The answer to both of those questions is simple: BAD PROGRAMMING. And this is coming from me, an x86 computer programmer.

Personally, I haven't seen it happen in any of my code. But if it did, I'd certainly notice it while testing, and I'd certainly have to ask myself WTF I was doing forcing that poor FPU to carry that kind of precision on near-zero values I don't even need. I'd certainly have to smack myself for not including my own exception handling to keep the algorithm from happily looping once I've hit that wall of impossible accuracy. So in other words, it's simply bad software, through and through.
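For what it's worth, the kind of self-check I'm talking about is tiny. A hedged sketch, assuming a simple per-sample threshold (the constant and function name are illustrative, not from any library):

<pre>
// Manual denormal guard: if the value drops below a tiny (but still normal)
// threshold, snap it to exactly zero so the FPU never touches denormals.
inline float undenormalize(float x) {
    const float kTiny = 1.0e-20f;   // far smaller than any audible signal,
                                    // far larger than the denormal range
    return (x > -kTiny && x < kTiny) ? 0.0f : x;
}

// Typical use inside a feedback loop:
//   state = undenormalize(state * coeff + input);
</pre>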

And I certainly can't say that it's undocumented when I can find documents pertaining to this situation dating back to as early as 1984. So there's really no excuse other than laziness as to why any code would be causing this problem in the first place.

That aside, it seems to be an inherent situation (<i>you can't really call it a flaw, because the hardware handles it excellently; it's the software that is lacking in this case</i>) in <i>ALL</i> x86 and 680x0 processors with an x87-style FPU. That's a heck of a lot of CPUs.
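(Side note, and an assumption about the code paths involved: if the audio code is compiled to use SSE arithmetic instead of the legacy x87 FPU, the MXCSR register has a flush-to-zero bit that makes the hardware replace denormal results with zero. It doesn't help plain x87 code, and it slightly changes results near zero, but for audio that's usually exactly what you want.)

<pre>
#include <xmmintrin.h>   // SSE intrinsics, including the MXCSR helpers

// Only affects SSE arithmetic, not the legacy x87 FPU discussed above.
void enable_flush_to_zero() {
    _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_ON);
}
</pre>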

In the case of the audio software, the problem sounds incredibly easy to avoid. If the audio software is prone to this denormalization problem:
1) Report the bug to the software vendor.
2) Simply don't let that audio software render effects on a perfectly (or nearly perfectly) silent feed. Add just a smidgen of low-level background noise to give the algorithms something calculably above zero to crunch that is still below the threshold of hearing (a sketch follows below).
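A sketch of that second workaround, assuming a simple constant offset added per buffer (the value and function name are illustrative, not from any particular plugin):

<pre>
// Add an inaudible DC offset so feedback paths never decay into denormals.
// 1e-18 is roughly -360 dBFS, far below the threshold of hearing, yet still
// a perfectly normal float.
void add_anti_denormal_offset(float* buffer, int numSamples) {
    const float kOffset = 1.0e-18f;
    for (int i = 0; i < numSamples; ++i)
        buffer[i] += kOffset;
}
</pre>

Real implementations often flip the sign of the offset every buffer, or use a touch of noise instead of a constant, so nothing accumulates as DC.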

<font color=blue><pre>If you don't give me accurate and complete system specs
then I can't give you an accurate and complete answer.</pre></font>
 

Mephistopheles

Distinguished
Feb 10, 2003
The way I see it, slvr_phoenix is right... It sounds like an inherent problem, but it's software in nature. One more thing, though... It sounds bad for the P4, as if the Athlon didn't suffer from the same thing, and that would quite simply be ridiculous. The x86 architecture on both of them doesn't yield a different result!...

And it goes without saying: the x86 architecture, like all others, has its implications. The guys who programmed this software obviously didn't care about those problems - they could, and should, have done so - and then, after releasing their products, they blamed Intel... which is a lot of arrogance from people who don't mind writing bad code. So get this in your mind: <b>this is <i>bad programming</i></b>, and the only ones to blame are the programmers themselves, because they should know what the hell they're doing. Floating-point systems have inherent limitations, and <i>all</i> competent programmers should write code with that in mind! And don't go flaming someone else if you're bad at programming! Neither Intel nor AMD has anything at all to do with it!
 

Hoolio

Distinguished
Jun 26, 2002
I would like to note something: since memory prices fell, many software companies allow their programmers to produce somewhat bloated code. If software programmers were told to make their code concise, then we would not require 64 MB of RAM just to run Windows!

No offence intended; I too am somewhat of a programmer, being an electronics engineer (in training).
 

Mephistopheles

Distinguished
Feb 10, 2003
I actually think you're right there about people writing bloated programs, but bloated code is one thing: it <i>doesn't</i> return unwanted results, it just returns them in a way that requires more resources. Plain bad code, which returns unwanted results, is another thing entirely. (Actually, one is deliberate and the other is not - the other is just the result of incompetence!)
 

Hoolio

Distinguished
Jun 26, 2002
Maybe the hardware and software industries are in it together, making sure we keep requiring faster hardware so we keep buying?


When I program, or make a mod for a game such that the PC may have to be faster than the game's requirements to run the mod, I go back and see if I can make it more efficient. I am sure we would not need 3 GHz processors to run games if they could be programmed better. However, it is my opinion that anyone who buys a 3 GHz processor just to run Quake 3 or Unreal T 2 is a fool. Why do you need 300 or so FPS? 100 is fine for me and my 1 GHz machine.

:)
 

slvr_phoenix

Splendid
Dec 31, 2007
I would like to note something: since memory prices fell, many software companies allow their programmers to produce somewhat bloated code. If software programmers were told to make their code concise, then we would not require 64 MB of RAM just to run Windows!
I couldn't agree both less and more. (Make sense of that one.)

It's true that <i>many</i> software engineers write code without considerations of optimization. Microsoft especially seems to excel at this. Indeed, many programmers should do more to optimize their code.

At the same time, optimizing for a low memory use is <i>not</i> the only type of optimization that can be done, and many would argue that it is the <i>least</i> important <i>because</i> of:
1) The common availability of RAM in a system.
2) Optimizing for absolute minimum memory usage often results in bad object-oriented programming and/or hard-to-follow code. (Example: Class member variables are <i>always</i> in scope while the object exists and thus 'wasting' memory, so class member variables should <i>never</i> be used if you are optimizing for memory. Needless to say, that can make code considerably more difficult to follow and maintain. See the sketch after this list.)
3) Optimizing for memory reusability (the peak of which was done in the days when 640KB was <i>all</i> you had available) results in not only a minor loss in performance, but also very hard-to-follow code.
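To illustrate point 2 with a hypothetical example (the class and names are mine, purely for illustration): a member buffer stays allocated for the whole lifetime of the object, 'wasting' memory between calls, but the code is clearer and faster than re-allocating a local buffer on every call.

<pre>
#include <vector>

class Mixer {
public:
    void process(const std::vector<float>& in) {
        scratch_.assign(in.begin(), in.end());  // reuse the member buffer
        // ... work on scratch_ ...
    }
private:
    std::vector<float> scratch_;  // always in scope, "wasting" memory by design
};

// The memory-optimized version would declare the buffer locally inside
// process(), freeing it on every return but paying for an allocation each call.
</pre>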

In other words, optimizing for memory often hinders the software's performance and maintainability. Since most software developers are more concerned with performance than with memory usage, optimizing for memory just isn't often an option.

That aside, there still are some wasteful practices done in what can only be called 'bad code'. Those at the very least could be eliminated.

But let me give you something to think about: A screen resolution of 1600x1200 requires 1,920,000 individual pixels. At a 32-bit color depth, that requires 7,680,000 bytes of data. That's over 7MB <i>just</i> to display a desktop at 1600x1200@32bpp. Now say that we have perhaps four maximized windows open. Each of them consumes an additional 7MB <i>just</i> to display. Now we're up to 35MB for Windows (or Linux or whatever OS you want) <i>just</i> to <i>display</i> these programs, and this is actually just a fraction of the actual graphics resources needed to run a simple 'windowized' operating system on a modern PC.
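The arithmetic, spelled out (the "35MB" comes from rounding each surface down to 7MB before multiplying):

<pre>
#include <cstdio>

int main() {
    const long long width = 1600, height = 1200, bytesPerPixel = 4;  // 32bpp
    const long long surface = width * height * bytesPerPixel;  // 7,680,000 bytes
    const long long five = surface * 5;                         // desktop + 4 windows
    std::printf("one surface:   %lld bytes (~%.1f MB)\n", surface, surface / 1048576.0);
    std::printf("five surfaces: %lld bytes (~%.1f MB)\n", five, five / 1048576.0);
    return 0;
}
</pre>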

<font color=blue><pre>If you don't give me accurate and complete system specs
then I can't give you an accurate and complete answer.</pre></font>
 

Hoolio

Distinguished
Jun 26, 2002
Yeah, while that is very true about the optimisations, I was not just trying to apply it to memory, but to CPU optimisations too. Also, have you noticed the pointless games added to Excel, many times just because the programmers were bored? They should not be allowed to bloat out the programs in this way. And Microsoft seem to be a law unto themselves, adding features that are not required by the average user.


Anyway, back to the point: I was programming PICs (Microchip PIC microcontrollers) and we were given some code as part of the project. The C compiler for the PIC microcontroller did not run on my machine, so I programmed it entirely in assembler. As it turns out, when I compared the C compiler's assembler output to my assembler, mine was more concise, therefore requiring fewer instruction cycles and executing quicker. :) Now, I am not saying every programmer should program in ASM, but it would be nice to get some code that uses slightly fewer resources.


I do believe the memory issue is not too much of a problem. However, CPU utilisation is important: look at the price of a new CPU (very expensive), and now we get programs requiring 500 MHz processors (it was not even a game or graphics-editing program! Silly, eh?).
 

juin

Distinguished
May 19, 2001
Yes, software is getting cheap, but you are in the same boat as me: this raises the number of computers sold, thanks to the incompetence of programmers. So I have nothing against it.

[-peep-] french