Re: [linux-audio-dev] Paper on dynamic range compression

From: Alfons Adriaensen <fons.adriaensen@email-addr-hidden>
Date: Thu Oct 05 2006 - 11:45:54 EEST

On Wed, Oct 04, 2006 at 10:51:08PM -0500, Andres Cabrera wrote:

> I've written a paper analyzing the characteristics of some software
> dynamic range compressors. The interesting part concerning linux, is
> that all my results show jaggedness on the gain reduction curves. I did
> the tests using Ardour, Audacity and Rezound with the same results,
> which points to something strange in some part of the process.

Or in your measurement methods, which are ill-defined, making it all but
impossible to interpret some of the results correctly.

- How do you measure gain? By comparing single input/output samples?
  It seems so; otherwise, how would you obtain a gain vs. time curve at
  50 Hz with sub-millisecond resolution (Fig. 3)?
  This will be invalid in many cases, for example if the plugin delays
  the audio to achieve 'zero latency', as you suggest some of them do.

- This delay is the first thing that should be measured. Without
  this information it is impossible to evaluate the results.
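
  Measuring it is simple; a minimal sketch (mine, with an assumed
  7-sample pure-delay "plugin" standing in for the real thing) is to
  cross-correlate the plugin's output with its input and take the lag
  of the correlation peak:

```python
import random

random.seed(1)
TRUE_DELAY = 7
N = 256

x = [random.uniform(-1.0, 1.0) for _ in range(N)]   # noise test signal
y = [0.0] * TRUE_DELAY + x[:N - TRUE_DELAY]         # "plugin": pure delay

def xcorr_delay(inp, out, max_lag=32):
    """Return the lag that maximizes the input/output cross-correlation."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(max_lag + 1):
        v = sum(inp[n] * out[n + lag] for n in range(len(inp) - lag))
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag

print(xcorr_delay(x, y))   # 7
```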

- How on earth can you define the level of a white noise signal by a
  peak value?
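
  The RMS level of white noise is well-defined, but its peak value is a
  random variable that keeps growing with the observation length, so a
  "peak level" pins down almost nothing. A quick demonstration (my own
  numbers, not the paper's):

```python
import math, random

random.seed(0)
noise = [random.gauss(0.0, 0.25) for _ in range(100_000)]

rms = math.sqrt(sum(v * v for v in noise) / len(noise))
peaks = [max(abs(v) for v in noise[:n]) for n in (100, 1_000, 100_000)]

print(round(rms, 3))   # ~0.25: stable and meaningful
print(peaks)           # tends to grow with the window length
```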

- What is a square wave at 0 dBFS? Positive and negative samples at
  the maximum amplitude? That does not correspond to an analog square
  wave signal.
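
  A band-limited square wave, which is the only kind an analog system
  can carry, overshoots its flat top (the Gibbs phenomenon), so the
  analog waveform represented by samples alternating between plus and
  minus full scale actually exceeds full scale. A sketch of mine, using
  a truncated Fourier series:

```python
import math

HARMONICS = 25          # odd harmonics 1, 3, ..., 49
POINTS = 20_000

def bl_square(x, n_harm=HARMONICS):
    """Partial Fourier sum of a unit (+/-1) square wave."""
    return (4 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_harm)
    )

peak = max(bl_square(2 * math.pi * i / POINTS) for i in range(POINTS))
print(peak)   # well above 1.0: roughly a 9 % overshoot of the jump
```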

- How do you expect to measure distortion using square waves?

-- 
FA
Lascia la spina, cogli la rosa.
Received on Thu Oct 5 12:15:04 2006