Re: [linux-audio-dev] Plugin madness


Subject: Re: [linux-audio-dev] Plugin madness
From: David Olofson (audiality_AT_swipnet.se)
Date: Fri Oct 01 1999 - 18:38:21 EDT


On Fri, 01 Oct 1999, Guenter Geiger wrote:
> As it seems it is getting calm around plugins again, everyone agrees
> that they disagree ... and again an ambitious project for linux audio
> died.
>
> No !

Right, and I'm not giving up here. There is a reason why I'm writing an order
of magnitude more English than C.

(However, I need to get to bed soon, as I'm off to Denmark tomorrow. Will be
back Sunday, perhaps with more ideas.)

> Ok, let's summarize. What do we actually need (or better, what do
> plugin writers need)? We need to agree on a common level which will
> satisfy the needs for writing plugins.
>
> It's really damn easy, no matter which audio engine, sound editor,
> soft synth or whatever is built around it. A plugin is nothing more
> than a simple routine which modifies some signals, together with some
> properties plugins share with each other, and parameters, or
> k-rate signals, or events, or whatever you call them.

Seems simple on the high level, but the complicated part is doing it
efficiently, and in a flexible enough way, in low level code... We really need
to work hard on this, or we'll just end up with another useless "standard" that
won't be adopted.

> Parameters
> ==========
>
> The point on which people seem to disagree are these "parameters".
> Let's look at it from the plugin programmer's point of view.
> He just wants to have some variable (let's say of type "double")
> where he stores, e.g., his filter frequency.
>
> The problem is that this variable has to be communicated to the
> application, so that the application can handle and change it.
>
>
> Ways to do it:
>
> 1) VST plugin like
> The plugin programmer declares the variables he wants and communicates
> them via some setparameter(), getparameter() calls.
>
> --> not very efficient, as there is a function call for each parameter
> change.

Yes, inefficient, clumsy, not very clean and nice, and it still doesn't handle
sample accuracy without splitting buffers.
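The VST-like approach above might be sketched like this (all names here are invented for illustration, not taken from any real SDK):

```c
/* Hypothetical sketch of option 1: the host changes parameters only
 * through accessor calls, so every change costs a function call into
 * the plugin, and changes can only take effect between process() calls. */

#define FILTER_FREQ 0   /* invented parameter index */

typedef struct {
    double freq;        /* e.g. filter cutoff in Hz */
} Plugin;

void set_parameter(Plugin *p, int index, double value)
{
    if (index == FILTER_FREQ)
        p->freq = value;
}

double get_parameter(const Plugin *p, int index)
{
    return (index == FILTER_FREQ) ? p->freq : 0.0;
}
```

The host has no way to say *when* within a buffer the change should happen, which is why sample accuracy requires splitting buffers around each call.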

> 2) Quasimodo like
> - the parameters are communicated as pointers, the application
> can store them and access the parameters directly.
>
> --> better, more flexible.

A *lot* nicer, and much more efficient. Still no sample accuracy without buffer
splitting, though...
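A sketch of the pointer-based scheme (again, the names are assumptions, not the actual Quasimodo API):

```c
/* Hypothetical sketch of option 2: the plugin exports the address of
 * its parameter once, and the host then writes to it directly, with
 * no function call per change. */

typedef struct {
    double freq;   /* e.g. filter cutoff in Hz */
} Plugin;

/* Called once at instantiation; the host keeps the pointer. */
double *export_parameter(Plugin *p)
{
    return &p->freq;
}
```

Cheaper per change than get/set calls, but the change still lands "whenever the host writes", with no timing information attached.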

> 3) Events system
>
> <To be filled out by David>, I still don't know how this will be handled.

Well, basically, it's about turning events into data of a format similar to the
audio buffers. That is, you build a buffer of timestamped events to be executed
during the processing of the buffer, and then you send it all to the process()
callback with a single call. Event timing is independent of buffer size, and
the process() function can send events to plug-ins down the processing net,
still with sample accuracy.
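A minimal sketch of the idea, assuming invented type and function names (the real design was still under discussion):

```c
/* Hypothetical sketch of option 3: events carry a sample-accurate
 * timestamp relative to the start of the audio block.  The host fills
 * an event buffer and makes ONE process() call per block; the plugin
 * applies each event at its exact sample offset. */
#include <stddef.h>

typedef struct {
    unsigned timestamp;   /* sample offset within this block */
    int      param;       /* which parameter to change */
    double   value;       /* new value */
} Event;

typedef struct {
    double freq;
} Plugin;

/* Events are assumed sorted by timestamp.  A real plugin would
 * dispatch on ev[e].param; here there is only one parameter. */
void process(Plugin *p, float *buf, unsigned nframes,
             const Event *ev, size_t nev)
{
    size_t e = 0;
    for (unsigned i = 0; i < nframes; ++i) {
        while (e < nev && ev[e].timestamp == i) {
            p->freq = ev[e].value;            /* sample-accurate change */
            ++e;
        }
        buf[i] *= (float)(p->freq / 1000.0);  /* trivial stand-in "DSP" */
    }
}
```

The per-sample event check is a cheap comparison in the inner loop, so event timing costs almost nothing when the event buffer is empty.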

> Sample accuracy:
> =================
>
>
> 1) Fixed buffer size, parameters are set with "sample offset" values.
> The VST way
> bad --> sample accuracy is only achievable by doing it in the process
> function

Fixed buffer size is good for optimization, but not very nice if you can't pass
accurate timing info...

> 2) Buffersize is passed as a process argument:
>
> --> the best of both worlds.
> some applications want to have fixed buffers, and change parameters
> directly between process() calls
>
> other applications may implement an events system and call the process
> function with different buffer sizes (up to single sample processing).

...which leads to lots of function call overhead when there's heavy event
traffic... Inline code will have to be slow to be beaten by such a design,
performance-wise.

With a timestamped/buffered event system, you can, in a way, do both.
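The overhead of the variable-buffer-size approach can be seen from the host's side (a sketch with invented names; the callback signature is an assumption):

```c
/* Hypothetical host-side splitting for option 2: to get sample
 * accuracy, the host must cut the block at every event time and make
 * one process() call per fragment.  With N events in a block that is
 * up to N+1 calls, versus one call with a timestamped event buffer. */
#include <stddef.h>

typedef void (*process_fn)(float *buf, unsigned nframes);

/* Returns the number of process() calls made, to show the cost. */
unsigned run_split(process_fn process, float *buf, unsigned nframes,
                   const unsigned *event_times, size_t nev)
{
    unsigned calls = 0, start = 0;
    for (size_t e = 0; e <= nev; ++e) {
        unsigned end = (e < nev) ? event_times[e] : nframes;
        if (end > start) {
            process(buf + start, end - start);
            ++calls;
        }
        start = end;
        /* ...here the host would apply event e's parameter change... */
    }
    return calls;
}
```

In the degenerate case of an event on every sample, this collapses to single-sample processing, which is exactly the "heavy event traffic" overhead mentioned above.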

> (NOTE: at this place I *can* imagine an events system, we may even
> add an
>
>
> Application Library
> ===================
>
> where the whole events system is implemented, and probably other
> solutions, multithreading and the like.

For my design, this is mostly inline code funcs/macros, and a few function
calls to use in special cases.

> Ok, enough for now. I know we haven't touched the GUI topic, but first
> let us agree upon the Plugin API.

Yep. I think the GUI part belongs pretty far from the DSP part anyway. Some
plug-ins may not even have a GUI, and some kinds of systems won't have much of
a UI at all. What about game sound engines and embedded systems, for example?

//David

 ·A·U·D·I·A·L·I·T·Y·   P r o f e s s i o n a l   L i n u x   A u d i o
- - ------------------------------------------------------------- - -
    ·Rock Solid      David Olofson:
    ·Low Latency     www.angelfire.com/or/audiality    ·Audio Hacker
    ·Plug-Ins        audiality_AT_swipnet.se           ·Linux Advocate
    ·Open Source                                       ·Singer/Composer



This archive was generated by hypermail 2b28 : Fri Mar 10 2000 - 07:27:12 EST