Subject: Re: [linux-audio-dev] plugin ideas based on Guenter's mix plugin API
From: David Olofson (audiality_AT_swipnet.se)
Date: Thu Aug 26 1999 - 21:56:28 EDT
Well, things are really moving here! :-)
First, I have a few comments related to those Eric posted...
est_AT_hyperreal.org wrote:
(...)
> * The whole thing should move to a class/instance paradigm so I can
> instantiate multiple instances of a plugin. Parameter descriptions
> should be in the class and their values in the instance, etc.
Agree. Absolutely essential.
> * The only global defined by a plugin should be a uniquely named
> function that returns a plugin class. The idea here is to make it
> possible to use static as well as dynamic linking.
Nope. For kernel modules, it only makes sense to export init_module()
and cleanup_module(), and init_module() has to
register_plugin(&audioeff_class) to tell the engine about the new class.
This eliminates name space conflicts, as no symbols are exported. Also,
a module/library can register multiple plug-in classes this way.
The registered audioeff_class should have a few callback functions,
including calls for creating and deleting instances.
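A minimal sketch of what that registration scheme could look like in C. All names here (register_plugin(), plugin_class, the gain example) are illustrative, not an actual Audiality API; the point is that the module exports nothing but init_module()/cleanup_module(), and the class struct carries the instance-management callbacks:

```c
#include <stddef.h>

typedef struct plugin_class {
    const char *name;
    void *(*create_instance)(void);         /* allocate + init an instance */
    void  (*destroy_instance)(void *inst);  /* tear it down again */
} plugin_class;

/* Engine-side registry (toy version: one slot). */
static const plugin_class *registered;

int register_plugin(const plugin_class *c)
{
    if (!c || !c->create_instance || !c->destroy_instance)
        return -1;
    registered = c;
    return 0;
}

/* What a module's init_module() would do: */
static void *gain_create(void) { static int inst; return &inst; }
static void  gain_destroy(void *inst) { (void)inst; }

static const plugin_class gain_class = { "gain", gain_create, gain_destroy };

int init_module(void) { return register_plugin(&gain_class); }
```

Note that since init_module() calls into the engine rather than the other way around, one module can call register_plugin() any number of times to register several classes.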
> * String parameters should be provided for.
On the contrary: as much as possible of the stuff not directly related to
signal processing should be kept *out* of the DSP part of the plug-in
spec. User interface stuff goes elsewhere, no matter whether we're
dealing with a user space application or the RTLinux Audiality engine.
Separation is good for stability. (You know what DirectX is like, right?
Let's not do it that way...)
> * We need a flag to say whether a parameter can be modified in between
> any two calls to the process method as opposed to being settable only
> for initialization purposes.
Hmm... My plans are very different. I don't want parameters in the low
level plug-in spec at all. There should be only time stamped events and
data streams. Parameters are realized through an event based interface,
which means
1) Only one interface + subsystem for automation, MIDI-style
events and system control events.
2) Events and parameter changes are independent of buffer sizes.
3) *Everything* can be defined in relation to real time.
(A way for plug-ins to tell the engine which events they
support, and their real time characteristics is needed.)
> * The number of channels per input and output signal should be a
> plugin property vector. Counts of frames and samples should be
> clearly distinguished throughout.
IMHO, input and output signal should always be mono, because anything
else completely breaks the flexibility of free signal routing. Unless
you automatically throw in split/merge plugs where needed... I don't
think that's a good idea from the performance perspective.
OTOH, there might be a point in using multichannel bus formats for some
kinds of plug-ins, so supporting it isn't a bad idea. It can result in
faster code as long as everything fits nicely together.
> * The sample rate should be optionally settable at instance creation
> time and the input and output buffer sizes should be optionally
> settable per call to the process method. I usually code in a way that
> can accomodate this, but I agree that it's important to support less
> flexible plugins as well.
The basic rule is that the engine gets to decide what sample rate to
use. Complex processing nets where plug-ins have various smart ideas
about what sample rates to use just become a mess of lost quality and
wasted CPU power...
> * To handle resampling, variable output buffer sizes are important.
> It's also important that the process method can report how many output
> frames it generated (is this the function of the return value of the
> process method?). It may even be important that the plugin can
> be queried as to how many output frames *will* be generated to avoid
> overruns due to disagreements about fencepost issues.
DO NOT adapt output buffer size to the inputs!!! It's doing it all
backwards.
This will not work in a low latency real time system, as there's no way
in h*ll to keep track of all plug-ins that happen to deliver a few
samples less than expected and need another call, which in turn requires
another bunch of source data buffers to be generated first... Forget it.
The algorithm is: "I need X samples. How many input samples do you need
for each channel?"
That's what this section of my little draft is for:
int look_ahead; /* Number of extra frames needed
* after the end of input buffers.
* Can be negative.
*/
int skip_behind; /* Number of frames skipped
* at the start of input buffers.
					* (That is, inputs[n]+=skip_behind
* before process_XXX() is called.)
* Can be negative.
*/
Either that, or sample rate and data format conversion cannot be done in
real time without extra latency and an incredibly stimulating complexity
in the engine.
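A sketch of how the engine would use those two fields in the pull model. Beyond look_ahead and skip_behind (which come from the draft above), all names are illustrative, and a 1:1 sample rate is assumed for simplicity:

```c
/* Per-plug-in I/O requirements, as in the draft spec fragment. */
typedef struct io_spec {
    int look_ahead;    /* extra frames needed past the end of input */
    int skip_behind;   /* frames skipped at the start of input */
} io_spec;

/* "I need out_frames samples. How many input samples do you need?"
 * The engine calls this before process() so it can generate exactly
 * the right amount of source data first. */
static int input_frames_needed(const io_spec *s, int out_frames)
{
    return out_frames + s->look_ahead - s->skip_behind;
}
```

Because the answer is known *before* process() is called, the engine never has to cope with a plug-in coming up short mid-cycle, which is exactly the failure mode described above.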
> * The popup method should probably be expanded to a vector of generic
> gui methods.
Nope. Drop it completely. There's no reason at all to have it in the
same file as the signal processing code. And you *CAN'T* unless you
restrict yourself to user space.
Besides, separating the GUI code from the processing code means that a
GUI crash doesn't even have to affect the signal processing - the GUI
process can just do an Explorer; that is, restart and hope the user
didn't notice... And you also get the bonus of knowing for sure what
part of your latest beta plug-in caused that crash after 3 hours of
correct operation while trying to reproduce that reported bug. And you
can skip the GUI code entirely and use a simple markup script to tell
the plug-in GUI engine what parameters there are, and how to display
them. No GUI code to port.
> * Memory allocation hooks are probably desirable. That way I can get
> a plugin to throw bad_alloc even if it knows nothing about C++.
Yes, indeed. Also, it has to be strictly defined *when* you can allocate
and free memory, as those are not real time operations. It's possible to
design a fast, non-fragmenting real time memory manager with a locked
memory pool for those who really need it, but dynamic allocation is
generally considered a no-no in hard real time systems. I'll have
something like that in the Audiality spec, but plug-ins are STRONGLY
discouraged from using it. It's an expensive resource...
Memory allocation/deallocation should be done in non real time context
as far as possible.
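A sketch of the allocation-hook idea: the host supplies the allocator, so a plug-in never calls malloc() directly, and the host can enforce *when* allocation is legal (non real time context only). All names are illustrative:

```c
#include <stdlib.h>

typedef struct host_alloc {
    int in_rt_context;   /* host sets this around process() calls */
    void *(*alloc)(struct host_alloc *h, size_t size);
    void  (*free_)(struct host_alloc *h, void *p);
} host_alloc;

static void *checked_alloc(host_alloc *h, size_t size)
{
    if (h->in_rt_context)
        return NULL;     /* refuse: allocating in RT context is a no-no */
    return malloc(size);
}

static void checked_free(host_alloc *h, void *p)
{
    (void)h;
    free(p);
}
```

Since the plug-in only ever sees the function pointers, a C++ host could make its hook throw bad_alloc on failure while a C plug-in just checks for NULL, which is the point Eric raised.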
That's about what I have to say right now... I'll expand on my ideas for
the event passing/routing system later.
//David
This archive was generated by hypermail 2b28 : Fri Mar 10 2000 - 07:25:53 EST