Re: [linux-audio-dev] [ardour] custom graphics + MIDI control for LADSPA plugins

New Message Reply About this list Date view Thread view Subject view Author view Other groups

Subject: Re: [linux-audio-dev] [ardour] custom graphics + MIDI control for LADSPA plugins
From: David Olofson (david_AT_gardena.net)
Date: Sat Dec 02 2000 - 07:38:42 EET


On Saturday 02 December 2000 03:28, Paul Barton-Davis wrote:
> >> initialization. for example, a thread handling MIDI i/o or some
> >> equivalent creates a plugin instance (or recyles one),
> >> initializes it and then requests the RT thread to insert it into
> >> the list of current plugins. the only thing here that has to be
> >> RT safe is the handling of the requests, which can be done using
> >> the event system of MAIA or the old standby, a lock-free FIFO
> >> queue of requests.
> >
> >It *works* if you do it that way, but then you don't have bounded
> >response times. That may or may not be acceptable depending on
> > what kind of system you're building - if you're building a
> > sampler or synth for live use, it's most probably not acceptable.
>
> i don't agree. actually, there are three choices here:
>
> 1. the event that instantiates the plugin is handled in the audio
> thread

Not suitable for RT, in it's basic form.

> 2. the event that instantiates the plugin is handled by a
> non-audio thread

Ok for RT, but suboptimal and messy WRT latency control.

> 3. the plugin is already instantiated, and the event simply
> changes its internal state

Excellent for hard RT, but uses up extra memory, and may thus cause
extra cache misses. (Very minor problems in this case, I think.) This
is the way dedicated hardware usually works; some (older) synth DSPs
even have it hardcoded into their cores...

> In the case of (1), we have a possible audio glitch if the
> instantiation takes too long, but this is an unbounded period,
> known only to (if even then) the plugin (author). In case (2) we
> have a possible snafu if the time between the receipt of the event
> and the insertion into the active plugin list is too long. This too
> is unbounded, and may exceed the time in case (1) by a full engine
> cycle if it just so happened that we stuck it in the request queue
> at the end of a cycle. BUT (2) will never cause an audio glitch.
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Very good point, and that's why I finally concluded that this is the
way to go if you really want to do "RT" plugin instantiation. It's
relatively easy, and requires minimal plugin API support.

> Finally, case (3) is optimal: that is, events based on real-time
> events do not instantiate plugins, but merely change their internal
> state. Think of this as being the basis for the "bypass" switch on
> most (all?) VST plugins: you create the plugin, its gets inserted
> into the active queue. if its "on" then a MIDI noteOn or its
> abstracted equivalent (SKINI, anyone ?) causes it to allocate a new
> voice with RT-friendly semantics (note: a voice is generally
> cheaper than a plugin instance). its its off then it does nothing,
> and as an optimization, the host might even detect that and not
> even call its "run()" method.

This is the method I assumed any serious RT synth for pro/live use
would use. It's not what the average application programmer would
consider good design, but RT is a different world with different
rules.

> >I don't see how it would have to be that way. On the contrary, if
> >you're dynamically instantiating voice plugins, some *other*
> > plugin, or a part of the host, would have to deal with these
> > events. You can hardy operate a plugin that doesn't yet exist,
> > can you...?
>
> yes, you can. you're not distinguishing between the object known
> the host after a plugin is loaded from the one(s) known each time
> we instantiate. its far from implausible to have a system where
> whatever causes "instantiate a new plugin" to be delivered to "the
> plugin" delivers this message to this "non-instantiated" object.

Actually I didn't consider the first part of the plugin at all here...

Looking at the API; a plugin basically cannot have an interface
without being at least "partially" instantiated, as the whole
interface is implemented inside the process() function. Anything
dealing with messages to non-instantiated objects would have to be an
addition entirely outside the normal ways of talking to plugins.

> >> whats ugly about this is that it attaches semantics to messages
> >> *before* they get to the plugin. i mean, maybe "parameter 34
> >> changes to 23.45" *means* "start a new voice" for a given
> >> plugin. i think we just have to know when to say "no!" :)
> >
> >Well, if you send a MIDI Note-On to an application, the response
> >might be that it instantiates a few plugins, and then sends some
> >events to those plugins. (It probably will not pass the MIDI event
> >itself on, as the plugins won't understand MIDI, and/or the
> >application will translate the MIDI Note-On into various other
> >things, according to the instrument definition.)
> >
> >Then, imagine that the code that translates the MIDI events is in
> > a plugin instead of hardcoded into the host. The host would have
> > an API for plugin instantiation and connecting, and - guess what
> > - that API would be implemented on top of the same event system
> > that the plugins use for communication within the net! :-)
> >
> >Now it doesn't matter if the code that decides how to respond to
> > MIDI events is in the host, of in one or more plugins - just feed
> > MIDI into it, and connect the output to wherever you have your
> > net building toolkit.
>
> this doesn't get around my basic point which is that you are not
> leaving plugins to interpret external events freely. instead, you
> are first converting external events into a somewhat parsed
> condition, and then passing them to the plugins.

Indeed, and the whole point with that is that events are not all that
different from audio data; it should be possible to route and process
events just like any other data in the system.

I have a small modular synth engine with a "broadcast to voice tree"
event model lying around... Is that more similar to what you're
suggesting?

> if a plugin wanted
> to treat "parameter 34 is now 73.45" as a condition for new voice
> allocation, it may have a hard time doing that if its sent using
> the Property Protocol.

Not really, if the plugin should want to do that (just boobytrap the
Write events from the Property protocol with some extra code), but
the Property Protocol isn't meant to be used that way. Logical
reasoning: How does this fit with the idea that a Property is a kind
of variable?

Anyway; you can't broadcast over a subnet of plugins with the
Property or Instrument Control protocols, as they're basically
peer-to-peer, which might hit performance if you want to control many
things from the same output.

What is it you want to do, exactly?

//David

.- M u C o S -------------------------. .- David Olofson --------.
| A Free/Open Source | | Audio Hacker |
| Plugin and Integration Standard | | Linux Advocate |
| for | | Open Source Advocate |
| Professional and Consumer | | Singer |
| Multimedia | | Songwriter |
`-----> http://www.linuxdj.com/mucos -' `---> david_AT_linuxdj.com -'


New Message Reply About this list Date view Thread view Subject view Author view Other groups

This archive was generated by hypermail 2b28 : Sat Dec 02 2000 - 08:23:39 EET