Re: [linux-audio-dev] [ardour] custom graphics + MIDI control for LADSPA plugins


Subject: Re: [linux-audio-dev] [ardour] custom graphics + MIDI control for LADSPA plugins
From: Paul Barton-Davis (pbd_AT_Op.Net)
Date: Sat Dec 02 2000 - 04:28:08 EET


>> initialization. for example, a thread handling MIDI i/o or some
>> equivalent creates a plugin instance (or recycles one), initializes
>> it and then requests the RT thread to insert it into the list of
>> current plugins. the only thing here that has to be RT safe is the
>> handling of the requests, which can be done using the event system
>> of MAIA or the old standby, a lock-free FIFO queue of requests.
>
>It *works* if you do it that way, but then you don't have bounded
>response times. That may or may not be acceptable depending on what
>kind of system you're building - if you're building a sampler or
>synth for live use, it's most probably not acceptable.

I don't agree. Actually, there are three choices here:

  1. the event that instantiates the plugin is handled in the audio
     thread
     
  2. the event that instantiates the plugin is handled by a non-audio
     thread

  3. the plugin is already instantiated, and the event simply changes
     its internal state

In case (1), we have a possible audio glitch if the instantiation
takes too long, and this period is unbounded, known only (if even
then) to the plugin's author. In case (2), we have a possible snafu if
the time between the receipt of the event and the insertion into the
active plugin list is too long. This too is unbounded, and may exceed
the latency of case (1) by a full engine cycle if we happened to put
the request in the queue at the end of a cycle. BUT (2) will never
cause an audio glitch.
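The handoff in case (2) can be sketched as the lock-free FIFO
mentioned above: the MIDI thread instantiates and initializes the
plugin (possibly blocking), then hands it off through a
single-producer/single-consumer queue that the audio thread drains
with bounded work. This is only a sketch under assumptions: the names
(plugin_t, request_fifo_t) are hypothetical, and production code would
make the indices C11 atomics with explicit memory ordering.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical plugin handle; stands in for a LADSPA instance. */
typedef struct plugin { int id; } plugin_t;

#define QSIZE 16  /* power of two, so indices wrap with a mask */

/* Single-producer/single-consumer FIFO: the MIDI thread pushes
 * fully-initialized plugin instances, the RT audio thread pops them.
 * With exactly one reader and one writer, separate read/write indices
 * make this lock-free (real code: _Atomic indices, acquire/release). */
typedef struct {
    plugin_t *slot[QSIZE];
    unsigned  write;   /* advanced only by the MIDI thread  */
    unsigned  read;    /* advanced only by the audio thread */
} request_fifo_t;

/* MIDI thread: enqueue an already-instantiated plugin. All blocking
 * work happens before this call; the call itself is wait-free.
 * Returns 0 when the queue is full (caller retries later). */
static int fifo_push(request_fifo_t *q, plugin_t *p)
{
    unsigned next = (q->write + 1) & (QSIZE - 1);
    if (next == q->read)
        return 0;
    q->slot[q->write] = p;
    q->write = next;            /* publish after the slot is written */
    return 1;
}

/* Audio thread: pop the next request, or NULL if empty.
 * Bounded time, no locks, no allocation: safe per RT cycle. */
static plugin_t *fifo_pop(request_fifo_t *q)
{
    if (q->read == q->write)
        return NULL;
    plugin_t *p = q->slot[q->read];
    q->read = (q->read + 1) & (QSIZE - 1);
    return p;
}
```

The up-to-one-cycle extra latency in case (2) corresponds to a request
pushed just after the audio thread last drained the queue.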

Finally, case (3) is optimal: events based on real-time input do not
instantiate plugins, but merely change their internal state. Think of
this as the basis for the "bypass" switch on most (all?) VST plugins:
you create the plugin, and it gets inserted into the active queue. If
it's "on", a MIDI noteOn or its abstracted equivalent (SKINI,
anyone?) causes it to allocate a new voice with RT-friendly semantics
(note: a voice is generally cheaper than a plugin instance). If it's
off, it does nothing, and as an optimization, the host might even
detect that and skip calling its "run()" method.
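Case (3) can be sketched as a pre-allocated voice pool inside an
already-instantiated plugin: noteOn claims a free voice with no
allocation and no locks, and the bypass switch turns the event into a
no-op. The names here (synth_t, synth_note_on) are hypothetical, and a
real synth would add a voice-stealing policy when the pool is
exhausted.

```c
#include <assert.h>
#include <stddef.h>

#define MAX_VOICES 8

/* A voice: cheaper than a plugin instance, preallocated up front. */
typedef struct { int active; int note; } voice_t;

typedef struct {
    int     bypassed;           /* the "bypass" switch */
    voice_t pool[MAX_VOICES];   /* allocated at instantiation time */
} synth_t;

/* RT-friendly noteOn: no malloc, no locks, a bounded scan of the
 * pool. Returns the claimed voice, or NULL when bypassed or when
 * every voice is busy (voice-stealing policy elided). */
static voice_t *synth_note_on(synth_t *s, int note)
{
    if (s->bypassed)
        return NULL;            /* "off": the event is ignored */
    for (int i = 0; i < MAX_VOICES; i++) {
        if (!s->pool[i].active) {
            s->pool[i].active = 1;
            s->pool[i].note = note;
            return &s->pool[i];
        }
    }
    return NULL;
}
```

A host could implement the run() optimization mentioned above by
checking the bypassed flag (or an equivalent "any active voices?"
query) before calling into the plugin at all.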

>I don't see how it would have to be that way. On the contrary, if
>you're dynamically instantiating voice plugins, some *other* plugin,
>or a part of the host, would have to deal with these events. You can
>hardly operate a plugin that doesn't yet exist, can you...?

Yes, you can. You're not distinguishing between the object known to
the host after a plugin is loaded and the one(s) created each time we
instantiate it. It's far from implausible to have a system where
whatever causes "instantiate a new plugin" to be delivered to "the
plugin" actually delivers that message to this "non-instantiated"
object.
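That "non-instantiated" object is essentially a per-plugin-type
descriptor, or factory: the thing the host holds after loading the
shared object but before any instance exists. LADSPA's own
LADSPA_Descriptor, with its instantiate() callback, works exactly this
way. A minimal sketch, with hypothetical names (descriptor_t,
make_instance):

```c
#include <assert.h>
#include <stdlib.h>

/* A running plugin instance: none of these exist yet at load time. */
typedef struct instance { int rate; } instance_t;

/* The object known to the host after loading: it can receive an
 * "instantiate a new plugin" message even though no instance exists. */
typedef struct {
    const char *label;
    instance_t *(*instantiate)(unsigned long sample_rate);
} descriptor_t;

static instance_t *make_instance(unsigned long rate)
{
    instance_t *h = malloc(sizeof *h);
    if (h)
        h->rate = (int)rate;
    return h;
}

/* One descriptor per plugin type, exported by the shared object. */
static const descriptor_t gain_descriptor = { "gain", make_instance };
```

The event system can therefore route an instantiation request to the
descriptor exactly as it routes any other event to a live instance.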

>> whats ugly about this is that it attaches semantics to messages
>> *before* they get to the plugin. i mean, maybe "parameter 34
>> changes to 23.45" *means* "start a new voice" for a given plugin. i
>> think we just have to know when to say "no!" :)
>
>Well, if you send a MIDI Note-On to an application, the response
>might be that it instantiates a few plugins, and then sends some
>events to those plugins. (It probably will not pass the MIDI event
>itself on, as the plugins won't understand MIDI, and/or the
>application will translate the MIDI Note-On into various other
>things, according to the instrument definition.)
>
>Then, imagine that the code that translates the MIDI events is in a
>plugin instead of hardcoded into the host. The host would have an API
>for plugin instantiation and connecting, and - guess what - that API
>would be implemented on top of the same event system that the plugins
>use for communication within the net! :-)
>
>Now it doesn't matter if the code that decides how to respond to MIDI
>events is in the host, or in one or more plugins - just feed MIDI
>into it, and connect the output to wherever you have your net
>building toolkit.

This doesn't get around my basic point, which is that you are not
leaving plugins free to interpret external events. Instead, you are
first converting external events into a partially parsed form, and
then passing them to the plugins. If a plugin wanted to treat
"parameter 34 is now 73.45" as a trigger for new-voice allocation, it
may have a hard time doing that if the event is sent using the
Property Protocol.

--p



This archive was generated by hypermail 2b28 : Sat Dec 02 2000 - 05:11:36 EET