Re: [LAD] Should LV2 Activate() clear MIDI CC parameters?

From: David Robillard <d@email-addr-hidden>
Date: Wed May 30 2012 - 04:00:37 EEST

On Tue, 2012-05-29 at 21:51 +0000, Fons Adriaensen wrote:
> On Mon, May 28, 2012 at 10:49:06PM -0400, David Robillard wrote:
>
> > > I'd say that the standard case here is to *keep* all the MIDI controller
> > > settings, not reset them. Just imagine that you're running a reverb
> > > which forgets all settings when you briefly deactivate it in order to
> > > listen to the dry signal. That would essentially render such a plugin
> > > totally useless. Or am I missing something here?
> >
> > It does seem like the most reasonable thing to do.
>
> I agree.
>
> This issue raises some other questions on how plugins could/should
> handle MIDI input. I offer the following for your consideration and
> comments.
>
> The four cases outlined below are in fact points on a continuous
> scale, with (1) and (4) being the extremes. The interesting range
> is between (2) and (3).
>
> 1. The host accepts MIDI input, selects on port, channel, controller
> number (e.g. by providing a 'learn' function), and converts selected
> messages to control port values. The plugins are never aware that their
> input is from MIDI.
>
> 2. As above, but the selected data is presented to the plugin either
> in MIDI format, or via some more generic 'event delivery' mechanism.
> The difference with (1) is that the plugin knows it is dealing with
> events.
>
> 3. The host provides the MIDI ports, but offers all it gets to any
> interested plugin. The plugin performs selection on port, channel
> and controller number.
>
> 4. The plugin provides its own MIDI input, completely independent
> from the host.

LV2 takes the simple non-choice strategy that Jack *really* should have
gone with as well: you can have inputs and outputs that are buffers of
time-stamped events with *whatever* in them.

In practice, this has really only been used for MIDI until recently.
Now more 'high level' things (like loading samples) are starting to be
done using non-MIDI events (e.g. LV2 'atoms', which are sort of a binary
JSON-like thing). Firing around POD blobs of whatever is fantastic.
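
To make that concrete, here's a rough sketch of what reading such a
buffer looks like in a plugin's run() callback using the LV2 atom
utilities. The plugin struct, the field names, and how the MIDI URID
got mapped are illustrative assumptions, not code from any particular
plugin:

    #include <stdint.h>
    #include <lv2/lv2plug.in/ns/ext/atom/atom.h>
    #include <lv2/lv2plug.in/ns/ext/atom/util.h>
    #include <lv2/lv2plug.in/ns/ext/urid/urid.h>

    /* Hypothetical plugin instance: the URID for midi:MidiEvent is
       assumed to have been mapped with the urid extension in
       instantiate(), and events_in connected by the host. */
    typedef struct {
        LV2_URID                 midi_MidiEvent;
        const LV2_Atom_Sequence* events_in;  /* time-stamped event buffer */
    } MyPlugin;

    static void
    handle_events(MyPlugin* self)
    {
        /* Walk every time-stamped event in the buffer, whatever it is. */
        LV2_ATOM_SEQUENCE_FOREACH(self->events_in, ev) {
            if (ev->body.type == self->midi_MidiEvent) {
                /* raw MIDI bytes, at frame offset ev->time.frames */
                const uint8_t* msg = (const uint8_t*)(ev + 1);
                (void)msg;
            } else {
                /* some other atom: an object, a "load this sample"
                   message, whatever the sender felt like firing at us */
            }
        }
    }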

While I would like to see something better than MIDI get established for
notes and such, there has been no interest in that so far. Unsurprisingly,
I don't think the core API should be burdened with all that.

> Some initial comments:
>
> * One could argue that (3) and (4) lead to a lot of non-trivial
> code and development effort being duplicated or repeated for
> each plugin. But that need not be the case: the plugin SDK could
> offer services that would make this (almost) as easy for the
> plugin developer as (1) or (2).
>
> * In (1,2) the selection of port/channel/controller are part of
> the host configuration, this data would normally be stored in
> the host's session file and the plugins are never aware of it.
> In (3,4), the selection criteria become part of the plugin's
> configuration, and could be included in e.g. plugin presets.
> That makes a difference for the user - but I'm uncertain as
> to which would be best.

I doubt any strong conclusion could be reached either way. The thing
is, there are all kinds of good reasons for plugins to want to process
streams of MIDI (or OSC, or...) events. Even if there were an
alternative, roughly equivalent system for it, this would still be true.

Something better than MIDI would be great, but not supporting MIDI would
surely doom any plugin API. Better to support <whatever> and let
somebody pioneer a better alternative when they're actually going to get
around to doing something with it (we can't really afford to be
idealistic on this one anyway; the lack of LAD synth plugins is bad
enough without adding that gigantic barrier to porting on top of it).

For example, a great Ardour project would be to support explicit note
control (e.g. the ability to draw a single note and pitch bend it as an
arbitrary vector, and apply controls to *that note alone*) and actually
send all that data to a synth plugin (which MIDI cannot do; OSC could,
and there are other options as well). A chicken & egg situation, though,
that could only be resolved by someone actually setting out to make that
happen. Hopefully some day they do.
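
Just to make the shape of that data concrete (none of this exists; the
names and fields are pure illustration), per-note control basically
means every event carries a handle to one note instance rather than a
channel number:

    #include <stdint.h>

    /* Hypothetical event types, only to show the idea. */
    typedef struct {
        float time;    /* position within the note, in seconds */
        float pitch;   /* absolute pitch at that point, e.g. in Hz */
    } NoteBendPoint;

    typedef struct {
        uint32_t      note_id;   /* handle of the one note this applies to */
        uint32_t      n_points;
        NoteBendPoint points[];  /* the arbitrary bend vector drawn by the user */
    } PerNoteBend;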

> * I'd expect (1,2) for e.g. individual modular synth modules,
> it would feel strange if each individual one would implement
> (3,4). OTOH, it wouldn't feel so odd if e.g. a Linuxsampler
> or Yoshimi plugin would do that. [*]

Modulars can conveniently get around it by having separate plugins to do
whatever filtering the user wants. Probably the only "truly modular"
way to do it, really.

LinuxSampler and Yoshimi are good examples of large, fully featured
things that already exist, and though existing code is
troublesome/irritating in an endless number of ways, I think that,
practically speaking, it's clear this stuff has to work.

> * My tentative conclusion would be that a plugin standard
> should provide both (2) and (3), and maybe something in
> between those.

Thus far, 'the market' agrees, so that's probably the right conclusion.

> [*] Some tricky questions to disturb your sleep:
>
> We now have a few plugins that are in fact complete multitimbral
> instruments. Suppose someone would first add LV2 support to AMS
> (it can use plugins as if they were native modules), then turn
> the complete AMS itself into an LV2 plugin.
>
> - Should AMS then be able to load itself ???
> - What impact would that have on the things discussed above ???

Ingen is actually just that: an LV2 plugin that can load LV2 plugins.
It always saves to valid LV2 plugins (i.e. a saved Ingen bundle *is* an
LV2 plugin, if always with the same binary). It is theoretically
possible for it to load itself, but this would be insane for probably
obvious reasons. I haven't really done the UI polish here yet, but what
it does is simply detect that an LV2 plugin is actually an Ingen patch
and load it the way Ingen patches are normally loaded, rather than as a
separate LV2 plugin via the generic interface.

These days I consider Ingen largely a "plugin development without
writing code" project (ideally I'd like there to be no user-visible
distinction between an Ingen patch and a hand-written LV2 plugin at
all). The biggest hole right now is that LV2 lacks dynamic
ports/parameters. This means that if you drop a patch in an Ardour
strip, you're stuck with the inputs/outputs of the template you loaded -
including controls, which is quite limiting.

Coincidentally, this is one hidden motivation behind 'events for
parameters' that I haven't yet mentioned: if you multiplex all the
controls onto a single event buffer [1], you get dynamic parameters for
free, in what seems to be a dramatically simpler way than trying to
erect a bunch of API specifically for it.
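
A rough sketch of what that buys you, assuming a ControlChange event
along the lines of [1] below (and ignoring the interpolation fields for
brevity): the plugin's parameter set is just whatever param_num values
it chooses to handle, so it can change without any ports appearing or
disappearing:

    /* Assumed event layout, a trimmed-down version of [1] below. */
    typedef struct {
        int   param_num;   /* the MUX key: which parameter this targets */
        int   time;        /* frame offset within the current block */
        float value;
    } ControlChange;

    /* Apply a block's worth of multiplexed control events.  Nothing
       here is tied to a fixed port list; recognising a new param_num
       is all it takes to grow the parameter set. */
    static void
    apply_controls(const ControlChange* ev, unsigned n_events,
                   float* params, unsigned n_params)
    {
        for (unsigned i = 0; i < n_events; ++i) {
            if ((unsigned)ev[i].param_num < n_params) {
                params[ev[i].param_num] = ev[i].value;
            }
        }
    }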

It's probably worth mentioning that Ingen-as-an-LV2-plugin can be
*entirely* controlled via the plugin's event input port using the
above-mentioned atom messages. The patcher UI works through this
mechanism, i.e. there is no sneaky business happening outside the plugin
API to make that work. Precisely the same protocol, but in a text
serialisation instead of the binary one used in the LV2 port, is used
when you run Ingen and its UI as separate processes over the network (it
looks suspiciously like fragments of the save files...).

Generic POD events are an extremely good thing, because they let crazy
people like me do things like this :)

-dr

[1] e.g. struct ControlChange {
    int   param_num;   /* which parameter; allows MUXing many on one buffer */
    int   time;        /* frame offset of this value */
    float value;
    int   next_time;   /* following breakpoint, e.g. to interpolate towards */
    float next_value;
}; /* or whatever... */
