Re: [LAD] LV2 Achievement of GMPI Requirements

From: David Robillard <d@email-addr-hidden>
Date: Sat Aug 04 2012 - 05:27:05 EEST

On Sat, 2012-08-04 at 10:10 +1200, Jeff McClintock wrote:
> > I think you are in error considering these things mutually exclusive.
> > Yes, hosts dealing with MIDI binding is how things should be done, but
> > crippling a plugin API to not be able to handle MIDI is just that:
> > crippling. Maybe I want to patch up a bunch of plugins to process MIDI
> > events, or have some MIDI effect plugins: these are certainly
> > reasonable things to do.
>
> Hi dr,
>
> I think we mis-communicated. MIDI is *fully* supported, including SYSEX,
> delivered to plugins as raw unmolested bytes. Plugins can and do function as
> MIDI processors.
> The 'Guitar de-channelizer' is supplied as an example MIDI processor with
> the SDK, as is the 'MIDI to Gate' plugin.
>
> The idea of binding MIDI to the plugin's parameters is a purely optional
> alternative.

Sure. Sorry, I was basically ranting against the concept of not
supporting MIDI at all, which does come up from time to time, not
against what you've actually done.

> > LV2 UIs are also like this, though there is an extension to provide a
> > pointer to the plugin instance to the UI.
> >
> > In theory this should only be used for displaying waveforms and such,
> > and always be optional.
>
> The way I display waveforms is that the API has a function
> sendMessageToGui(), which sends an arbitrary bunch of bytes to the GUI
> in a thread-safe manner. You can build on that to send waveforms etc.
> Neither DSP nor GUI needs a pointer to the other (but they can if they
> *really* want to).

Yes, even for this using a proper mechanism is better, though when the
possibility of cross-process or cross-machine operation comes up, you
have to start worrying about how much data is being tossed around. My
serialization stuff does not currently deal with actual audio. This is
largely untrodden territory for LV2, though somebody is tinkering with
it right now. That said, in most hosts (which do not have a process
barrier there) doing this will simply work, because they don't care
what those blobs of bytes are whatsoever. So in this respect it is the
same as your API.
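
To make that concrete: the pattern on the DSP side is roughly the
following. This is just a sketch of the idea, not Jeff's actual API;
I'm borrowing JACK's ring buffer since we're on LAD.

    /* Sketch: DSP-to-GUI messaging as opaque bytes through a
     * lock-free ring buffer.  Neither side holds a pointer to the
     * other, and the host never needs to look inside the blob. */
    #include <jack/ringbuffer.h>
    #include <stdint.h>

    typedef struct {
        uint32_t type;  /* plugin-defined tag, e.g. "waveform chunk" */
        uint32_t size;  /* payload size in bytes */
    } MsgHeader;

    /* Audio thread: post header then payload, or drop the message.
     * Never block in here. */
    static int send_message_to_gui(jack_ringbuffer_t* rb, uint32_t type,
                                   const void* body, uint32_t size)
    {
        const MsgHeader head = { type, size };
        if (jack_ringbuffer_write_space(rb) < sizeof(head) + size)
            return -1;  /* full: drop, don't stall the audio thread */
        jack_ringbuffer_write(rb, (const char*)&head, sizeof(head));
        jack_ringbuffer_write(rb, (const char*)body, size);
        return 0;
    }

The GUI thread reads a header, waits until the whole payload is
available, and consumes it. Since the payload is opaque, the same bytes
could just as well cross a socket when a process barrier does appear.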

To be honest I care *dramatically* more about other things than
whiz-bang GUIs. Notably, the big red section of the table that stems
from inheriting LADSPA's control-port mistake, i.e. not having
parameter events, is way up there.

> > Your argument sounds very obviously right because it's about numeric
> > parameters, but note and voice control is trickier. That involves
> > inventing a new, better event format.
>
> I will disagree and say MIDI note and voice control is pretty good,
> *provided* you support MIDI real-time tuning changes (an existing MIDI
> SYSEX command that can tune any note to any fractional pitch in
> real-time, AKA micro-tuning) ...and... support "Key-Based Instrument
> Control" (another little-known MIDI command that provides 128 per-note
> controllers).

There is no voice control in the sense that I mean. When I think of
post-MIDI note control, I am thinking of things like being able to draw
note vectors in a sequencer and bend each note around independently,
and likewise for parameters other than frequency. This also maps well
to controllers like the Continuum[1] or the Soundplane.

> By supporting these two MIDI commands you get the familiarity of MIDI with
> the addition of:
> * Fractional Pitch.
> * Per-note controllers.

Fair enough, I suppose that is more or less true (if you count sysex,
anyway), but not *quite*.
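
For the archive, that tuning message is roughly the following, as I
read the MIDI Tuning Standard; the builder itself is purely
illustrative.

    /* Build a real-time single-note tuning change (MIDI Tuning
     * Standard): retune `key` to a fractional MIDI note number.
     * `buf` needs at least 12 bytes; returns the message length. */
    #include <stdint.h>

    static int make_tuning_change(uint8_t* buf, uint8_t device_id,
                                  uint8_t program, uint8_t key,
                                  double semitones)
    {
        const uint8_t  coarse = (uint8_t)semitones;
        const uint16_t frac   =
            (uint16_t)((semitones - coarse) * 16384.0);
        const uint8_t  msg[12] = {
            0xF0, 0x7F, device_id,
            0x08, 0x02,    /* sub-IDs: MIDI tuning, note change */
            program, 1,    /* tuning program, one change follows */
            key, coarse,   /* key to retune, nearest note below  */
            (uint8_t)((frac >> 7) & 0x7F),  /* 14-bit fraction,  */
            (uint8_t)(frac & 0x7F),         /* MSB then LSB      */
            0xF7
        };
        for (int i = 0; i < 12; ++i)
            buf[i] = msg[i];
        return 12;
    }

Each fraction unit is 1/16384 of a semitone, so "fractional pitch" is
fair.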

> By binding MIDI to the plugin parameters as 32-bit 'float' via
> meta-data, you remove the need to support MIDI explicitly, you kill
> the dependency on MIDI's 7-bit resolution, and you remain open to
> extending the API in future to support OSC.

I agree entirely that using MIDI directly in plugins for parameters is a
terrible idea.
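
For anyone following along, the host's side of such a binding amounts
to something like this (the names are hypothetical):

    #include <stdint.h>

    /* One binding from the plugin's metadata: which CC drives which
     * float parameter, and over what range. */
    typedef struct {
        uint32_t param_id;
        uint8_t  cc;
        float    min, max;
    } MidiBinding;

    /* Host side: scale a 7-bit CC value onto the parameter's range.
     * Swapping the source for OSC or 14-bit CCs later only changes
     * this translation; the plugin itself never sees MIDI. */
    static float bind_cc_to_param(const MidiBinding* b, uint8_t cc_value)
    {
        return b->min + (cc_value / 127.0f) * (b->max - b->min);
    }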

> GMPI parameter events include an optional 'voice-number'; this
> extends the MIDI-binding system to note events and polyphonic
> aftertouch. I can build an entirely MIDI-free synthesiser, yet the
> metadata bindings make it fully MIDI-compatible.

I will look into your events to make sure I don't miss anything...
though I am strongly drawn to making the events themselves dictionaries
so that it doesn't matter if I miss anything... but that will be less
popular than a rigid static struct...
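
Concretely, the trade-off I mean is something like this (both layouts
are hypothetical):

    #include <stdint.h>

    /* A rigid static struct: fast, obvious, and frozen forever.
     * (`voice` is in the spirit of GMPI's optional voice-number.) */
    typedef struct {
        uint32_t frames;    /* time offset within the audio block */
        uint32_t param_id;
        int32_t  voice;     /* -1 = all voices */
        float    value;
    } ParamEvent;

    /* A dictionary-style event: a sequence of key/value pairs, so a
     * field I forgot today can be added tomorrow without breaking
     * anything, at the cost of lookups and a harder sell. */
    typedef struct {
        uint32_t key;       /* mapped ID for "frames", "voice", ... */
        uint32_t size;      /* size of the value bytes that follow */
        /* value bytes follow, then the next pair */
    } DictEntry;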

One could argue that you, and I, could and perhaps should just use OSC
for this, though. It is a bit hand-wavey to say you can support both,
or new protocols entirely, by virtue of having added yet another event
type. Really you're just making the host translate from X to Y, and
have added a new Y to the pot to deal with. No judgment really, because
I have done the same, and will likely do more of the same, but that
sales pitch is questionable :)

OSC unfortunately sucks entirely for high-level control, by which I
mean stuff like loading waveforms, expressing complex state,
arbitrarily complex messages, and so on. It is also annoying to build
in real-time, since you must know how many arguments a message is going
to have before you can construct it at all, making realtime append-only
APIs impossible. My stuff is a lot more of a JSON-style thing, which
works much better. However, the case for using OSC for at least note
stuff, and perhaps even parameters, is a pretty good one. At least it's
kinda sorta an existing standard.
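
The realtime problem falls straight out of the wire format. A finished
message carrying an int32 and a float looks like this, with every field
padded to a 4-byte boundary:

    /* "/note/on" with an int32 (60) and a float (1.0), as raw
     * big-endian bytes: */
    static const unsigned char osc_msg[] = {
        '/','n','o','t','e','/','o','n', 0, 0, 0, 0,  /* address */
        ',','i','f', 0,           /* type tags precede arguments */
        0x00, 0x00, 0x00, 0x3C,   /* int32 argument: 60          */
        0x3F, 0x80, 0x00, 0x00    /* float32 argument: 1.0       */
    };

The tag string sits between the address and the arguments, so a
realtime writer must know every argument type before emitting a single
argument byte; you cannot write the address and then discover the
arguments as you go.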

Difficult decisions here, which is why LV2 doesn't have such events
yet. I am really loath to create something that isn't already
serialisable by the existing mechanism (the most important consequence
of which is that UIs and plugins can talk however they want), but one
can at least hypothetically expect other programs to send OSC. Perhaps
your "let the host deal with mapping to what the plugin spec wants"
solution is best after all, but I have to be extremely careful about
saying "oh well, the host can do all this fancy stuff", because in
reality that means host authors get pissed off at me and LV2 and
complain about it. Selling to yourself is much easier :)

(Support libraries are the solution to this, naturally, but I have
learned that people have a powerful aversion to using *anything*, no
matter how completely inoffensive a dependency it is. To address this,
LV2 will probably, reluctantly, grow an "SDK" that includes essentially
all my libraries soon. I remain completely convinced that this is
awful, that the spec should be more timeless and have no ABI issues
contained in it whatsoever, and I hate this idea with a passion. But
nobody gives a shit about specifications, or implementation
independence. The better part of a decade later there's still precisely
one implementation left alive: mine. Time to give up those ideals, I
suppose.)

> > This is precisely the kind of reason why monolithic non-
> > extensible specifications suck.
>
> GMPI is extensible too. For example, MS-Windows GUIs are provided as
> an extension (so the core spec can be adapted to other platforms), as
> is support for some SynthEdit-specific features that don't really
> belong in the core spec.

Kinda sorta. Better than most, I'll give you that :)

Cheers,

-dr

_______________________________________________
Linux-audio-dev mailing list
Linux-audio-dev@email-addr-hidden
http://lists.linuxaudio.org/listinfo/linux-audio-dev
