Subject: [linux-audio-dev] Re: Quasimodo (Was: Re: LADSPA GUI)
From: Paul Barton-Davis (pbd_AT_Op.Net)
Date: Sun Mar 12 2000 - 21:16:43 EST
>> OK, so take the construction:
>>
>> if (variable) {
>> output = some_opcode (arg1, arg2, arg3);
>> } else {
>> output = some_other_opcode (arg3);
>> }
>>
>> and explain to me how to represent this in a "flat" format that is not
>> essentially just assembler, i.e.:
>
>i guess this has already been answered.
>namely, compute both branches and then
>set:
>
> output = variable*branch1output + (1-variable)*branch2output
yeah, i saw David's answer. But this has to be a joke! If some_opcode
just added some numbers, then maybe this approach is OK. But if
some_opcode computes a 2048 bin FFT of a signal, and some_other_opcode
does convolution with an IR, this is way beyond absurd.
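To make the cost concrete, here is a minimal C sketch of the two encodings. The opcode bodies and names are hypothetical stand-ins (not LADSPA or Quasimodo functions); the point is that the "flat" blend always evaluates both branches, while the conditional evaluates only one:

```c
/* Hypothetical opcode bodies, purely for illustration; these are
 * not real LADSPA or Quasimodo opcodes. */
static float some_opcode(float a, float b, float c)  { return a + b + c; }
static float some_other_opcode(float c)              { return c * 0.5f; }

/* The "flat" encoding: both branches are ALWAYS computed, and the
 * condition merely blends (selects) between their outputs. */
float blended(float variable, float a, float b, float c)
{
    float branch1 = some_opcode(a, b, c);   /* always runs */
    float branch2 = some_other_opcode(c);   /* always runs */
    return variable * branch1 + (1.0f - variable) * branch2;
}

/* The original construction: only the selected branch runs. */
float conditional(float variable, float a, float b, float c)
{
    return (variable != 0.0f) ? some_opcode(a, b, c)
                              : some_other_opcode(c);
}
```

When `variable` is 0 or 1 the two functions agree; the difference is purely in how much work gets done, which is exactly the objection when one branch is a 2048-bin FFT.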
>i think you misunderstood what i meant by flat. i'm not claiming the
>user shouldn't take advantage of scripting languages. i'm just
>saying that the scripting languages should all compile down to the
>same (flat = non-scripted) LADSPA netlist rather than being embedded
>in the plugins.
OK, that's fine. That's all that Quasimodo does when compiling. That
means that you have to have nodes in the netlist (or whatever we're
calling it today) to do conditional branches, etc.
It just so happens that Quasimodo allows elements of the top-level
netlist to themselves be hosts of other netlists. The top level part
doesn't know this; it just calls run() ("execute()" in quasimodo right
now) for each member of the top level. Not only that, it *should* not
know this.
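A sketch of what that opacity looks like in C (the type and function names here are invented for illustration, not the actual Quasimodo types): a composite node's state is itself a netlist, but the caller only ever sees run(), so it cannot tell a leaf from a host-of-hosts.

```c
#include <stddef.h>

/* Minimal netlist node: a run() callback plus opaque state.
 * Names are hypothetical, not from Quasimodo or LADSPA. */
typedef struct node {
    void (*run)(struct node *self, float *buf, size_t n);
    void *state;
} node_t;

/* A leaf node: one primitive operation (here, a gain stage). */
static void gain_run(node_t *self, float *buf, size_t n)
{
    float g = *(float *)self->state;
    for (size_t i = 0; i < n; i++)
        buf[i] *= g;
}

/* A composite node: its state is a whole sub-netlist.  The caller
 * just invokes run() and never learns that sub-nodes exist. */
typedef struct { node_t **members; size_t count; } netlist_t;

static void netlist_run(node_t *self, float *buf, size_t n)
{
    netlist_t *nl = self->state;
    for (size_t i = 0; i < nl->count; i++)
        nl->members[i]->run(nl->members[i], buf, n);
}

/* Usage: a composite of two gain stages (x2 then x3) driven through
 * the same run() interface as any leaf node. */
float demo(float sample)
{
    float g1 = 2.0f, g2 = 3.0f;
    node_t a = { gain_run, &g1 }, b = { gain_run, &g2 };
    node_t *members[] = { &a, &b };
    netlist_t nl = { members, 2 };
    node_t top = { netlist_run, &nl };
    float buf[1] = { sample };
    top.run(&top, buf, 1);
    return buf[0];
}
```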
Why not ? Imagine that I implement some soft synth as a VST
plugin. Further imagine that I decide I want to be able to add new
modular elements to the soft synth. I say to myself "I need a plugin
API". Why not just go ahead and use VST ? Now, the kinds of VST
plugins that the soft synth will expect to run will almost certainly
be different than the ones the main VST host will typically
handle. This is not enforced by the API, but by where/how each VST
host (the top-level host, e.g. Cubase, and the lower-level one,
e.g. the softsynth) goes about looking for and loading plugins. The softsynth
may, for example, look for its plugins only in a certain directory.
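That directory policy is easy to sketch with standard POSIX calls. The entry-point symbol name below (`plugin_descriptor`) is made up for illustration; the mechanism (scan a directory, dlopen() each candidate, look up a well-known symbol) is how such hosts typically work:

```c
#include <dlfcn.h>
#include <dirent.h>
#include <string.h>
#include <stdio.h>

/* Hypothetical descriptor-fetching entry point; the symbol name
 * is illustrative only, not a real API. */
typedef const void *(*descriptor_fn)(unsigned long index);

/* True if a directory entry looks like a shared-object plugin. */
int is_plugin_file(const char *name)
{
    size_t len = strlen(name);
    return len > 3 && strcmp(name + len - 3, ".so") == 0;
}

/* Scan one directory -- the "softsynth looks in its own directory"
 * policy -- and try to load each candidate plugin. */
int load_plugins_from(const char *dir)
{
    int loaded = 0;
    DIR *d = opendir(dir);
    if (!d)
        return 0;
    struct dirent *e;
    while ((e = readdir(d)) != NULL) {
        if (!is_plugin_file(e->d_name))
            continue;
        char path[1024];
        snprintf(path, sizeof path, "%s/%s", dir, e->d_name);
        void *handle = dlopen(path, RTLD_NOW);
        if (!handle)
            continue;
        descriptor_fn fn = (descriptor_fn)dlsym(handle, "plugin_descriptor");
        if (fn)
            loaded++;       /* keep the handle; it's a plugin */
        else
            dlclose(handle); /* some other .so; ignore it */
    }
    closedir(d);
    return loaded;
}
```

Two hosts using the same API but different search paths will naturally load disjoint plugin sets, which is all the "enforcement" needed.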
I don't see this nesting as being at all unreasonable, and in fact, it
wouldn't surprise me if it's already done. I think it's possible that
Pluggo for Cycling '74 does just this: it turns Cycling '74, which is
itself a VST plugin, into a VST host as well (simultaneously).
Your worries about duplication of code seem unlikely to be relevant to
me, because the actual plugins used will be different (though of
course, they need not be, since we have a standard API).
>so this technique is useful for porting
>existing plugins. however, my argument was
>that large, opaque plugins are not optimal
>from a host/plugin users standpoint ....
>
>instead they should be transparently composed
>of smaller, basic, opcode-sized plugins (think
>CISC vs. RISC). i think the composition is
>possible with nothing more than a netlist.
yes, i agree, but i think you've already agreed that the netlist is
not the natural way to express the composition for a user. so, either
they have to build *everything* with some sort of GUI builder (hello,
MAX!), or there is a script language that lets them (or some wizard)
pre-assemble things into more useful sized objects (hello Quasimodo,
SuperCollider, and many others).
the other problem with this is: what defines the set of smaller, basic
opcode-sized plugins ? more comments on this below.
>the advantage:
>
> 1. you don't have to carry around multiple
> copies of some large chunk of initialization
> code with every plugin.
True. But one alternative is what DeSPider and Cycling '74 do: the
plugin is the whole script interpreter thing, and it loads its own
plugins using some API. What we want to avoid is not this nesting, but
the use of some "internal API". We want everything that loads plugins
to use the same API, so that we can mix-n-match. So, the
script-based systems either have to do the host-within-plugin thing described
above, or their plugins (which are written in a language that makes
producing .so's tricky) have to be converted into something else.
> 2. you don't have to worry about running
> things like compilers or interpreters on
> every host (my hypothetical nord-lead
> clone)
i don't understand your point here. if i have a scripting language,
call it XXX, and I write something in XXX that does something useful,
and I want to use that thing in an audio app, what can I do ? I can
either make the audio app able to understand the script language, or I
can convert the script into some other form that the app can handle.
i wrote quasimodo mostly as a pulsar/nord lead clone, and i consider
that to do this job well (actually, better than either of those
systems), there has to be a script language for connecting the small
pieces (simple DSP opcodes) into useful programs.
> 3. the chance that i'm going to dive
> into the source code for some giant
> plugin written in C or a custom scripting
> language just to fiddle with it is less
> likely than if my desktop host lets me view
> it as a bunch of interconnected opcode-sized
> plugins which i already know how to use.
Fine, but what happens when you want to use a plugin which is itself
designed to use plugins arranged in an interesting fashion ?
MAX and the Kyma have an interesting take on this, whereby they
collapse sections of the netlist down to a "black box". But I am still
adamantly opposed to forcing people to build everything graphically -
it's slow, tedious, and not very powerful. The only alternative I can
see is some sort of language to build them, and then we're back to
square one, right ?
> 4. for the reason above, transparent,
> compositional plugins promote code reuse.
> if everyone writes giant, monolithic
> plugins rather than reusing more basic
> parts, they will spend a lot of time
> duplicating each other's efforts.
Absolutely. Hence our effort here.
> 5. transparency makes possible runtime
> optimization. if the three or four
> rack-sized plugins you're using are
> represented to the host as one big
> network of low-level operations, it
> can optimize away redundant computations.
> with large, opaque plugins, you could
> be computing the FFT of the same signal
> inside multiple plugins when you really
> should only be computing it once.
I think it's good for us to dream. But I don't know of any host of any
plugin API that could come close to this. It's an interesting CS
challenge, but it strikes me as far over the horizon. When I think
about the most complex instruments I have ever seen in Csound, I don't
think I have ever seen a single example where the existence of two
different complex instruments could be reduced to some common
operations. And that's with extremely low-level opcodes. The only
exceptions are what I would call "sensing opcodes", like the Csound
midi opcodes that return information about the MIDI environment. In
Csound, these are effectively collapsed into the host; the opcodes
themselves just know where to get the (cached) information from the
host.
Yes, I suppose that routing a signal through 2 FX plugins, both of
which needed to compute the FFT would give rise to the possibility of
doing this. But to *require* that plugins be composed of things
("opcodes" ?) drawn from a common set is to make a vastly more
restrictive API than we're talking about with LADSPA. It means that
any time something wants to compute an FFT, they put a (known) FFT
element in the right place in the netlist. This is really no more than
a variant on a system like Csound/Quasimodo/SuperCollider etc., rather
than a generic plugin API.
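For what that dedup would minimally require, here is a sketch (all names invented) of the common-subexpression idea over a netlist: if two nodes apply the same *known* operation to the same input, the host computes it once. This only works because the operation set is closed and identifiable, which is exactly the restriction being objected to:

```c
#include <string.h>

/* A node identified by a known operation name and its input. */
typedef struct { const char *op; int input_id; } node_key;

/* Two nodes are redundant when op and input match. */
int same_computation(node_key a, node_key b)
{
    return a.input_id == b.input_id && strcmp(a.op, b.op) == 0;
}

/* Count distinct computations among n nodes (O(n^2); fine for a sketch). */
int distinct_computations(const node_key *nodes, int n)
{
    int distinct = 0;
    for (int i = 0; i < n; i++) {
        int dup = 0;
        for (int j = 0; j < i; j++)
            if (same_computation(nodes[i], nodes[j]))
                dup = 1;
        if (!dup)
            distinct++;
    }
    return distinct;
}

/* Usage: two plugins both asking for the FFT of input 0, plus a
 * convolution of the same input -- only two computations needed. */
int demo(void)
{
    node_key nodes[] = { {"fft", 0}, {"fft", 0}, {"conv", 0} };
    return distinct_computations(nodes, 3);
}
```

With opaque plugins there is no `op` field to compare, so the host has nothing to merge.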
Obviously, I am all for such things, but I think that it's different
from the goals of LADSPA.
--p
This archive was generated by hypermail 2b28 : Mon Mar 13 2000 - 04:55:11 EST