Subject: Re: [linux-audio-dev] No IPC in LADSPA?!
From: Paul Barton-Davis (pbd_AT_Op.Net)
Date: Wed Mar 29 2000 - 04:26:16 EEST
In message <38E0F7CB.D92E4B01_AT_av.com> you write:
>Paul Barton-Davis wrote:
>>
>> The problem with "delivering individual events as fast as it
>> can" is demonstrated rather nicely in Quasimodo. Because of the way it
>> implements "patches" between modules, external events (GUI stuff, MIDI
>> data, etc.) that alter parameter values happen *too* fast.
>
>This is a good point, but I don't have any data on the response time
>of my external synths (I suppose I could write a little test program).
>During the realtime takes, I want messages to go through the
>flowgraph as fast as possible, but during playback I do want the delay
>to be taken into account. However, I'm already used to fiddling with
>offsets manually to achieve the latter (I'm sure most other folks are
>too).
No, this is not related to "response time". That's a totally separate
issue, dictated mostly by the amount of audio buffering done before
the output h/w (either a D/A converter or a direct digital out), plus
a minor contribution from any MIDI latencies caused by h/w or drivers.
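
To put a rough number on that (a back-of-the-envelope sketch only; the
figures below are made up, not measurements of any real h/w or driver):
with N buffers of F frames each at sample rate R, the buffering alone
contributes a worst-case output delay of about N*F/R seconds.

/* Hypothetical illustration of buffering latency; not code from
 * any real driver or API. */
#include <stdio.h>

static double buffering_latency_secs(unsigned int nbuffers,
                                     unsigned int frames_per_buffer,
                                     unsigned int sample_rate)
{
        return (double) nbuffers * frames_per_buffer / sample_rate;
}

int main(void)
{
        /* e.g. 2 buffers of 256 frames at 44100 Hz: about 11.6 ms */
        printf("%.1f ms\n",
               buffering_latency_secs(2, 256, 44100) * 1000.0);
        return 0;
}

MIDI transmission and driver overhead add a little on top of that, but
the buffering term dominates.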
The effect I was describing is a design "error" that allows parameter
changes to take effect in a way that is not synchronized with audio
synthesis. It allows a change to take place in the middle of the
engine's cycle over all the current plugins/opcodes/ugens/flavor-of-the-week.
This is incorrect from an audio point of view - parameter changes should
happen only at the beginning (or end - equivalent for the most part)
of this cycle. The result can be bizarre artifacts in the audio output.
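
To make that concrete, here is a minimal sketch of what a host has to
do to get it right (all names here are hypothetical; this is not
Quasimodo's or LADSPA's actual code): external control events get
queued, and the queue is drained only at the top of each engine cycle,
so no parameter changes underneath a plugin in mid-block.

/* Block-synchronous parameter updates (sketch only).  A real host
 * would also need a lock-free queue or a mutex between the control
 * thread and the audio thread; that is omitted here for brevity. */

#define MAX_PENDING 256

struct param_change {
        int   param;            /* which parameter */
        float value;            /* new value */
};

static struct param_change pending[MAX_PENDING];
static int npending = 0;

/* Called from the GUI/MIDI side: just queue the change. */
void request_param_change(int param, float value)
{
        if (npending < MAX_PENDING) {
                pending[npending].param = param;
                pending[npending].value = value;
                npending++;
        }
}

/* Called by the engine at the *start* of each cycle, before any
 * plugin runs, so every plugin sees one consistent set of values
 * for the whole block. */
void apply_pending_changes(float *params)
{
        int i;
        for (i = 0; i < npending; i++)
                params[pending[i].param] = pending[i].value;
        npending = 0;
}

/* Engine cycle, in outline:
 *     apply_pending_changes(params);
 *     then run every plugin/opcode/ugen for block_size frames.
 */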
>> For point-to-point communication, if the data flow rate is not an
>> issue (i.e. it's not audio or video), then a Unix pipe will probably
>> work just fine. If you need to multiplex, then use shared memory
>> queues. In short, use the existing IPC mechanisms, since what you
>> want is in fact *IPC*, not a plugin API.
>
>Thoroughly agreed, but what happens when I want to talk to a program
>designed around LADSPA or MuCoS? Will either of those offer an easy
>(out of the box) way to talk to a pipe, socket, or shared memory
>queue? If they go out of their way to accommodate me, then I don't
>have to go out of my way to accommodate them; I just didn't expect
>to get off that easily. :-)
It depends what you want to say to them. If you want to relay audio
data, then a pipe or socket won't work very well. Both LADSPA and
MuCoS are, as Kai has put it in a very straightforward way, function
callback API's. They have nothing to say about communication between
threads, processes or hosts. The same is true of VST, TDM, MAS and all
the rest. Don't confuse LADSPA and its cousins with a generic
framework for writing audio applications. Its a way to write plugins:
pieces of code executed by a host application. The host may or may not
implement some other standard to allow it to inter-operate with other
applications on the same or other hosts, but whether or not it does so
has *nothing* to do with LADSPA.
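
For what it's worth, the entire "shape" of a callback-style plugin API
boils down to something like the following. This is a simplified
sketch, *not* the real ladspa.h; the point is only that the host owns
the buffers and the threads, and the plugin just supplies functions
for the host to call.

/* Illustrative only; names and signatures are invented for this
 * sketch and do not match any actual plugin standard. */

typedef struct {
        void  *(*instantiate)(unsigned long sample_rate);
        void   (*connect_port)(void *instance,
                               unsigned long port, float *buffer);
        void   (*run)(void *instance, unsigned long sample_count);
        void   (*cleanup)(void *instance);
} plugin_descriptor;

/* A host cycle then looks roughly like (inst, desc, IN_PORT, OUT_PORT
 * and block_size being the host's own, hypothetical, names):
 *
 *     desc->connect_port(inst, IN_PORT, input_buffer);
 *     desc->connect_port(inst, OUT_PORT, output_buffer);
 *     desc->run(inst, block_size);
 *
 * There is no IPC anywhere in this picture; if the host wants to talk
 * to other processes, it does that itself, outside the plugin API. */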
--p