Re: [linux-audio-dev] Re: Plug-in API progress?


Subject: Re: [linux-audio-dev] Re: Plug-in API progress?
From: Benno Senoner (sbenno_AT_gardena.net)
Date: Wed Sep 22 1999 - 19:11:03 EDT


On Wed, 22 Sep 1999, Paul Barton-Davis wrote:
> David writes:
>>
> Because, like the new VST GUI, Quasimodo provides its own GUI language
> that the plugins use to specify their interface. This currently
> provides for about a dozen or so visual elements along with a way to
> position them and control their ranges/values etc. However, I'm
> working right now, for example, on a dynamics module which does
> compression/expansion/limiting in a single unit (since they're really
> all the same thing). There is no way to do this without writing the
> whole widget in the native UI language (e.g. Gtk-- or Python), because
> the internal UI stuff is just too limited to handle it. The end result
> is a new UI element that can be used by other modules - it will be
> some kind of customized curve-drawing widget with range controls,
> etc. The same applies to my rather funky (and I think very cool :)
> envelope UI element which started out as a necessary item for the
> sampling oscillator module, and is now part of the UI element suite.
>
> So no, there is no fundamental limitation, more just a trade-off
> between always writing new visual elements in the native UI form, or
> using a more limited subset. It's pretty much like VST in that regard.

Agreed; IMHO the best approach is to take advantage of both:
if the module is available in the UI language, use it;
if not, write a custom UI, but only for the elements that need it.

>
> >2) Native GUI built into the plug-in. (100% Platform dependent...)
>
> How does this work ? What is handling GUI event callbacks ? Is the
> plugin a separate thread or task ? Or is it all running as part of the
> engine ?

I'd prefer to keep the two separate, or you will end up with
low-latency problems.
A GUI doesn't have to be realtime, so just run the GUI thread
at normal priority.
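To make that split concrete, here is a minimal sketch (names invented, POSIX threads assumed) of giving only the DSP thread a realtime scheduling class while the GUI thread keeps the default one:

```c
/* Sketch: prepare thread attributes so the DSP thread runs under
 * SCHED_FIFO while the GUI thread stays at default SCHED_OTHER.
 * make_realtime() and the priority value are illustrative. */
#include <pthread.h>
#include <sched.h>
#include <string.h>

static int make_realtime(pthread_attr_t *attr, int prio)
{
    struct sched_param param;
    memset(&param, 0, sizeof(param));
    param.sched_priority = prio;   /* SCHED_FIFO range is 1..99 on Linux */

    pthread_attr_init(attr);
    /* Don't inherit the creator's (normal) scheduling settings. */
    pthread_attr_setinheritsched(attr, PTHREAD_EXPLICIT_SCHED);
    pthread_attr_setschedpolicy(attr, SCHED_FIFO);
    return pthread_attr_setschedparam(attr, &param);
}
```

Creating the actual realtime thread with these attributes needs root (or, nowadays, the right rlimits); the GUI thread is simply created without them.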
>
> Network transparency is an illusory goal (for audio) in my
> opinion. However, even if you want to pursue a more limited form of
> it, note that by using X and simply forcing the plugin's cycles to run
> on the same host as the engine, you do have a certain polarity of
> network transparency.

Yeah, it would be nice to be able to walk around the studio
and fire up your mixer GUI on your preferred workstation,
while the audio engine still runs on the BIG-BOSS machine
( 4-way K7 @ 800 MHz :-) ).

But for low-latency distributed processing you are right:
the network latency is too high, which makes Quasimodo on a Beowulf
cluster not very flexible.
But distributing hard-disk audio tracks among multiple machines could
work well, since you don't need low latency there, just as in the
local-disk case.
IMHO it's a waste of resources to dedicate a second machine to
just recording/playing additional audio tracks.

>
> As for the sync problem - I don't believe there is a sync
> problem. This is precisely what Quasimodo does now: the UI runs
> in a different thread than the DSP emulation, and it fiddles randomly
> with the data used by the DSP; there is *no* synchronization between
> them at all. I have had no problems with this, and I cannot find a
> theoretical reason why there should be, because of the atomicity of
> the instructions to store and load floats and integers (which is what
> a parameter change fundamentally reduces to). So I think this is a
> non-issue too. The only sense in which I worry about this is if
> Quasimodo was ever ported to run on an on-card DSP, access to the
> parameter memory might not be so straightforward. But with the new AMD
> Athlon/K7 out in a month or two, on-card DSP's are not looking very
> attractive any more :)

Agreed, I can't wait for the K7,
but meanwhile I will get a dual Celeron @ 550 for my multithreaded DSP hacks.
Nice DSP power at 1.1 GHz combined?
:-)
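For reference, the unsynchronized parameter update Paul describes above could be sketched like this (illustrative names; C11 atomics are used here only to make the word-sized store/load explicit — the original just relies on it happening naturally for aligned 32-bit values):

```c
/* Sketch: UI thread stores a float parameter, DSP thread loads it.
 * No locks; correctness rests on atomic 32-bit word access. */
#include <stdatomic.h>
#include <stdint.h>
#include <string.h>

typedef struct {
    _Atomic uint32_t bits;   /* float payload stored as a raw 32-bit word */
} param_t;

/* UI thread: publish a new value. */
static void param_set(param_t *p, float v)
{
    uint32_t b;
    memcpy(&b, &v, sizeof b);                 /* reinterpret float as bits */
    atomic_store_explicit(&p->bits, b, memory_order_relaxed);
}

/* DSP thread: read whatever the most recent value is. */
static float param_get(param_t *p)
{
    uint32_t b = atomic_load_explicit(&p->bits, memory_order_relaxed);
    float v;
    memcpy(&v, &b, sizeof b);
    return v;
}
```

The DSP thread may see the old value for one cycle, but it never sees a torn (half-written) float — which is exactly Paul's argument for why no synchronization is needed.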

> >With an event based system, you get automation virtually for free. Many
> >plug-ins won't even need to use custom events for their processing <-> GUI
> >communications, which means they can just send their "parameter change" events
> >and let the engine record the communication if desired. Custom events could be
> >split into two groups - "automation enabled" and "private".
>
> yes, this is certainly a nice feature, but I am afraid that its too
> slow. if changing the values of a parameter involves any more than
> just a fairly low-cost function call (and particularly if it involves
> a context switch between the UI thread and the engine thread), I have
> some real concerns (based on experiences with Quasimodo) that you
> can't do it fast enough.
>
> Particularly not if you hook up the UI to an external controller/fader
> box, which I do all the time. I *sometimes* end up with a full MIDI
> wire (8 MIDI faders all ramped to zero and then back up again to a
> 7bit value). This can translate into about 1 thousand parameter
> changes per second, and this is supposed to be handled asynchronously
> with the engine.

IMHO the GUI thread should queue up incoming events (parameter changes),
and if the event stream gets too dense, just thin out the stream or
evaluate only the most recent events.
It makes no sense to me to send 1000 parameter changes/sec to a GUI fader
when the graphics card's refresh rate is only 70 Hz or so.
I know that you can't ignore all events, like volume fader changes,
but with a smart system you can keep the GUI snappy even under
dense event streams.
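A minimal sketch of that thinning idea (all names invented): keep only the latest value per parameter, and let the GUI pick values up at its own refresh rate instead of processing every event.

```c
/* Sketch: coalesce dense parameter-change streams. Every incoming
 * event overwrites the previous value for its parameter; the GUI
 * drains the table once per repaint (e.g. at 70 Hz). */
#include <stdbool.h>

#define N_PARAMS 64

typedef struct {
    float latest[N_PARAMS];  /* most recent value per parameter */
    bool  dirty[N_PARAMS];   /* does this parameter need a repaint? */
} coalesce_t;                /* zero-initialize before use */

/* Called for every incoming event, possibly 1000x/sec. */
static void coalesce_push(coalesce_t *c, int param, float value)
{
    c->latest[param] = value;   /* older values for this param are dropped */
    c->dirty[param]  = true;
}

/* Called once per GUI refresh; returns true (and the newest value)
 * only if the parameter changed since the last repaint. */
static bool coalesce_take(coalesce_t *c, int param, float *value)
{
    if (!c->dirty[param])
        return false;
    *value = c->latest[param];
    c->dirty[param] = false;
    return true;
}
```

With this, a full MIDI wire of fader ramps costs the GUI at most one repaint per parameter per frame, no matter how many events arrived in between.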

>
> You're not going to be able to handle this by making an IPC call from
> the UI thread to the engine, and if it's checking for parameter changes
> a thousand times a second in some shared memory queue, its going to be
> wasting a lot of cycles that would otherwise be used for DSP. I don't
> understand what kind of event system other than some kind of IPC that
> you might be envisaging ....

The DSP would of course run at higher priority than the GUI thread,
so the dense event streams would not have much impact on DSP performance,
but they would slow down other processes running at regular priorities.
(Again: thin out the events, or keep track only of the most recent ones.)

Just out of curiosity: what does Quasimodo actually do to display the
changes? Poll at regular intervals?

>
> I am currently investigating the XRecord extension for automation of
> Quasimodo. I don't see any need to reinvent the wheel here -
> automation of X interfaces is something that has been worked on for 10
> years or so. The X model is fairly nice because the server will buffer
> your events for you until you are ready to handle them, to a
> point. So, you can go through a huge burst of UI changes, and then
> pick up the event stream for automation recording once things calm
> down.

Interesting, but IMHO the GUI elements should be passive; that means
the audio engine's automation should drive the elements by sending events.

>
> Well, I think you should start by handing the plugin a connection to
> the X server and leave it at that. Any native GUI API worth anything
> can take such a connection and get started from there.

See my GUI toolkit-handler proposal; I would opt for that.

regards,
Benno.



This archive was generated by hypermail 2b28 : Fri Mar 10 2000 - 07:27:12 EST