Re: [linux-audio-dev] Re: Plug-in API progress?


Subject: Re: [linux-audio-dev] Re: Plug-in API progress?
From: David Olofson (audiality_AT_swipnet.se)
Date: Wed Sep 22 1999 - 18:28:28 EDT


On Wed, 22 Sep 1999, Paul Barton-Davis wrote:
> [ ... VST ... ]
>
> >2) Native GUI built into the plug-in. (100% Platform dependent...)
>
> How does this work? What is handling GUI event callbacks? Is the
> plugin a separate thread or task? Or is it all running as part of the
> engine?

Platform dependent. The "average" description (rough sketch in code below):

1) Somehow, you get a window handle. (You may have to create the window
yourself.)

2) Event handling is done the normal native GUI way. (AFAIK, the plug-in is
expected to handle GUI/processing synchronization itself; there's no explicit
API for that.)

3) All plug-in GUIs seem to run in the host application's GUI thread, but it's
possible to implement that in other ways. (*Everything* may actually run in the
same thread on non-real-time hosts...)
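
Something like this, roughly. All names below are made up for illustration;
they are not taken from VST or any other real API:

#include <stddef.h>

typedef struct plugin plugin_t;

struct plugin {
    /* 1) The host passes a native handle (HWND, X11 Window, ...);
     *    the plug-in builds its controls inside it, or creates its
     *    own window if it has to. */
    int  (*editor_open)(plugin_t *p, void *native_window);
    void (*editor_close)(plugin_t *p);

    /* 2) Called from the host's GUI thread; the plug-in handles
     *    events the normal native way, and is itself responsible
     *    for syncing the GUI with its processing code. */
    void (*editor_idle)(plugin_t *p);

    void *gui_data;   /* plug-in private GUI state */
    void *dsp_data;   /* plug-in private DSP state */
};

/* 3) The host ticks every editor from its one GUI thread, which is
 *    why all plug-in GUIs end up sharing that thread. */
static void host_gui_tick(plugin_t **plugins, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++)
        if (plugins[i]->editor_idle)
            plugins[i]->editor_idle(plugins[i]);
}

The point is that the host only provides the window and the GUI thread;
everything else is the plug-in's own business.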

> >Two problems: Synchronization of the GUI and processing code, and
> >network transparency.
>
> Network transparency is an illusory goal (for audio) in my
> opinion. However, even if you want to pursue a more limited form of
> it, note that by using X and simply forcing the plugin's cycles to run
> on the same host as the engine, you do have a certain degree of
> network transparency.

What I have in mind here is using one machine to run the client application
and another machine, or possibly a cluster, to do the processing. Distributing
GUI code all over the place gets a bit too messy for my taste... And it
requires the GUI code to actually work on the same system as the DSP code -
while you most probably don't want to load Qt, GTK+ or whatever on every node
of a Beowulf. And what if you decide to build a cluster of Alpha or PPC boxes?

> As for the sync problem - I don't believe there is a sync
> problem. This is precisely what Quasimodo does now: the UI runs
> in a different thread than the DSP emulation, and it fiddles randomly
> with the data used by the DSP; there is *no* synchronization between
> them at all. I have had no problems with this, and I cannot find a
> theoretical reason why there should be, because of the atomicity of
> the instructions to store and load floats and integers (which is what
> a parameter change fundamentally reduces to). So I think this is a
> non-issue too.

Not if you plan to support more complex data types than integers and floats -
which is the case with the new plug-in API...

And what about timing? How do you handle sample accuracy without going to
single-sample buffers? Events handle that in a clean and low-cost way.
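
To give an idea of what I mean, here's a sketch of a timestamped event; the
field names are invented for the example, not final API:

typedef struct event event_t;

struct event {
    event_t       *next;    /* events form a time-ordered list per port */
    unsigned long  frame;   /* when: offset into the current buffer */
    unsigned short type;    /* what: parameter change, custom, ... */
    unsigned short target;  /* which control/port the event addresses */
    union {
        float f;            /* simple parameter value... */
        long  i;
        void *data;         /* ...or a pointer for complex data types */
    } arg;
};

The frame field is an offset in sample frames into the current buffer, so
sample accuracy costs next to nothing, regardless of buffer size.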

> The only sense in which I worry about this is if
> Quasimodo was ever ported to run on an on-card DSP, access to the
> parameter memory might not be so straightforward. But with the new AMD
> Athlon/K7 out in a month or two, on-card DSP's are not looking very
> attractive any more :)

Agreed. I hope things will keep evolving in this direction, so we can all
concentrate on getting the code to do what we want, rather than fiddling with
lots of different little processors everywhere. :-) (I got pretty tired of that
kind of programming on the Amiga. Fun for a while, but then you start to
realize that a faster CPU is always more flexible. At least with an RTOS...)

> > But other than that, it does give plug-in
> >developers full control over their plug-in <-> GUI
> >communication. What I *don't* like is that this kind of solution
> >results in plug-ins needing dual interfaces (at least) - one for the
> >GUI and one for automation.
>
> >With an event based system, you get automation virtually for free. Many
> >plug-ins won't even need to use custom events for their processing <-> GUI
> >communications, which means they can just send their "parameter change" events
> >and let the engine record the communication if desired. Custom events could be
> >split into two groups - "automation enabled" and "private".
>
> yes, this is certainly a nice feature, but I am afraid that it's too
> slow. if changing the value of a parameter involves any more than
> just a fairly low-cost function call (and particularly if it involves
> a context switch between the UI thread and the engine thread), I have
> some real concerns (based on experiences with Quasimodo) that you
> can't do it fast enough.

It doesn't even involve a function call per event. Events are written to Event
Ports using inline code, and dynamic memory allocation is done from heaps with
limited lifetime, also using inline code. The only context switches/sync
points are the global ones for client/engine communication.
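
Roughly like this, continuing the event_t sketch above (again, all names are
mine for illustration, not final API):

typedef struct heap {
    char *pos, *end;        /* bump allocator over a per-block arena */
} heap_t;

typedef struct event_port {
    event_t *first, *last;  /* time-ordered singly linked list */
} event_port_t;

/* Grab event memory: a pointer bump, no malloc(), no locking. The
 * whole arena is recycled wholesale once the block has been played. */
static inline event_t *event_alloc(heap_t *h)
{
    event_t *e = (event_t *)h->pos;
    if (h->pos + sizeof(event_t) > h->end)
        return 0;           /* heap exhausted; caller must cope */
    h->pos += sizeof(event_t);
    return e;
}

/* Post an event to a port; just appended here, assuming the sender
 * generates events in time order. */
static inline void event_send(event_port_t *p, event_t *e)
{
    e->next = 0;
    if (p->last)
        p->last->next = e;
    else
        p->first = e;
    p->last = e;
}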

> Particularly not if you hook up the UI to an external controller/fader
> box, which I do all the time. I *sometimes* end up with a full MIDI
> wire (8 MIDI faders all ramped to zero and then back up again to a
> 7bit value). This can translate into about 1 thousand parameter
> changes per second, and this is supposed to be handled asynchronously
> with the engine.

Shouldn't be a problem. (A full MIDI wire is 31250 bits/s - ten bits per byte
on the wire, three bytes per controller message - which works out to roughly
the thousand messages per second you mention, so that's the ceiling anyway.)
Also note that the events can be handled with sample accuracy no matter what
buffer size you use. If the MIDI interface plug-in can extract exact timing
info from the MIDI data, that can be used to timestamp the events, so that
plug-ins may use the information if desired.
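
For completeness, here's how a plug-in might honour those timestamps (types
as in the sketches above; run_chunk() and apply_event() are stand-ins for the
plug-in's real DSP and control code, and events are assumed time-ordered
within the buffer):

extern void run_chunk(plugin_t *p, float *out, unsigned long frames);
extern void apply_event(plugin_t *p, const event_t *e);

static void plugin_run(plugin_t *p, event_port_t *in,
                       float *out, unsigned long frames)
{
    unsigned long pos = 0;
    event_t *e = in->first;

    while (pos < frames) {
        /* Render up to the next event, or to the end of the buffer. */
        unsigned long until = (e && e->frame < frames) ? e->frame : frames;
        if (until > pos)
            run_chunk(p, out + pos, until - pos);
        pos = until;

        /* Apply every event that falls on this frame. */
        while (e && e->frame <= pos) {
            apply_event(p, e);
            e = e->next;
        }
    }
}

Sample accuracy falls out of this automatically, whatever the host's buffer
size happens to be.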

> You're not going to be able to handle this by making an IPC call from
> the UI thread to the engine, and if it's checking for parameter changes
> a thousand times a second in some shared memory queue, it's going to be
> wasting a lot of cycles that would otherwise be used for DSP. I don't
> understand what kind of event system other than some kind of IPC that
> you might be envisaging ....

I'll try to get a more detailed spec together this week. Basically, it's about
writing events to buffers that are handled pretty much like audio buffers, and
it certainly doesn't involve any IPC calls at all in itself - one sync point
for the whole client/engine interface, as I said.

(That is, the word "Event" might be a bit misleading, considering the
underlying infrastructure...)
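
A sketch of that one sync point, with pthreads used purely for illustration
(this is not the actual implementation): once per block, client and engine
swap complete event lists under a single short lock, and all event writing
and reading in between is plain inline code on private buffers.

#include <pthread.h>

typedef struct exchange {
    pthread_mutex_t lock;
    event_t *to_engine;   /* queued by the client since the last swap */
    event_t *to_client;   /* queued by the engine since the last swap */
} exchange_t;

/* Engine side; the client does the mirror image of this. (A real
 * implementation would append rather than overwrite to_client.) */
static void engine_sync(exchange_t *x, event_t **in, event_t *out)
{
    pthread_mutex_lock(&x->lock);
    *in = x->to_engine;       /* take all pending client events... */
    x->to_engine = 0;
    x->to_client = out;       /* ...and hand over the engine's */
    pthread_mutex_unlock(&x->lock);
}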

> I am currently investigating the XRecord extension for automation of
> Quasimodo. I don't see any need to reinvent the wheel here -
> automation of X interfaces is something that has been worked on for 10
> years or so. The X model is fairly nice because the server will buffer
> your events for you until you are ready to handle them, to a
> point. So, you can go through a huge burst of UI changes, and then
> pick up the event stream for automation recording once things calm
> down.

Well, that actually reminds me a lot of my event system... (Although I hardly
think the X event system would be the right thing for anything lower-level
than the GUI.)

[...GUI API...]
> Well, I think you should start by handing the plugin a connection to
> the X server and leave it at that. Any native GUI API worth anything
> can take such a connection and get started from there.

Yes, I think so too. Most Open Source developers will probably use GTK+ or Qt,
but I'd guess most proprietary plug-ins would use other toolkits, and/or do a
lot of work directly on X. It's a strange world...

My point is: we'd probably just be wasting time designing something new, or
building it around some existing toolkit; there's just too big a risk it won't
ever be used. Plain X should do.
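
For example, the whole GUI entry point could be about as simple as this -
plain Xlib, with the entry point name made up:

#include <X11/Xlib.h>

Window plugin_open_editor(Display *dpy, Window parent)
{
    int screen = DefaultScreen(dpy);
    Window w = XCreateSimpleWindow(dpy, parent, 0, 0, 400, 200, 0,
                                   BlackPixel(dpy, screen),
                                   WhitePixel(dpy, screen));
    /* The plug-in picks the events it wants and handles them with
     * whatever toolkit (or bare Xlib code) it prefers. */
    XSelectInput(dpy, w, ExposureMask | ButtonPressMask);
    XMapWindow(dpy, w);
    XFlush(dpy);
    return w;
}

GTK+, Qt or anything else can be layered on top of that connection by the
plug-in itself; the host doesn't need to know or care.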

(BTW, I wonder if developers are actually using the VST 2.0 GUI stuff for
release plug-ins...)

//David

 ·A·U·D·I·A·L·I·T·Y·   P r o f e s s i o n a l   L i n u x   A u d i o
- - ------------------------------------------------------------- - -
    ·Rock Solid        David Olofson:
    ·Low Latency       www.angelfire.com/or/audiality     ·Audio Hacker
    ·Plug-Ins          audiality_AT_swipnet.se            ·Linux Advocate
    ·Open Source                                          ·Singer/Composer

