[linux-audio-dev] Re: [l Re: Plug-in API progress?


Subject: [linux-audio-dev] Re: [l Re: Plug-in API progress?
From: Benno Senoner (sbenno_AT_gardena.net)
Date: Wed Sep 22 1999 - 18:25:52 EDT


On Wed, 22 Sep 1999, David Olofson wrote:
>
> I don't know the details about TDM, but VST 2.0 uses 3 different methods.
>
> 1) The simple host UI: Parameter names, values and unit are strings that the
> host can request and display any way it prefers.
>
> 2) Native GUI built into the plug-in. (100% Platform dependent...)
>
> 3) The new VST GUI; a "powerful" cross platform toolkit. Used for building GUIs
> within the plug-in, but platform independent as opposed to 2).

We should allow 2) and 3) too.
Like Quasimodo does, we should provide a nice description language
for the most common audio GUI elements (faders, knobs, buttons, labels
etc.); maybe Paul's proposed XML-style description language would be nice.
We should provide a way to combine internal GUI elements with
custom GUI code.
An example could be a mixer with a custom FFT display:
you could implement all the mixer elements (volume and pan faders)
using the standard description language, and code the FFT analyzer
using your favorite toolkit (Gtk, Qt etc.).
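Just to make the idea concrete, here is a sketch of what such a description
could look like. Every element and attribute name below is invented for
illustration, not an actual proposal:

```xml
<!-- Hypothetical panel description: a small mixer strip whose faders
     come from the standard GUI elements, plus one custom widget that
     the host loads from a .so module. All names are made up. -->
<panel name="mixer-strip">
  <fader  param="volume" min="0.0"  max="1.0" label="Vol"/>
  <fader  param="pan"    min="-1.0" max="1.0" label="Pan"/>
  <button param="mute"   label="Mute"/>
  <custom module="fft_display.so" label="Spectrum"/>
</panel>
```

The host would render the standard elements itself and only dlopen() the
custom module for the FFT display.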

I would opt for the following approach:
run the entire description-language GUI part in one thread,
and the custom GUI part in a separate thread, by linking .so modules
and calling the objects' constructors/destructors to show/delete the GUI
elements.
That means the standard GUI code would hardly ever crash, since it's
interpreted/precompiled.
If an element of the custom GUI code crashes, then it will take down all other
custom GUI elements (assume you have the buggy FFT analyzer and a bug-free
custom scope module: if the FFT crashes, the scope will disappear
too), but as soon as the engine detects this, it could restart that
thread. This would result in a short flashing of some GUI elements on the
screen, but without real implications (as long as your FFT plugin doesn't
crash every second).

Another solution could be having a separate thread for each plugin GUI panel.
In that case, if one of the plugin panels crashes, the rest would be
unaffected.
But I strongly dislike the idea of having 30 threads running only for GUI
handling.

The ideal would be a way to handle several toolkits.
Since it would perhaps be very hard to mix two toolkits in one thread,
we could implement, for example, a GTK GUI handler and a Qt GUI
handler.
This would keep the number of running threads low while allowing maximum
flexibility.
Isn't that nice?

> > Just to float an idea: instead of having the GUI run by anything
> > related to the engine, put the parameters in shared memory (*). Then
>
> Two problems: Synchronization of the GUI and processing code, and network
> transparency. But other than that, it does give plug-in developers full control
> over their plug-in <-> GUI communication. What I *don't* like is that this kind
> of solution results in plug-ins needing dual interfaces (at least) - one for
> the GUI and one for automation.

Yes,

>
> With an event based system, you get automation virtually for free. Many
> plug-ins won't even need to use custom events for their processing <-> GUI
> communications, which means they can just send their "parameter change" events
> and let the engine record the communication if desired. Custom events could be
> split into two groups - "automation enabled" and "private".

Agreed, I like this idea.
For example, the volume slider (the GUI part) of a gain-control plugin just
installs an event-sending port and sends the events to the plugin's DSP code.
The DSP code in turn installs an event-sending port and connects it to the
event-receiver port of the volume slider.
That means if someone (a sequencer or an arbitrary plugin) changes the gain
value, the fader would move automagically to the correct position.
Or is this suboptimal?

Do you plan network transparency for high-bandwidth streams too, or only
for the event system?
Networked events make sense to me, like sending MIDI events over the network.
But sending 60 tracks over the net is a bit harder.
We should take an approach similar to X:
if we are forced to use the network, use it;
if there are better alternatives like shmem, use the latter.

>
> We could propose a GUI API to use for this, but 1) that would have to be
> implemented on all platforms we want to support, 2) it has to be powerful to
> gain any acceptance among serious developers and 3) there are already lots of
> "standard" toolkits... GTK+ is nice, but I would prefer something like
> wxWindows for portability reasons. But no matter what, it's probably close to
> impossible not ending up where VST is; forcing developers to either accept the
> limited VST GUI, or hack platform specific GUI code if they need more control.

Agreed, but keep a door open for multiple toolkits (see my proposed multiple
GUI handlers), since I wouldn't code in GTK after falling in love with Qt.
:-)

If you need a writer for the Qt-GUI handler .. I'm here ...
:-)

Benno.



This archive was generated by hypermail 2b28 : Fri Mar 10 2000 - 07:27:12 EST