Subject: RE: [linux-audio-dev] Small example of GUI in .so-files and a more thorough description of previous post
From: Lars Thomas Denstad (larsde_AT_telepost.com)
Date: Fri Mar 10 2000 - 19:09:28 EST
> There's no question that doing this with a given toolkit is easy -
> I've done this before, and sure, it works well.
>
> The challenge I want to solve is where the plugin cannot assume that
> the host runs any particular toolkit. The kind of thing I would love
> is something a bit like Glade's "abstract" specification of the GUI
> that gets turned into real GTK objects when loaded, but I want to do
> it without *writing* another "layer" over one or more toolkits.
Ok, that's basically what I suggested. GUI generation in the following
order: (sorry if I'm repeating myself)
1) Check if the plugin has a frontend assigned to it. The host determines
whether or not it currently supports the requested toolkit, based on
requirements grabbed from the plugin implementation.
2) If not (or if the user so desires), it checks for an XML description of
the UI.
3) If no description is specified, the host is free to try to present the
user with a UI that's based on plugin internals. (A rough sketch of this
order follows below.)
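To make the order concrete, here is a minimal host-side sketch of that
three-step fallback. The descriptor and all the names (struct plugin,
host_show_ui, and so on) are invented for illustration; none of this is an
existing API.

#include <stdio.h>

enum toolkit { TK_NONE, TK_GTK, TK_QT };

/* Assumed plugin descriptor, for illustration only. */
struct plugin {
    enum toolkit frontend_toolkit;        /* TK_NONE if no frontend */
    const char *xml_ui;                   /* NULL if no XML description */
    int (*show_native_ui)(enum toolkit);  /* plugin-supplied frontend */
};

/* Stubs standing in for the host's real UI builders. */
static int host_build_ui_from_xml(const char *xml)
{
    printf("building UI from XML description:\n%s\n", xml);
    return 0;
}

static int host_build_ui_from_parameters(const struct plugin *p)
{
    (void)p;
    printf("building a generic UI from the plugin's parameters\n");
    return 0;
}

int host_show_ui(struct plugin *p, enum toolkit host_tk, int prefer_generic)
{
    /* 1) The plugin ships a frontend and the host runs that toolkit. */
    if (!prefer_generic &&
        p->frontend_toolkit != TK_NONE &&
        p->frontend_toolkit == host_tk)
        return p->show_native_ui(host_tk);

    /* 2) Otherwise fall back to an XML description of the UI. */
    if (p->xml_ui)
        return host_build_ui_from_xml(p->xml_ui);

    /* 3) Last resort: generate a UI from the plugin's internals. */
    return host_build_ui_from_parameters(p);
}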
Talking about abstract specifications, to my knowledge Glade uses XML (just
like you), and extending the description to include widgets should be a
"simple task", at least from a language-definition point of view.
Personally, I would like to see something that populates a GNOME canvas (or
similar) based on a detailed XML description for most plugins. This is
obviously not a good way to do 3D, though.
If I understand you correctly, you're saying that you would like to fatten
the second layer by making the host application add widgets to the user
interface. I think this is a great idea.
[snip]
> I have created a program that, with the help of a GtkSocket, embeds a
> window of another application into it. In this program, I would like
> to be able to show a menu if the user clicks with some button and some
> modifiers somewhere in the embedded window. It is an easy task to set
> the event mask for that window, but the Gtk main-loop will happily
> discard that event, since it does not come from a widget and is not a
> property delete event. How do I work around this, or is it something
> I've overlooked?
[snip]
This looks like multiple processes to me; do you/we really want to start
with IPC just to give the plugin a graphical user interface? Am I focusing
on the wrong part of the post here? Making the plugin frontend implement its
own event collection and passing it through a socket interface (or pipes, or
whatever) seems like a rather strange approach to me.
One of the goals should probably be to have the plugin and the
plugin-frontend run in the same process-space?
> > Also, the requirements passed from the plugin should/could be
> > considered an OR-list; that is, the user should be able to specify
> > in the host application which toolkit he/she prefers. This leaves
> > you with the possibility of implementing for instance both a GTK+
> > and a Qt interface in the same code (and the host application need
> > not link to both the GTK+ and Qt libraries if you specify
> > RTLD_LAZY).
>
> I don't think this can work. Both of these toolkits assume that
> *their* inner loop is running for event collection and dispatch. That
> implies 1 thread per toolkit PLUS a clever low-level X event
> dispatcher that sends X events to the right place. That isn't easy,
> not easy at all :) Yes, someone has recently done a GTK wrapper for Xt
> widgets, but I suspect that if we told someone in either the GTK group
> or the Qt group that we wanted to write a program that used *both*
> toolkits, we'd be laughed at, really hard, deep down belly laughing :)
(I don't mind being laughed at. ;)
(Sorry if the post is biased towards GTK+ and Qt, they could be any
widgetset.)
What I meant is _not_ that the host application implements or links to
multiple widgetsets, even though there are possibly cases where this can be
done (especially if one takes the step to abstract multiple interfaces into
a single API).
What I meant is that _the plugin_ can implement calls to both GTK+ and Qt
and not worry about linking issues. (The plugin obviously does not implement
the inner loop for event collection and dispatching.)
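A minimal sketch of what I mean on the plugin side, assuming the host
passes a toolkit identifier into an agreed-upon entry point. The entry
point name and the enum are invented, and since Qt is C++, qt_show_window()
stands in for Qt code compiled into the same .so; it is an assumption, not
a real Qt function.

#include <gtk/gtk.h>

enum toolkit { TK_GTK = 1, TK_QT = 2 };

/* Stand-in for C++ Qt code living in the same .so. */
extern void qt_show_window(void);

int plugin_create_ui(int toolkit)
{
    if (toolkit == TK_GTK) {
        static char name[] = "plugin";
        char *argv_buf[] = { name, NULL };
        char **argv = argv_buf;
        int argc = 1;

        gtk_init(&argc, &argv);
        gtk_widget_show(gtk_window_new(GTK_WINDOW_TOPLEVEL));
    } else if (toolkit == TK_QT) {
        /* With lazy binding, the symbols behind this call are never
         * resolved unless this branch is actually taken. */
        qt_show_window();
    } else {
        return -1;  /* unsupported widget set */
    }
    return 0;
}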
You implement this by including the Qt header files and making calls to Qt
from the same .so file that also makes calls to GTK+. All you have to make
sure of is that the plugin knows which widget set it's supposed to be
calling before you call the initialization for that widget set. This works
because, with RTLD_LAZY, the Qt symbols are not resolved until the first
actual call is made (you know this of course).
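And a matching host-side sketch: the host opens the plugin with RTLD_LAZY,
looks up the entry point (the same invented "plugin_create_ui" convention
as above), and asks for its own widget set. The plugin's Qt references are
never resolved because that branch never runs. Link with -ldl.

#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    void *handle = dlopen("./plugin.so", RTLD_LAZY);
    int (*create_ui)(int);

    if (!handle) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }

    /* Look up the plugin's UI entry point. */
    create_ui = (int (*)(int))dlsym(handle, "plugin_create_ui");
    if (!create_ui) {
        fprintf(stderr, "dlsym: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }

    /* A GTK+ host asks for the GTK+ frontend (1 == TK_GTK). */
    create_ui(1);

    dlclose(handle);
    return 0;
}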
If what I mean is still unclear, point me to a good Qt primer, and I'll make
a second host and make my plugin a little fatter.
The approach will have the following benefits: a clean separation of UI and
algorithms, a uniform widget interface to plugins, and a single plugin file
for all plugin frontends, no matter which widget set is used.
The approach has the following drawbacks: you cannot use different widget
sets at the same time, and you have to use the same widget set as the host
application. If a plugin frontend only implements GTK+, Qt-based host
applications are free to fall back to the XML approach and/or just
displaying the parameters.
From my point of view it makes sense. I'm not really sure how many plugins
you really see needing to implement their own plugin frontends (you
obviously know a lot more than me about this, having done a lot of work on
Quasimodo), but looking at Buzz, for instance, which only uses the third
option, I would think XML will go a long way for most plugins.
Thanks again,
LT.