Re: my take on the "virtual studio" (monolith vs plugins) ... was Re: [linux-audio-dev] ardour, LADSPA, a marriage


Subject: Re: my take on the "virtual studio" (monolith vs plugins) ... was Re: [linux-audio-dev] ardour, LADSPA, a marriage
From: Stefan Westerfeld (stefan_AT_space.twc.de)
Date: Fri Nov 17 2000 - 19:15:05 EET


   Hi!

On Fri, Nov 17, 2000 at 09:54:06AM -0500, Paul Barton-Davis wrote:
> >Anyway if we can develop such a model where processing / mixing / routing
> >can be delegated to the virtual studio API, then
> >HDR apps become nothing more than a disk streaming engine, a GUI
> >and a bunch of plugins which do the processing.
>
> seriously underestimate what's needed to give the user a good
> experience. if you want a specific example: how will you implement
> "rewind" and how will the user see that it's happening ? generalize
> that for any tape motion at all ? how will you implement monitoring ?
> how about metering ? there are answers to all of these questions
> (obviously), but if the results are to be nicely integrated into HDR
> software, things are *very* much more complex than the example above
> shows.
>
> I am not thinking very clearly about this, and I'm only superficially
> interested in this (once again) abstract discussion. I say that
> because I'm deep in the middle of actual writing code that implements
> what you are talking about. The lack of a routing architecture in
> ardour is emerging as more and more of a problem, and I'm about to fix
> it once and for all.

Okay, I am also not very interested in an abstract discussion of what
should, could, or might be. Those take too long ;-)

But I'd like to throw my vision of "virtual studios" under Linux into the
debate, and describe which parts are already implemented in the aRts/MCOP
technology.

What is missing in LADSPA, besides network transparency and IPC, is the
ability to talk to components in more complex ways than just streaming data
around. The HDR issue illustrates this: you want to abstract the HDR engine
from the application that is using it. So an HDR engine should be a component,
just like an equalizer or a freeverb effect.
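To make that contrast concrete, here is a plain C++ sketch (all names are
hypothetical, this is neither real LADSPA nor real aRts code) of the
difference between a stream-only plugin and a component that also exposes
control methods -- a pure streaming API simply has no place to put seek()
or startPlay():

```cpp
#include <cassert>
#include <cstddef>

// A LADSPA-style plugin: the host can only stream audio through it.
struct StreamPlugin {
    virtual ~StreamPlugin() {}
    virtual void run(const float* in, float* out, std::size_t frames) = 0;
};

// An HDR engine seen as a component needs control methods on top of the
// audio stream; these have nothing to do with the stream data itself.
struct HDREngineComponent : StreamPlugin {
    virtual void startPlay() = 0;
    virtual void seek(long seconds) = 0;
};

// Minimal mock so the sketch can actually be exercised: it just records
// the transport state and outputs silence.
struct MockHDR : HDREngineComponent {
    long pos = 0;
    bool playing = false;
    void run(const float*, float* out, std::size_t frames) override {
        for (std::size_t i = 0; i < frames; ++i) out[i] = 0.0f;
    }
    void startPlay() override { playing = true; }
    void seek(long seconds) override { pos = seconds; }
};
```

The point is only structural: the host talks to the upper interface through
streams, and to the lower one through ordinary method calls.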

But, you say, we want to control the HDR engine in more complex ways than
just connecting a stream to it. So here is an MCOP IDL interface that could
serve as the standard:

struct TimeStamp {
        long sec;
        long usec;
};

interface HDRChannel;

interface HDREngine {
        /**
         * adds a channel to the HDREngine
         */
        HDRChannel addChannel();

        /**
         * removes a channel from the HDREngine
         */
        void removeChannel(HDRChannel channel);

        /**
         * starts recording
         */
        void startRecord();

        /**
         * starts playing
         */
        void startPlay();

        /**
         * moves to a specific position
         */
        void seek(TimeStamp time);

        [...]
};

interface HDRChannel : Arts::SynthModule {
        /**
         * gets volumes in a certain area
         */
        sequence<long> getVolume(long startseconds, long endseconds);

        /**
         * LADSPA-like streams to connect to the flow system
         */
        out audio stream output;
        in audio stream input;
};

and so on. What you see is that interfaces give you flexibility similar to
C++ coding: you can define the data types and methods you need to specify
an interface for a complex component (like a hard disk recording engine).

It also automatically gives you network transparency, so that the HDR
process can be controlled from somewhere else, and it provides a streaming
network for interconnecting modules. In fact, using such a component is as
easy as using a C++ class (you just call methods), although it may run in
another process, on another computer, or wherever.
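To illustrate what "as easy as using a C++ class" would look like from the
client side, here is a hypothetical in-process mock of the HDREngine and
HDRChannel interfaces from the IDL above -- plain C++, no MCOP, purely to
show that the client only ever calls methods (MCOP's job would be to make
the same calls work across process or network boundaries):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Mirrors the IDL's TimeStamp struct.
struct TimeStamp { long sec; long usec; };

class HDRChannel {
public:
    // Pretend volume data: one value per second of recorded material.
    std::vector<long> volumes;

    // Mirrors getVolume(startseconds, endseconds) from the IDL:
    // returns the volume values in the half-open range [startsec, endsec).
    std::vector<long> getVolume(long startsec, long endsec) const {
        std::vector<long> out;
        for (long s = startsec; s < endsec && s < (long)volumes.size(); ++s)
            out.push_back(volumes[s]);
        return out;
    }
};

class HDREngine {
public:
    std::vector<HDRChannel> channels;
    TimeStamp position {0, 0};

    // Mirrors addChannel() / seek(TimeStamp) from the IDL.
    HDRChannel& addChannel() {
        channels.push_back(HDRChannel());
        return channels.back();
    }
    void seek(TimeStamp t) { position = t; }
};
```

A client then just writes e.g. engine.seek({30, 0}) or ch.getVolume(1, 4),
exactly as if the engine lived in the same process.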

Well, I'd really like to see such an abstract HDR interface, and a concrete
(maybe ardour-based) implementation in aRts sometime soon. I know that aRts-
based sequencers (like Brahms) would gain a lot from having hard disk
recording. And doing it that way, both the sequencer and the HDR engine
become interchangeable and less complex.

I would really recommend that anybody who thinks LADSPA is not sufficient
for object interaction and for making "virtual studios" fly have a look
at the aRts documentation. Especially useful is the chapter in the KDE2
development book

  http://webs1152.im1.net/kde20devel/Files/PDF/chapter14.pdf

and of course the normal developer docs

  http://space.twc.de/~stefan/kde/arts-mcop-doc

And as you can see in KDE2, all of this aRts/MCOP component technology is
pretty stable and pretty well tested; it probably runs inside artsd on a
few million desktops by now.

  Cu... Stefan

-- 
  -* Stefan Westerfeld, stefan_AT_space.twc.de (PGP!), Hamburg/Germany
     KDE Developer, project infos at http://space.twc.de/~stefan/kde *-         



This archive was generated by hypermail 2b28 : Fri Nov 17 2000 - 20:00:11 EET