Re: [linux-audio-dev] audio application mixing/routing arch


Subject: Re: [linux-audio-dev] audio application mixing/routing arch
From: David Olofson (david_AT_gardena.net)
Date: Fri Mar 31 2000 - 04:45:01 EEST


On Tue, 28 Mar 2000, David Slomin wrote:
> Paul Barton-Davis wrote:
> >
> > A) I doubt that you could run the above in real time with low enough
> > latency to make the synth playable from a MIDI controller. Although
> > modern processors are fast, there are limits :)
>
> That's not a good sign. At least for me, a soft-synth is utterly
> useless if it cannot be recorded while being played from a MIDI
> controller in realtime. Does this mean that every soft-synth and
> soft-sampler must double as a single-track HD recorder?

No, but it means that they need to break their processing out into
one or more plugins that can run in a suitable host, while being
remote controlled from their custom GUIs. When running stand-alone,
they may use their own hosts.

> "realtime"                  ,--> audio output
> MIDI ------> soft-synth ---<
> input                       `--> disc
>
> That's definitely a bare minimum though, unusable for more than
> the most trivial hobbyist work. A more realistic minimum would use
> a standalone multitrack audio editor with a decent interface (not
> a minimalistic HD recorder tacked onto the soft-synth).
>
> "realtime"
> MIDI ----> soft-synth --.                mix to
> input                    \  multitrack   ,--> audio output
>                           >---> audio --<
> existing                 /     editor    `--> new track
> audio  -----------------'                     to disc
> tracks
>
> I find it very hard to believe that this kind of setup would
> necessarily result in unusable amounts of latency. It's such
> a fundamental thing that without it, there wouldn't be any
> concept of using computers in a music studio.

It's possible, but you need to do it through cooperative
multitasking: either run everything as plugins in the same thread, or
run separate processes that use IPC-based "manual scheduling", so
that the applications themselves control the time sharing.

The alternative is HZ >= 1000 and a much improved lowlatency patch.
Or RTLinux... (BTW, RTAI, another real-time Linux similar to RTLinux,
supports real-time scheduling of user-space tasks; that is, protected
memory and still µs latencies. :-)

> Inserting effects into the flowgraph is something that I
> personally don't have a need for during the realtime recording
> stage (I can do it offline later), but other people on the list
> have asserted that they can't live without it. That gives us:
>
> "realtime"
> MIDI ----> soft-synth --.                     mix to
> input                    )               ,---> audio output
>           ,-- effects <--'              (
>          /    `--------------.           \
>          \                    \   more    \
>           \    multitrack      \ ,--> effects --'
>            >-----> audio -------<
> existing  /       editor        `--> new track
> audio ---'                           to disc
> tracks
>
> I agree that this is starting to look scary, but it is indeed
> a minimum requirement for many people. If the only way to do
> this on a current-generation processor is to have all of the
> pieces run in a single process, then that single process will
> no longer have the design problems of an application; it will
> be a miniature operating system.

Yes. That's why I prefer to view hosts as plugin schedulers rather
than as DSP engines with plugin support. This is also why I consider
plugin API/IPC transparency (i.e. using the same event system for
both) very important: it is what allows you to run plugins where they
should be executed, rather than where your GUI happens to be.

> It will need careful
> coordination not only at the level of flow routing, but also
> in presentation of the UI, not to mention code
> compartmentalization (so that different developers can work on
> different pieces). Designing this mini-OS is a task that I
> doubt anyone can do in a way that all users and developers
> will agree upon.

It's a big problem, but there's no way around it with current
hardware and any desktop OS.

> Thus I really hope that it is possible to run a nontrivial
> flowgraph in separate processes; otherwise we're not ever
> going to get anywhere.

It's either very deep kernel hacks, added latency, or remote-controlled
plugins. The last alternative is the only acceptable compromise,
IMHO.

//David

.- M u C o S --------------------------------. .- David Olofson ------.
|           A Free/Open Multimedia           | |     Audio Hacker     |
|      Plugin and Integration Standard       | |    Linux Advocate    |
`------------> http://www.linuxdj.com/mucos -' | Open Source Advocate |
.- A u d i a l i t y ------------------------. |        Singer        |
|  Rock Solid Low Latency Signal Processing  | |      Songwriter      |
`---> http://www.angelfire.com/or/audiality -' `-> david_AT_linuxdj.com -'



This archive was generated by hypermail 2b28 : Fri Mar 31 2000 - 07:36:01 EEST