Re: [linux-audio-dev] Audio routing issues for linux..


Subject: Re: [linux-audio-dev] Audio routing issues for linux..
From: Juan Linietsky (coding_AT_reduz.com.ar)
Date: Mon Jun 10 2002 - 22:49:22 EEST


On Mon, 10 Jun 2002 09:36:30 -0400
Paul Davis <pbd_AT_op.net> wrote:

> >Yeah, but what I mean is two things: first, that you should be able
> >to change transparently where the app is sending the data, without
> >the app noticing, and second, that such configuration shouldn't be
> >stored by the app but by an abstracted app/interface that handles
> >connections.
>
> You're asking for something that can't be done satisfactorily
> without an API change. You're resisting the particular API change
> that is required. Then you say:
>
> >1-Saves a _huge_ amount of time for programmers, since the only thing
> >they have to do is register audio in/out slots and then route the
> >external sources. How many times do we see programs use the same
> >code over and over again? (freeverb/chorus/flange/LADSPA
> >chains/equalizers/normalizers/mixers/VU bars/etc.) Well, this would
> >put an end to that, and audio programming becomes a lot easier.
>
> You've just defined half the rationale for JACK.
>
> >2-Saves enormous time for the user. Why capture/dump/edit if
> >your CPU can do everything at once? Just chain your favorite
> >programs! It even gives you the ability to build up your
> >own chains of modifiers from program to program.
>
> You've now defined the other half.
>
> >3-Encourages program interoperability, the good old Unix way. It's
> >easy; most programmers, and especially the new ones, don't care about
> >side libs such as JACK/aRts. They want to go straight to the
> >official API first.
>
> You're now continuing your apparent habit of partly recognizing the
> need for an API change and then carrying on as if you hadn't:
>
> >What I propose is:
> >-Audio routing/data sharing for ALL programs. For most, it should be
> >audio routing via API calls.
>
> This is an API change. OSS and ALSA contain no such calls.
>

I'm afraid I didn't make myself clear. I tried to explain this in
previous mails, but I think I'm failing so far.
I understand perfectly well what JACK is, but as I said before,
it's primarily meant for low-latency work.
So my proposal consists of two things.

1-The first one is to provide transparent audio routing using
_existing APIs_. This does work, since most apps already use
standard buffer sizes (100-200 ms of latency). (As funny as it sounds,
many VST/DXi plugins work at these rates using "Windows kernel
streaming", hehe :). This would be very useful for working with apps
that need neither low latencies such as 8-10 ms (like audio editors)
nor synchronous execution. It would also be great for post-processing
the output of any program to improve its sound (a movie player, an
audio player, a game, cool old audio programs, or ones that only
support native APIs, etc.). A rough sketch of how this could be done
follows below.
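
Just to sketch the idea: the same LD_PRELOAD trick that artsdsp and
esddsp already use could redirect a legacy OSS app's stream into a
routing daemon. The daemon and its socket path below are made up, and
a real shim would also have to intercept ioctl() to fake the OSS
format/rate negotiation, but the core of the trick is only this:

/*
 * dsp_shim.c -- intercept open("/dev/dsp") and hand the app a socket
 * to a (hypothetical) routing daemon instead of the real device.
 *
 * Build:  gcc -shared -fPIC -o dsp_shim.so dsp_shim.c -ldl
 * Use:    LD_PRELOAD=./dsp_shim.so some_old_oss_app
 */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <fcntl.h>
#include <stdarg.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>

int open(const char *path, int flags, ...)
{
    static int (*real_open)(const char *, int, ...);
    if (!real_open)
        real_open = dlsym(RTLD_NEXT, "open");

    if (strcmp(path, "/dev/dsp") == 0) {
        /* Redirect the app's PCM stream into the routing daemon. */
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);
        if (fd >= 0) {
            struct sockaddr_un addr;
            memset(&addr, 0, sizeof(addr));
            addr.sun_family = AF_UNIX;
            strncpy(addr.sun_path, "/tmp/audio-router.sock",
                    sizeof(addr.sun_path) - 1);
            if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) == 0)
                return fd;   /* the app now write()s into the router */
            close(fd);
        }
        /* Router not running: fall through to the real device. */
    }

    if (flags & O_CREAT) {
        va_list ap;
        va_start(ap, flags);
        int mode = va_arg(ap, int);
        va_end(ap);
        return real_open(path, flags, mode);
    }
    return real_open(path, flags);
}

At 100-200 ms buffers none of this needs to be realtime-safe, which is
exactly why it works for the normal-latency apps I'm talking about.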

2-I also DO acknowledge the need for a _new_ API for doing this at low
latency, and JACK works perfectly for that. I've never said it doesn't.
When I said that JACK should become part of alsa-lib, I meant that JACK
could sit at a lower layer than it currently does (maybe at driver
level or something?) so it could capture and automatically "jackify"
the data coming from existing apps that use the native API (ALSA/OSS
emulation). The stream doesn't need to go "low latency" (and from what
I know, JACK should handle normal-latency apps fine), but you could
still route existing apps and share the device. Once a legacy stream is
exposed that way, it looks like any other client on the graph (see the
little pass-through sketch below).
How realistic do you think this approach is?
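
Roughly, the "jackified" end of such a stream would just be a pair of
ports like in the minimal client below (the client and port names are
made up for illustration); an equalizer, a recorder, or whatever could
then be patched in between without the original app ever knowing:

/*
 * passthru.c -- a minimal JACK client of the "register in/out slots,
 * route externally" kind.
 *
 * Build:  gcc -o passthru passthru.c -ljack
 */
#include <jack/jack.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

static jack_port_t *in_port, *out_port;

/* JACK calls this once per period; here we just copy input to output. */
static int process(jack_nframes_t nframes, void *arg)
{
    jack_default_audio_sample_t *in  = jack_port_get_buffer(in_port,  nframes);
    jack_default_audio_sample_t *out = jack_port_get_buffer(out_port, nframes);
    (void)arg;
    memcpy(out, in, nframes * sizeof(jack_default_audio_sample_t));
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_new("passthru");
    if (!client) {
        fprintf(stderr, "cannot connect to a JACK server\n");
        return 1;
    }

    jack_set_process_callback(client, process, NULL);
    in_port  = jack_port_register(client, "in",  JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsInput,  0);
    out_port = jack_port_register(client, "out", JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsOutput, 0);

    if (jack_activate(client)) {
        fprintf(stderr, "cannot activate client\n");
        return 1;
    }

    /* The actual routing (what feeds "in", where "out" goes) is decided
     * outside the program, e.g. with jack_connect or a patchbay. */
    for (;;)
        pause();
    return 0;
}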

This is what I first proposed, and I think it's a key factor.

Juan Linietsky


