[linux-audio-dev] Audio routing issues for linux..


Subject: [linux-audio-dev] Audio routing issues for linux..
From: Juan Linietsky (coding_AT_reduz.com.ar)
Date: Mon Jun 10 2002 - 07:34:31 EEST


Here's a problem I commonly find in existing audio apps, and when
programming audio apps myself: audio routing.

The way things work now, it's hard for apps to implement a standard
way to:

-Route audio from one app to another
-Share audio devices/routes
-Apply audio modifiers (effects)

LADSPA is great for integrating into your programs, and very fast too,
but it's still not what I'm referring to..

For example, what would you think of the following audio setup?

I choose my favorite audio composing app, say MusE.
Now I choose my favorite softsynth, iiwusynth.
ALSA works great for this... now,
using a theoretical audio routing API, iiwusynth would
provide me with the following audio sources: a stereo channel (the global
mix), 16 more stereo channels (for the instruments), and 2 more channels
(effect send buffers). Now I could create an addition (mixing) object and
connect channels 1-5 of iiwusynth to it. I could also take one of the
effect send buffer channels, connect it to a reverb object, and connect
that to the final mix. I could also take channel 6 of iiwusynth, connect
it to a distortion object, and connect it back to the mix.

Now let's say that, since iiwusynth's performance isn't that great and
I'm using so many channels, I'm running out of CPU (don't kill me,
Josh/Peter ;). It seems I want to do a typical multichannel dump:
MusE would provide me with a plug to which I can connect all
the outputs of the network I built before. This way I don't need a
sound card that supports recording from its own output (and even if it
does, many do it through a DA/AD conversion, which loses some quality
or just adds noise/distortion).
Now let's say that I want to use Saturno besides iiwusynth as a synth
output (with the same buffer approach). This helps me, because I just
couldn't do it if I had a sound card without multichannel output.

OK, done with the output. Now let's say I have a nice base going and I
want to play my guitar over it. Using the same approach, I'll connect
the guitar to the sound card's line in; then, in the audio network,
connect the line in to an object or program that provides some special
kind of distortion, then a flanger, etc. Maybe the line in of my
computer is a bit noisy, so I'll probably want to go through a noise
gate first. At the end of the chain, I'll plug it into an input in MusE.
Now I can play my guitar over what I'm doing!

For a final touch, I can connect all the outputs to a mixer and
adjust everything until I like it.

Yes, I know programs such as aRts and JACK can do this kind of
thing, but there are some issues with this.

1-The application has to be able to "provide" inputs and outputs,
which may or may not be used. By default an app may connect directly to
the output, or not connect at all, expecting YOU to define the
audio routes. Most of the time, unless it is a direct output or a known
audio modifier object, an app will not want to pick its connections
from within itself; you would do that from an abstracted interface.

2-JACK is great, but if you want to run a synthesizer that doesn't
use JACK together with one that does, and you have a consumer
sound card that doesn't support multichannel output, you are dead.

3-You may also want to route just any program that uses native OSS/ALSA
through this. Imagine running xmms and wanting to put the sound through
a better equalizer than the one included. Instead of bothering to write
a SPECIALIZED equalizer plugin for xmms, you just redirect its output
to an equalizer program that takes many inputs/outputs.
Or better yet, imagine you want to play a game or watch a movie with
special audio settings: yet again, you just redirect the sound to such
an object.

4-I know JACK likes to give root privileges to apps that need low
latency. I imagine that for normal apps this isn't an issue, but it
should still be considered.

5-You know you can't force application authors to port their stuff to
JACK/aRts/etc. You'd also rather not waste your time porting their
applications yourself, and the application authors would rather not
have to support multiple APIs. So this saves time for all of us.

Probably the easiest and most natural approach to this is just
integrating JACK into ALSA in some way.

What do you think?

Regards!

Juan Linietsky



This archive was generated by hypermail 2b28 : Mon Jun 10 2002 - 07:23:20 EEST