Re: [linux-audio-dev] Ways to make Linux THE ULTIMATE Multimedia Processing System


Subject: Re: [linux-audio-dev] Ways to make Linux THE ULTIMATE Multimedia Processing System
From: Josh Green (jgreen_AT_users.sourceforge.net)
Date: Sun Dec 16 2001 - 04:15:07 EET


On Sat, 2001-12-15 at 06:42, Paul Davis wrote:
> >=========================================================
> > Why don't we strive to use or create inter-program protocols!
> >======================================================================
>
> what do you think JACK is? what do you think the point of a protocol
> that doesn't run in sample sync is? did you know about the alsa-server
> daemon?
>
> --p
>

Hey.. Audio isn't the only valuable kind of inter-process communication.
What about MIDI, and what about arbitrary control information (like what
you are developing for LCP)? Besides, I didn't write that email to claim
that no such things are in development; it was more of a call to try to
bring them all together. Even more so, it was a way for me to try to
become more a part of this process.

Nothing I said was new, of course (as I tried to note several times); I
was simply trying to add my own enthusiasm to this topic. Perhaps that
didn't come across, though, at least judging from your response.

I think what IS currently lacking in the Linux audio scene is a concise
overall picture of what types of inter-application communication are
desirable and what is currently available. I'm not saying that I should
necessarily be the one carrying the torch for such a resource; in fact
there are already several good resources for Linux audio information
that we all know. I do believe we need some sort of official project or
presence that discusses protocols and provides a central location for
API information and so on.

There are many projects developing inter-process protocols, but it would
be nice if some of them ended up on standard Linux systems and were
flexible enough to remain usable for years to come.

When I think of where JACK comes into the picture, I realize that it is
trying to accomplish something similar to LADSPA or aserver: streaming
audio data between applications. Now, I know you have stressed time and
time again that you are going for sample sync, low latency, and standard
floating-point streams, but the overall objective is the same.

When I think of LADSPA, I realize that the GUI manipulating the
processing stream could be anything; it could be a full-blown
application. It is also in the business of connecting audio between
processing nodes.

If we create multiple standards for the same desired effect, we are
going to be fragmenting application development.

ALSA is going to be the new audio system for Linux. So in the future
that is what you will find on any standard Linux box.

Can something be done to unify these ideas into a standard audio API for
applications that don't care about the underlying hardware? An API that
satisfies all our needs (or comes as close as possible)? I'm convinced
that this API needs to be closely married to the Linux sound system, at
least in distribution. If aserver exists, and LADSPA exists, and JACK
exists...

How about integrating the ideals of JACK with LADSPA? They are both
constructed around the idea of networks of audio processing nodes. Could
LADSPA 2.0 be JACKed :)

Perhaps there are some grave errors in my thinking; no doubt you will
point them out. But please realize that I am just trying to be a voice
of moderation here. I don't want to see the Linux audio world become
fragmented the way the desktop world already is. The underlying
protocols should be standard, flexible and independent.

-- 
    Josh Green
    Smurf Sound Font Editor (http://smurf.sourceforge.net)



This archive was generated by hypermail 2b28 : Sun Dec 16 2001 - 04:14:29 EET