Re: [linux-audio-dev] APIs


Subject: Re: [linux-audio-dev] APIs
From: David Olofson (david_AT_gardena.net)
Date: Thu Feb 14 2002 - 19:57:06 EET


On Thursday 14 February 2002 17.52, Paul Davis wrote:
[...]
> even two different rates of synchronous data is reasonably OK,
> especially if there is an integer ratio for the number of "data
> units" in one stream versus the other (e.g. audio at 48k, video at
> 30fps => ratio = 1600:1). but make that ratio non-integral and/or

Integral or not - if the ratios can be expressed in any sensible way
at all, synchronization shouldn't be a major issue - in theory.
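To make the "expressed in any sensible way" point concrete, here is a minimal sketch (function name is mine, not from any real API) that keeps the ratio between two stream rates as an exact rational, so even the non-integral cases stay exact:

```python
# Sketch: the data-unit ratio between two synchronous streams as an
# exact rational, given only the nominal rates. Exact fractions avoid
# the drift you would get from a rounded floating-point ratio.
from fractions import Fraction

def stream_ratio(rate_a, rate_b):
    """Units of stream A per unit of stream B, as an exact fraction."""
    return Fraction(rate_a) / Fraction(rate_b)

# Paul's example: 48 kHz audio against 30 fps video.
print(stream_ratio(48000, 30))                      # 1600 (integral)

# NTSC-style rates make the ratio non-integral, but still exact:
# 44.1 kHz against 30000/1001 fps.
print(stream_ratio(44100, Fraction(30000, 1001)))   # 147147/100
```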

Actually *scheduling* nodes to run at the right times (at least without
dedicating one or more CPUs to each "frame rate") seems to me to be
the *real* problem - especially with low latency requirements, where
you can't just set up a network with enough buffering to handle
simple interleaving of callbacks/invocations.
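One way to picture that scheduling problem (a sketch under my own assumptions, not any existing scheduler): if two nodes process blocks of different sizes against one sample clock, their invocation points repeat over a hyperperiod of LCM(block sizes), and within it the deadlines interleave irregularly:

```python
# Sketch (hypothetical): interleaving the callback times of nodes that
# process blocks of different sizes against one sample clock. The
# schedule repeats every LCM(block sizes) frames; within that
# hyperperiod the per-node invocation points must be merged.
from math import lcm

def schedule(block_sizes):
    """Sorted (frame, node) invocation points over one hyperperiod."""
    hyper = lcm(*block_sizes)
    points = [(frame, node)
              for node, size in enumerate(block_sizes)
              for frame in range(0, hyper, size)]
    return sorted(points)

# Nodes running 64- and 96-frame blocks share a 192-frame hyperperiod.
print(schedule([64, 96]))
# [(0, 0), (0, 1), (64, 0), (96, 1), (128, 0)]
```

With low latency you cannot paper over this irregularity with extra buffering, which is exactly why simple interleaving of callbacks stops being enough.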

> throw in the need to receive and send random quantities of (say,
> MIDI) data at random times, and things get really hard.

Well, that depends on what "random times" means. As long as the MIDI (*)
event timestamps can be translated into some units that relate to the
frame rate of the receiving nodes, it boils down to a matter of where
to convert timestamps.

As long as sending and receiving nodes are both synchronized to the
same "real clock", events can be queued in FIFO-like "ports" and
converted as needed using info that should already be available.
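The conversion step itself is trivial once both nodes share a clock. A minimal sketch (hypothetical helper, assuming both sides count frames from the same origin):

```python
# Sketch: converting an event timestamp at the port boundary, assuming
# sender and receiver are locked to the same "real clock" and both
# count frames from a shared origin. Names are hypothetical.
def convert_timestamp(ts_frames, src_rate, dst_rate):
    """Map a timestamp in sender frames to receiver frames."""
    return round(ts_frames * dst_rate / src_rate)

# An event stamped at frame 22050 by a 44.1 kHz sender lands at
# frame 24000 in a 48 kHz receiver's time base.
print(convert_timestamp(22050, 44100, 48000))   # 24000
```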

Now, if the nodes *aren't* synchronized... Does that really make
sense? Wouldn't it be possible to sync a PLL to one or both nodes, as
required to create a "fake" common time base to use for event
timestamping? That is, basically an improved version of "sending
events to be processed ASAP."

(You wouldn't get closer than that with dedicated hardware in such a
situation - except that dedicated hardware usually runs some kind of
RTOS or RT software, and thus does the timing more accurately.)

(*) ...or rather something more powerful than MIDI. I prefer to
    view MIDI as a protocol for interfacing with external devices,
    rather than something to encapsulate in a serious native
    processing API.

> gstreamer
> would seem like a framework that should have solved this problem
> but i'm not clear from reading about it that they have done so.
>
> if people have ideas on how to do this, that would be great.

I have ideas that would *work* - but figuring out something nice,
simple and comprehensible takes more than theory. I'll hack some more
real code first.

//David

.- M A I A -------------------------------------------------.
|     Multimedia Application Integration Architecture      |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------> http://www.linuxaudiodev.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'



This archive was generated by hypermail 2b28 : Thu Feb 14 2002 - 19:46:44 EET