Re: [linux-audio-dev] introduction & ideas

Subject: Re: [linux-audio-dev] introduction & ideas
From: Martijn Sipkema (msipkema_AT_sipkema-digital.com)
Date: Mon Feb 25 2002 - 15:45:07 EET


> The envisioned use for UST in dmSDK/OpenML seems to be merely as a
> way to correct drift, but this could be a rather bad idea if the
> source of UST isn't synchronized with the hardware driving the
> reference output stream. For example, the natural source of UST on an
> Intel system would be the cycle counter, but there is no evidence that
> the cycle counter has less jitter or is more accurate than the sample
> clock in a decent audio interface. If you use the cycle counter, the
> chances are that you will often be correcting for drift, and this
> seems wrong when it appears likely that the audio interface has a more
> accurate clock than the cycle counter (though of lower resolution).
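
(To make that concrete: on Intel the cycle counter would be read with
something like the following; just a sketch, assuming x86 and gcc
inline assembly.)

#include <stdint.h>

/* Read the CPU time stamp counter (the 'cycle counter' above). */
static inline uint64_t read_cycle_counter(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}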

OK. So say I have an application dealing with audio, video and MIDI.
Most of the time the audio will be used as the clock providing
'performance time'. Video and MIDI will have to be synced to this time.
Scheduling video frames and MIDI messages directly on this time base is
not practical however, so a common 'wall clock' or 'system clock' is
needed. This has to be an unadjusted clock. I don't know if
CLOCK_MONOTONIC is supported yet within Linux, but I think this is what
should be used to schedule MIDI messages and video frames. (You could
call this UST.) So a mapping between 'performance/media time' (MSC) and
the system time (CLOCK_MONOTONIC/UST) is updated regularly. If all APIs
supported this common clock, then synchronisation between the different
kinds of streams would be easier. For my MIDI API I don't want
scheduling on a tempo clock in the API; that should be done by the
application, I think. So it will only support ordered scheduling on
CLOCK_MONOTONIC and flush/drain functionality. If the audio API would
give me the CLOCK_MONOTONIC time at which a certain buffer (identified
by MSC or some other media clock) was performed, then MIDI could be
synced to that fairly easily, without direct support in the MIDI API
for the specific audio system.
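
Something like the following mapping is what I have in mind; just a
sketch with made-up names (clock_map_t, etc.), assuming only POSIX
clock_gettime() with CLOCK_MONOTONIC:

#include <time.h>
#include <stdint.h>

typedef struct {
    uint64_t        msc;     /* sample count (MSC) at the sync point */
    struct timespec system;  /* CLOCK_MONOTONIC at the same point */
    double          rate;    /* samples per second */
} clock_map_t;

/* Record which MSC was performed at which system time; called
 * regularly, e.g. once per audio interrupt cycle. */
static void update_clock_map(clock_map_t *m, uint64_t msc, double rate)
{
    clock_gettime(CLOCK_MONOTONIC, &m->system);
    m->msc = msc;
    m->rate = rate;
}

/* Convert a media time (in samples) to CLOCK_MONOTONIC nanoseconds
 * using the most recent (MSC, system time) pair; MIDI messages and
 * video frames can then be scheduled on the result. */
static int64_t msc_to_monotonic_ns(const clock_map_t *m, uint64_t msc)
{
    int64_t base = (int64_t)m->system.tv_sec * 1000000000LL
                 + (int64_t)m->system.tv_nsec;
    double offset = ((double)msc - (double)m->msc) / m->rate;
    return base + (int64_t)(offset * 1e9);
}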

I also think that MIDI should be kept separate from audio (ALSA) as much
as possible. The fact that a lot of soundcards have an onboard MIDI
interface is more of an implementation detail IMHO.

> >That's what I thought. Too bad. We could do with a standard media API.
> >It would be nice if jack could use the UST/MSC. Does it have something
> >similar?
>
> JACK has an interrupt-cycle-resolution UST source (at least when used
> with the existing ALSA PCM driver). It could easily add the cycle counter,
> but that doesn't make any sense given the clock skew that I alluded to
> above.

Can it give a mapping between the cycle counter and the JACK media clock?

Another thing: does jack support pitch adjustment for hardware that has it?

> >There is a difference in that ALSA is used only as an endpoint, and
> >OpenML could possibly be used for application interconnection.
>
> actually, this is a widespread misconception. alsa-lib is perfectly
> capable of allowing user-space/application defined PCM devices, and
> thus can allow applications to become endpoints themselves, to
> interconnect with each other, and to share data. the reason JACK
> exists is that some of us felt that the ALSA API wasn't the right
> model for doing this.

I did not know that :)
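
So, if I understand correctly, something like this would already work,
with the name resolving to a user-defined PCM just as well as to
hardware (a sketch assuming alsa-lib 0.9; open_pcm() is my own name):

#include <alsa/asoundlib.h>

/* 'name' may be "hw:0,0" for hardware or a user-defined PCM from the
 * alsa-lib configuration (e.g. ~/.asoundrc); the application cannot
 * tell the difference. */
int open_pcm(const char *name)
{
    snd_pcm_t *pcm;
    int err = snd_pcm_open(&pcm, name, SND_PCM_STREAM_PLAYBACK, 0);
    if (err < 0)
        return err;
    /* ... configure and use the PCM ... */
    snd_pcm_close(pcm);
    return 0;
}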

I do see this as a documentation problem. It's much easier to first read
a document on the design of the API and the driver interface and then
read the source. I may be wrong, but I think there would be more and
better quality ALSA applications if the documentation were better (or
even available).

--martijn

