Re: [linux-audio-dev] App intercomunication issues, some views.


Subject: Re: [linux-audio-dev] App intercomunication issues, some views.
From: Martijn Sipkema (msipkema_AT_sipkema-digital.com)
Date: Wed Jul 24 2002 - 17:29:07 EEST


> nanosleep isn't based on time-of-day, which is what is subject to
> adjustment. nanosleep uses the schedule_timeout, which is based on
> jiffies, which i believe are monotonic.

I'm not sure how nanosleep() is supposed to handle clock adjustment,
but I agree it would probably not change its behaviour. nanosleep()
does sleep on CLOCK_REALTIME.

> i believe that relative nanosleep is better than absolute sleep for
> the simple reason that its how you would avoid drift in
> practice. consider a JACK callback:
>
> process (jack_nframes_t nframes)
> {
> jack_transport_info_t now;
>
> /* find out the transport time of the first audio frame
> we are going to deal with
> */
>
> jack_get_transport_info (client, &now);
>
> /* get the set of MIDI events that need to be
> delivered during the period now.position to
> now.position + nframes
> */
>
> event_list = get_pending_midi_events
> (now.position, now.position + nframes);
>
> foreach event in event_list {
> queue_midi_event (event);
> }
>
> ... anything else ...
> }
>
> now, what is queue_midi_event() going to do? if you schedule the MIDI
> data for delivery based on an absolute time, you're suddenly dealing
> with long term drift again. instead, you just schedule it for an
> offset from "now", typically within the next couple of msecs. this
> way, the drift is then limited to whatever variance there is between
> the rate of the clock used to deliver the MIDI data and the audio
> clock, which over that time period (as you noted) will be very, very,
> very small.

If I use an absolute sleep there is basically no difference. The drift
will be the same, but instead of scheduling events relative to 'now' I can
specify the exact time. So a callback would then look like:

- get the UST and MSC for the first frame of the current buffer for input
- get the MSC for the first frame of the current buffer for output and
    estimate the UST for that frame.
- calculate the UST values for the MIDI events that are to occur during the
    output buffer.
- schedule the MIDI events (the API uses UST)

This has two advantages:

- since you get the UST for the input buffer, you have a better estimate of
    when the output buffer will actually be performed.
- the MIDI messages will be queued and thus need an absolute timestamp.

> of course, the above simple example doesn't take audio latency into
> account, but that's left as an exercise for the reader :)

Using MSC for every buffer, the latency is the difference between output
MSC and input MSC.

> now, the truth is that you can do this either way: you can use an
> absolute current time, and schedule based on that plus the delta, or
> you can just schedule based on the delta.

But how can this be done in another thread at a later time?

> either way will work, but
> the second one works right now without any extra POSIX clock support.

Also, an added problem with relative sleeps is that they can be less accurate,
because you can get preempted between reading the current time and calling
nanosleep(), although in a JACK callback this is not very likely.

--martijn




This archive was generated by hypermail 2b28 : Wed Jul 24 2002 - 17:25:06 EEST