Re: [linux-audio-dev] App intercomunication issues, some views.


Subject: Re: [linux-audio-dev] App intercomunication issues, some views.
From: Martijn Sipkema (msipkema_AT_sipkema-digital.com)
Date: Tue Jul 23 2002 - 20:38:25 EEST


> >Does that mean that MIDI output can only be done from a callback?
>
> No, it would mean that MIDI is only actually delivered to a timing
> layer during the callback. Just as with the ALSA sequencer and with
> audio under JACK, the application can queue up MIDI at any time, but
> it's only delivered at specific points in time. Obviously, pre-queuing
> leads to potential latency problems (e.g. if you queue a MIDI volume
> change 1 second ahead of time, then the user alters it during that 1
> second, you've got problems).

The only problem I have with this is latency. For applications that
only do MIDI output this is fine. For a software synth that only takes
MIDI input there is also no extra latency, since you already need some
scheduling delay there to avoid jitter.
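
To illustrate the point: the synth applies each incoming event a fixed
delay D after its receive timestamp, so variable delivery times turn
into one constant, known latency. A minimal sketch (the struct, the
delay value and all names are made up here):

/* Jitter removal in a MIDI-driven softsynth, sketched: every event is
 * applied a fixed delay after its (UST) receive timestamp. D only has
 * to exceed the worst-case delivery jitter. */

#include <stdint.h>

#define RENDER_DELAY_NS 3000000ULL    /* D: made-up value, ~3 ms */

struct midi_event {
    uint64_t ust;       /* UST at which the event was received/stamped */
    uint8_t  msg[3];
};

/* UST at which the synth should actually apply the event. */
static uint64_t render_time(const struct midi_event *ev)
{
    return ev->ust + RENDER_DELAY_NS;
}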

For a complex MIDI application that does MIDI input -> MIDI output,
this adds latency. I am currently working on a low-level MIDI I/O
API and a daemon for IPC routing. It will support sending either
immediate or scheduled MIDI messages. It will probably still take some
time to get a working version. All scheduling is done against UST
(although Linux does not support UST yet).
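
To give an idea of the sending side, a rough sketch (every name here is
made up for illustration; nothing about the real API is final):

/* Hypothetical MIDI output API sketch: immediate vs. UST-scheduled send. */

#include <stddef.h>
#include <stdint.h>

typedef uint64_t mio_ust_t;          /* UST timestamp, e.g. nanoseconds */
typedef struct mio_port mio_port_t;  /* opaque handle to an output port */

/* Deliver the message as soon as possible, bypassing the scheduler. */
int mio_send_now(mio_port_t *port, const uint8_t *msg, size_t len);

/* Queue the message for transmission at the given UST; the daemon's
 * timing layer translates UST to the device clock at delivery time. */
int mio_send_at(mio_port_t *port, const uint8_t *msg, size_t len,
                mio_ust_t when);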

[...]
> >Why not have a separate API/daemon for MIDI and
> >have it and JACK both use the same UST timestamps?
>
> you can't use any timestamp unless it's derived from the clock master,
> which UST by definition almost never is. the clock on your PC doesn't
> run in sync with an audio interface, and will lead to jitter and drift
> if used for general timing.

The mapping between UST and audio time (frames) is continuously updated.
There is no need for UST to be the master clock. If JACK provided, on
every process() callback, a UST time for the first frame of the (input)
buffer, then MIDI could be synced very accurately to JACK audio by
scheduling it against UST. An application doing JACK audio output and
MIDI output would most likely estimate the UST for the output buffer
from the UST of the input buffer and schedule MIDI messages for that
buffer in the callback as well. So this then looks much like your
proposal. But there is also still the ability to send MIDI messages for
immediate transmission.
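
As a rough sketch of such a callback (it assumes a hypothetical
jack_input_buffer_ust() query, which JACK does not provide today, and
the made-up mio_send_at() from the sketch above; the one-period output
estimate is deliberately simplistic):

#include <jack/jack.h>
#include <stddef.h>
#include <stdint.h>

typedef struct mio_port mio_port_t;                 /* made up, see above */
extern int mio_send_at(mio_port_t *, const uint8_t *, size_t, uint64_t);

/* Hypothetical: UST of the first frame of the current input buffer. */
extern uint64_t jack_input_buffer_ust(jack_client_t *);

extern jack_client_t *client;
extern mio_port_t    *midi_out;

static int process(jack_nframes_t nframes, void *arg)
{
    (void)arg;

    uint64_t ust_in = jack_input_buffer_ust(client);

    /* Length of one period in nanoseconds at the current sample rate. */
    uint64_t period_ns =
        (uint64_t)nframes * 1000000000ULL / jack_get_sample_rate(client);

    /* Estimate the UST at which the matching output buffer starts; a
     * real application would account for the interface's actual latency
     * (number of periods, hardware delay). */
    uint64_t ust_out = ust_in + period_ns;

    /* Schedule a note-on so it lines up with the start of the output
     * buffer instead of going out at some arbitrary time. */
    const uint8_t note_on[3] = { 0x90, 60, 100 };
    mio_send_at(midi_out, note_on, sizeof note_on, ust_out);

    return 0;
}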

Using UST would also enable syncing to video or some other media
stream without it all residing in the same API.

--martijn



