Subject: Re: [linux-audio-dev] introduction & ideas
From: Martijn Sipkema (msipkema_AT_sipkema-digital.com)
Date: Tue Feb 26 2002 - 14:36:53 EET
> All OSS and ALSA raw MIDI devices support the same API:
> open/read/write/close. There seems to be no need whatsoever to do
> anything but use this API. There is no need to write any drivers. Just
> deliver the data to them with write(2), and let them do their (best
> effort) thing.
This is what a lot of applications would actually use, so in the API I'm
working on it is provided by an 'immediate' mode 'path' to a port on an
interface.
> >It would not really need to do scheduling also but this would enable much
> >better timing on professional multiport serial/usb MIDI interfaces, since
> >in order to send MIDI messages on all ports at the same time they will
> >have to be sent ahead of time.
>
> This is not of any general use. You cannot queue MIDI data in advance
> unless you know what it is going to be. This is either impossible
> (such as in applications like SoftWerk), or causes "latency" when
> doing real-time editing of the MIDI data.
>
> However, I will admit that it works in enough cases to be useful :)
Indeed, avoiding extra latency for 'MIDI THRU' applications, or keeping
latency low for something like SoftWerk, requires that the MIDI data not
be scheduled. So in my API I use paths to port(s) on an interface, where
a path can support various modes (scheduled, immediate) and types of data
(voice, system common, system exclusive, system realtime). A sequencer
application would use an immediate path for realtime/thru and a scheduled
path for the data it knows in advance. Especially the latter can be a lot
of data that has to be sent on various ports of a multiport interface at
the same time.
> Seriously though: all those interfaces really have over Linux on a
> dual CPU system is a timer that runs at exactly the MIDI data
> rate. The old GUS cards used to have such a thing, but AFAIK, no other
> audio/MIDI interface has ever provided this. If we had such a timer,
> we could do *everything* those devices do within the context of the
> OS. As is, we are stuck with nearest-multiple timing, which isn't
> really quite good enough in a head-to-head competition with those
> things.
The Emagic AMT interfaces can use a technique to allow for simultaneous
transmission of data on multiple ports. Since they have not yet been
willing to supply me with the specifications I don't know exactly how it
works, but it's something like this: you can either send data immediately,
or you can have it put on a small per-port FIFO and then later have one or
more of these messages transmitted by sending a single byte to the
interface. Communication with the Emagic interface(s) runs at 115200 baud
when using RS-232. That is not even 4 times the MIDI baud rate (31250
baud), so delays can get quite large when sending on all ports at the same
time, especially when daisy chaining 8 AMT interfaces (for 64 MIDI out
ports).
Steinberg uses a different technique: timestamped messages are sent to the
interface at some rate, and the interface derives its own clock from them.
> >For the same reason as mentioned above I can't implement the API on top
> >of rawmidi and that is also my problem with the ALSA sequencer (that and
> >the fact that it is in the kernel).
>
> which reason? I am not following ...
The reason is that I want the API to fully support multiport interfaces
with a special timing protocol (Emagic, Steinberg; not the Midimans I
think, they just add some latency; perhaps Protools in the future for
their new MIDI interfaces), and rawmidi cannot provide that.
--martijn
This archive was generated by hypermail 2b28 : Tue Feb 26 2002 - 14:27:07 EET