Re: [linux-audio-dev] introduction & ideas


Subject: Re: [linux-audio-dev] introduction & ideas
From: Paul Davis (pbd_AT_Op.Net)
Date: Tue Feb 26 2002 - 16:03:46 EET


>> All OSS and ALSA raw MIDI devices support the same API:
>> open/read/write/close. There seems to be no need whatsoever to do
>> anything but use this API. There is no need to write any drivers. Just
>> deliver the data to them with write(2), and let them do their (best
>> effort) thing.
>
>This is what a lot of applications would actually use. So in the API I'm
>working on this is provided by using the 'immediate' mode 'path' to
>a port on an interface.

what's the difference between this and the ALSA sequencer? it has a
"direct" mode that simply delivers the data to the client without any
queuing.

>Indeed the importance of no extra latency for 'MIDI THRU'
>applications or low latency for something like SoftWerk would require
>the MIDI data not to be scheduled. So in my API I use paths to
>port(s) to an interface, where a path can support various modes
>(scheduled, immediate) and types of data (voice, system common,
>system exclusive, system realtime). A sequencer application would
>use an immediate path for realtime/thru and a scheduled path for the
>data it knows in advance. Especially the latter can be a lot of data
>that has to be sent on various ports of a multiport interface at the
>same time.

it sounds to me as if your time would be better spent helping to move
the ALSA sequencer into user space. if not, you'd probably be better
off using MidiShare, which is an existing user-space MIDI
sequencer/multiplexer/router, runs on linux/windows/macos and
others, and does most of what I think you'd want.

>The Emagic AMT interfaces can use a technique to allow for simultaneous
>transmission of data on multiple ports. Since they have not yet been willing
>to supply me with the specifications I don't know how it works exactly, but

i don't believe that it's "simultaneous". i'd be willing to bet that
it's "so close together in time, it's all within the transmission time
of a single MIDI byte (or even bit)". you could do this pretty easily
on existing hardware under linux as long as the h/w FIFOs are empty.

>> >For the same reason as mentioned above I can't implement the API on top
>> >of rawmidi and that is also my problem with the ALSA sequencer (that and
>> >the fact that it is in the kernel).
>>
>> which reason? I am not following ...
>
>In order to be able to have full support in the API for multiport
>interfaces with a special timing protocol (Emagic, Steinberg, not the
>Midimans I think, they just add some latency, perhaps Protools in the
>future for their new MIDI interfaces).

again, since the ALSA sequencer is going to have to deal with these at
some point, it seems more sensible to me to solve this at that level.

i prefer to focus on an API that has precise timing semantics and not
think (at the same time) about how it is implemented. if we are able to
get the specs on those devices, perhaps we can use them to provide
precise timing. perhaps someone can figure out a way to generate an
interrupt at the MIDI data rate. it seems to me that it shouldn't be a
visible part of the API, which should just accept pre-queued data and
ensure that it is delivered to the wire as close to the correct time
as possible. the ALSA sequencer seems to me to do this.

--p



This archive was generated by hypermail 2b28 : Tue Feb 26 2002 - 15:54:35 EET