Re: [linux-audio-dev] XAP spec - early scribbles


Subject: Re: [linux-audio-dev] XAP spec - early scribbles
From: David Olofson (david_AT_olofson.net)
Date: Tue Feb 25 2003 - 19:20:40 EET


On Monday 24 February 2003 11.24, torbenh_AT_gmx.de wrote:
> On Fri, Feb 21, 2003 at 04:07:54PM +0100, David Olofson wrote:
> > Hi, and welcome! :-)
> >
> > BTW, do you know about the GMPI discussion? The XAP team is over
> > there as well.
> >
> > http://www.freelists.org/cgi-bin/list?list_id=gmpi
>
> oh no... another mailing list i must read :)

*hehe* I switched to full height folder tree a while ago for this
reason... Way too many lists.

[...]
> My problem is now: How can i map something like the clock
> which is a pure event generator to the XAP model ?
>
> it must get called somehow if the event delivery does not
> provide for a callback.

There's only one call; process(). (Or run(), as some prefer to call
it.) This is called once for each block, whether or not the plugin
processes audio. Might sound odd, but considering that the unit for
event timestamps is audio frames, it turns out to be really rather
logical - in theory as well as in implementation.
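
For illustration, here's a minimal sketch of that model in C (all names are invented for illustration; this is not the actual XAP or Audiality API): one process() call per block, events timestamped in audio frames relative to the block start, applied sample accurately even if the plugin produces no audio at all:

```c
/* Sketch of the single-entry-point model: one process() call per
 * block, event timestamps in audio frames into the block.
 * ev_t, plugin_t, plugin_process are hypothetical names. */
#include <stddef.h>

typedef struct ev_t {
    unsigned when;        /* timestamp, in frames into this block */
    int      value;       /* payload; a real event carries more   */
    struct ev_t *next;    /* singly linked, in timestamp order    */
} ev_t;

typedef struct {
    ev_t *queue;          /* incoming events for this block       */
    int   level;          /* some control the events set          */
} plugin_t;

/* Process one block of 'frames' frames.  Events are consumed exactly
 * when they fall due, so control changes land sample accurately. */
void plugin_process(plugin_t *p, float *out, unsigned frames)
{
    unsigned f = 0;
    while (f < frames) {
        /* Apply all events due at frame f */
        while (p->queue && p->queue->when == f) {
            p->level = p->queue->value;
            p->queue = p->queue->next;
        }
        /* Render up to the next event (or the end of the block) */
        unsigned until = p->queue ? p->queue->when : frames;
        for (; f < until; ++f)
            if (out)
                out[f] = (float)p->level;
    }
}
```

A pure event generator would simply leave the audio rendering part out; it still gets the same per-block call, which is what gives it a notion of time.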

> Do you have something like that in
> audiality ?

Well, there's the MIDI loader/player (which effectively takes no input
at all currently), and the "patch plugins". The latter are full-blown
event processors that take channel events and generate voice events.
They also implement envelope generators, voice management and that
kind of stuff.

[...compressor...]
> revelation !
> (ah... now i got your idea: this is very cool)

Well, the idea is really rather simple: Just think of the event
"streams" as somethig very similar to audiom streams. They're just
structured and more flexible. Unit graphs can be constructed
according to the exact same rules.

> > 3) Doing anything like this just to *slightly* increase
> > the chances of soft real time worker threads finishing
> > within a single buffer cycle, while the audio thread
> > is working, in a way that can only work on SMP machines,
> > is rather pointless. If you fire a worker callback, you
> > do it because you suspect it might take so long to
> > finish that doing the work in the audio thread would
> > cause a drop-out. That is, you're not expecting a
> > result until one or more blocks later.
>
> The example i was thinking of, was a galan like clock
> running at 10Hz or so sequencing a list of wav file names.
> with the galan like clock the event is inserted into
> the queues 100ms before it is due.

That means it's not really a real time unit, since you know well
ahead of time what it's going to say.

Or do you?

No. If you stop the clock, or change its rate, the upcoming enqueued
event becomes invalid, and must be modified or replaced for correct
results.

If each plugin only ever worries about the current block for I/O, this
is not an issue, because plugins would never generate events that
they may need to take back later.

> if the event processing would track the event processing
> queue through the wav file list to the sample loader.
> it would know that in 100ms it must load sample x in 0seconds.
> This is where i want the worker thread to be fired.
> it has 100ms to get the wav loaded.

Yeah, but what if the clock runs at 100 Hz...? How does 10 Hz
automatically mean that the next event is known 100 ms ahead of time?

I think you're looking in the wrong place for "look-ahead" data. If
you need to know something X ms ahead of time, you simply need a
protocol that allows the sequencer to send "prepare" events of some
kind X ms before the real event.

Obviously, this can only work with sequenced data. If you're
generating events from something like live MIDI input, there's just
no way to do it.

This suggests to me that "look-ahead" is a special case interface for
some specific cases; not an idea to build a real time API upon.
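
As a sketch of the timestamp arithmetic such a "prepare" protocol would need (names invented; not part of any real API), here is the conversion of a look-ahead in milliseconds to frames, clamped near the start of the timeline:

```c
/* Sketch of the "prepare event" idea: if a plugin needs X ms of
 * warning, the sequencer sends a hypothetical PREPARE event X ms
 * (converted to frames) before the real one.  This only works for
 * sequenced data, where the future is actually known. */
#include <stdint.h>

/* Frames of look-ahead for 'ms' milliseconds at 'rate' Hz, rounded
 * up so the plugin never gets less warning than it asked for. */
static inline uint32_t lookahead_frames(uint32_t ms, uint32_t rate)
{
    return (ms * rate + 999) / 1000;
}

/* Timestamp for the PREPARE event matching a real event at 'when'.
 * Clamps to 0 if the event is closer than the look-ahead (e.g. right
 * after a locate), meaning "prepare immediately". */
static inline uint32_t prepare_time(uint32_t when, uint32_t ms,
                                    uint32_t rate)
{
    uint32_t la = lookahead_frames(ms, rate);
    return when > la ? when - la : 0;
}
```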

> this seems to be not feasible, because the event peeker would be
> fairly complex it must not follow the code path of the clock
> posting events to itself because of the infinite loop etc... not
> taking into account bigger event feedback loops....

The "event peeker" has to be part of the clock and send events that
are part of a special protocol for this task.

We have something similar in XAP, meant for hard disk recorders and
other plugins that need pre-buffering time before they can start
playing from a random song position. A sequencer (or rather,
timeline) can inform such plugins about loop and jump points
beforehand, so that specific pre-buffering can be used to make it
possible to jump to these points with zero latency.
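
A minimal sketch of how such an announcement protocol could look (all names are invented for illustration, not actual XAP API):

```c
/* Sketch of the pre-buffering hint described above: the timeline
 * announces possible loop/jump targets ahead of time, so an HDR
 * plugin can pre-buffer those positions and jump there with zero
 * latency.  Fixed-size table for brevity. */
#include <stdbool.h>
#include <stdint.h>

#define MAX_JUMP_POINTS 16

typedef struct {
    uint32_t point[MAX_JUMP_POINTS]; /* announced positions (frames) */
    int      count;
} jump_table_t;

/* Timeline side: announce a jump target before it may be used. */
static bool announce_jump(jump_table_t *t, uint32_t pos)
{
    if (t->count >= MAX_JUMP_POINTS)
        return false;           /* table full; plugin can't prepare */
    t->point[t->count++] = pos;
    return true;
}

/* Plugin side: a zero latency jump to 'pos' is possible only if it
 * was announced (and thus pre-buffered) beforehand. */
static bool can_jump_now(const jump_table_t *t, uint32_t pos)
{
    for (int i = 0; i < t->count; ++i)
        if (t->point[i] == pos)
            return true;
    return false;
}
```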

[...]
> > > the synth could tell that it wants to be processed when the
> > > event it just received becomes due.
> >
> > How? The host won't see any events, as they're normally just
> > added directly to the synth's event queues by inline code in the
> > sender... Besides, even if the host snooped every single control
> > of the synth (ie all incoming events), it wouldn't know what's
> > relevant and what isn't. Only the synth can know what
> > combinations of values actually produce sound.
>
> yes... But due to the event peeking code it would get the event
> 100ms before it is due. The event peeker is too complicated though.

In fact, it's not even possible to implement in a generic way. (See
above.)

> I have now understood the XAP model and wont be coming up
> with stupid ideas.

Well, supporting plugin "actions" that can't be performed with "zero
latency" isn't a stupid idea - but it's rather tricky to get right.
It's easy enough to just have plugins send off such jobs to worker
callbacks, but that makes the responses soft real time.

That said, hardware synths, and to an even greater extent, samplers,
have
this problem as well, and it's no major issue in real life. It
*could* be dealt with, but it's probably not worth the effort or
added complexity for things like loading samples or rendering
waveforms. Just don't load new sounds in the middle of the song. Do
it before starting playback.

[...]
> > > at the Moment it uses one global event queue which
> > > makes the performance bad if you had 2 unrelated high frequency
> > > clocks in the mesh. But the XAP model of individual event
> > > queues would fix this with a not small coding effort involved.
> >
> > Probably. It also makes it possible to avoid explicit event
> > dispatching inside plugins with more than one inner loop, and it
> > eliminates the need for sorting events as they're sent.
>
> yes right... at least the API in galan is clean enough to make the
> changes...
>
> the problem is only with the event sideeffects.
> A pure event plugin does not get REALTIME callbacks.
> because the plugin gets the callback upon receiving the event.

I see. But then, how is an event processor able to "generate timing"
of its own, as in generating events that are not direct and instant
responses to incoming events?

An example would be the Audiality MIDI player. You just start it, and
then it generates events with sample accurate timing, until the song
ends, or until you stop it. It's generating its own timing
internally, and it derives it from the audio time, through the
process() call. It's as simple as "Do your thing for N frames!"
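
Here's a rough sketch of that idea in C (hypothetical names): a clock that is only ever told "do your thing for N frames", yet emits sample accurate ticks, carrying its phase across block boundaries:

```c
/* Sketch of an event processor that generates its own timing from
 * the per-block process() call, like the Audiality MIDI player:
 * it derives all timing from the frame count it is handed. */
#include <stdint.h>

typedef struct {
    uint32_t period;   /* frames between ticks              */
    uint32_t phase;    /* frames left until the next tick   */
} tick_clock_t;

/* Emit the block-relative timestamp of each tick due within the
 * next 'frames' frames into out[] (the caller must size out[] for
 * frames / period + 1 entries); returns the number of ticks.
 * State carries over, so timing stays exact across blocks. */
static int clock_process(tick_clock_t *c, uint32_t frames,
                         uint32_t *out)
{
    int n = 0;
    while (c->phase < frames) {
        out[n++] = c->phase;       /* tick due at this frame */
        c->phase += c->period;
    }
    c->phase -= frames;            /* rebase for the next block */
    return n;
}
```

Note that no host callback scheduling is involved: the clock keeps sample accurate time purely from the frame counts it is given.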

[...]
> i will make my calls inline too...
> But how do i remove a sorting send function ?
> i dont get that yet.

Took me a good while to figure that one out as well... :-)

In Audiality, it's actually quite trivial. These are the rules that
make it work automatically:

        1) Senders must send events in timestamp order.

        2) Each queue must receive events from only one sender.

That's it.
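
A sketch of why these two rules make sending trivial (invented names; fixed-size queue for brevity): with a single sender per queue, sending in timestamp order, "send" degenerates to a plain tail append, and the receiver never sorts anything:

```c
/* Sketch of an event queue under the two rules above.  Rule 1
 * (senders send in timestamp order) makes sending O(1); rule 2
 * (one sender per queue) is what lets us trust that promise. */
#include <stdbool.h>
#include <stdint.h>

#define QUEUE_SIZE 256

typedef struct {
    uint32_t when[QUEUE_SIZE];  /* timestamps, always in order */
    int      count;
    uint32_t last;              /* timestamp of last event sent */
} equeue_t;

/* Tail append; rejects events that would break timestamp order
 * (i.e. a sender violating rule 1) or overflow the queue. */
static bool ev_send(equeue_t *q, uint32_t when)
{
    if (q->count >= QUEUE_SIZE || when < q->last)
        return false;
    q->when[q->count++] = when;
    q->last = when;
    return true;
}
```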

As to XAP (and most probably future versions of Audiality), it gets a
little more complicated, since the second rule will be violated on a
regular basis. The first step towards solving this is to look at how
connections can be related:

        1) One plugin sends to another plugin.
                No problem. Events are sent and received in
                order.

        2) One plugin sends to multiple plugins.
                Still no problem. The fact that events are
                dispatched across multiple queues by the
                sender doesn't mean they get out of order.

        3) Multiple plugins send to one plugin.
                Now we're in trouble... We'll have multiple
                ordered chains of events that must form a
                single chain on the same queue.

(Here, we are assuming that each plugin has only one input event
queue, and sends events from only one loop, thus guaranteeing that
*all* events output from one plugin will be in timestamp order.)

What we do about this is have the host connect all (or all but one) of
the senders to one or more "shadow queues", so that the host can
sort/merge as required, in between process() calls. Sort/merge is a
really rather low cost operation compared to full sort or (even
worse) per-event insertion sort, and it means plugins - and the event
handling inlines - don't have to worry about this.
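
For illustration, the sort/merge itself is just a linear merge of already-ordered queues - a sketch with plain timestamp arrays standing in for real events:

```c
/* Sketch of the host-side sort/merge of a plugin's direct queue
 * with a "shadow queue": both inputs are already in timestamp
 * order, so one linear pass between process() calls restores a
 * single ordered queue - O(n), vs. O(n log n) for a full sort. */
#include <stdint.h>

/* Merge timestamp-ordered a[na] and b[nb] into out[na+nb],
 * favouring 'a' on equal timestamps (stable merge). */
static void merge_queues(const uint32_t *a, int na,
                         const uint32_t *b, int nb, uint32_t *out)
{
    int i = 0, j = 0, k = 0;
    while (i < na && j < nb)
        out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
    while (i < na) out[k++] = a[i++];
    while (j < nb) out[k++] = b[j++];
}
```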

There's another problem, though, specifically for XAP: Plugins may
have multiple input event queues and/or multiple internal event
sending loops. (This is mostly for "monolith" plugins like
multitimbral synths and the like, which are most likely never going
to be implemented with a single event/audio compound loop.)

The solution is to stamp all control inputs and outputs with "context
IDs", so that the host can treat these queues and loops as separate
plugins WRT event routing. The rest is handled the way described
above.
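
A sketch of the routing key this implies (invented names): the host merges per (plugin, context) pair rather than per plugin, so a monolith with several internal loops looks like several independent plugins to the merge logic:

```c
/* Sketch of the "context ID" idea: every event input/output is
 * stamped with a context ID, and routing decisions use the
 * (plugin, context) pair as the key. */
#include <stdint.h>

typedef struct {
    uint16_t plugin;    /* which plugin instance          */
    uint16_t context;   /* which queue/loop within it     */
} routing_key_t;

/* Two senders need a shadow-queue merge only if they feed the same
 * (plugin, context) pair; different contexts are independent. */
static int same_merge_target(routing_key_t a, routing_key_t b)
{
    return a.plugin == b.plugin && a.context == b.context;
}
```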

[...]
> Well no need to look at the code i think.
> The XAP design is far superior than galan.
> (considering performance at least)
>
> The API of galan is worth a look though.
> And now that i have understood XAP i will have a try on changing
> the implementation to be more optimal. I am tempted to think i can
> do it without API change.
>
> shit another BIG todo item :)

I have plenty of those... *hehe*

> i will have a go on audiality also
> why is there no ebuild ?

One might as well ask, why ebuild?

I've hardly heard of ebuild before, although I've been confronted
with a few other make alternatives. However, autotools are the
de facto standard, and I know them well enough to get the job done. I
just don't have the time and motivation to investigate all the
alternatives...

//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`-----------------------------------> http://audiality.org -'
   --- http://olofson.net --- http://www.reologica.se ---



This archive was generated by hypermail 2b28 : Tue Feb 25 2003 - 19:33:13 EET