Re: [linux-audio-dev] XAP: Asynchronous Streaming Protocol


Subject: Re: [linux-audio-dev] XAP: Asynchronous Streaming Protocol
From: David Olofson (david_AT_olofson.net)
Date: Fri Dec 13 2002 - 20:46:37 EET


On Friday 13 December 2002 18.53, Steve Harris wrote:
> On Fri, Dec 13, 2002 at 05:28:23 +0100, David Olofson wrote:
> > Anyway, what made me consider this is the Linuxsampler project,
> > which (to me) demonstrates that it may make a lot of sense to be
> > able to split the waveform loading/streaming from the actual
> > sample player. Both are pretty generic units that could be reused
> > - but only if there is a common interface for their interaction.
>
> Funny. I was going to use Linuxsampler as an example of why you
> can't normally separate them :)

*hehe*

> Linuxsampler requires very tight interaction between the streaming
> code and playback code to work.

Yes. You need to be able to tell the "butler thread" which files to
load/cache, and it may need to know the maximum bandwidth you will
use, so it can set up sufficient caching - if there's a point in
being that picky, that is. (Streaming from disk is nondeterministic
enough that the difference in latency between starting a 48
ksamples/s stream and a 96 ksamples/s stream becomes insignificant.)

Next, for each waveform, you need to request at least enough frames
to produce N buffers of finished output at the highest allowed pitch
for that waveform, where N is the feedback latency (in buffers)
caused by sending the request events "backwards" through the net.
Normally, N will be exactly 1.

When the requested data has been delivered to you, you're ready to
go. You're basically a standard sample player, and in fact, you could
just have requested *all* of the waveform data right away, and be
done with it - but then this wouldn't be all that interesting, would
it? :-)

So, you have only the first fraction of each waveform; just enough
that you have time to ask the disk butler plugin for the rest - and
get it - before you run out of data to play.

Since the disk butler knows what it's doing (acting as a streaming
server with QoS "guarantees"), it will obviously have some more data
already loaded and prepared for delivery. The disk butler is
the right place to keep track of the amount of buffering required for
streaming a suitable number of files at once, be it for traditional
hard disk audio playback, or direct-from-disk "zero latency" playback.

When you start playing a note, you simply start playing the data you
have (like any sampler), but at the same time you also advance your
"I may need data up to this point in the waveform, N blocks ahead"
mark, and tell the disk butler to send the data you do not already
have.

Of course, doing everything inside one plugin would be a bit easier,
but then you have a disk butler integrated into the sampler, which
may well end up fighting with the hard disk recorder's disk butler
over the disk... And of course, running *two* different
direct-from-disk samplers would be problematic as well. You would
effectively need one physical drive per plugin that streams from
disk, since standard file systems are terrible at handling multiple
tasks reading or writing simultaneously.

//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`---------------------------> http://olofson.net/audiality -'
.- M A I A -------------------------------------------------.
| The Multimedia Application Integration Architecture |
`----------------------------> http://www.linuxdj.com/maia -'
   --- http://olofson.net --- http://www.reologica.se ---



This archive was generated by hypermail 2b28 : Fri Dec 13 2002 - 20:52:06 EET