[linux-audio-dev] more audio client/server considerations (client acts as engine)


Subject: [linux-audio-dev] more audio client/server considerations (client acts as engine)
From: Benno Senoner (sbenno_AT_gardena.net)
Date: Thu Oct 14 1999 - 17:22:49 EDT


Hi,
some time ago Paul said it would be nice to run one client audio app
as an engine for another client app:

let's say: app1 provides an LP filter
and app2 wants to use this filter.
But this filter is running in app1's thread.
As long as there is only point-to-point communication, that's OK with me,
but do you plan to allow arbitrarily long chains?
For example: app1 generates audio, then sends the data to app2, which
applies an LP filter; app2 then sends the result to app3, which applies
an EQ.

The problem could be latency: if one app fails to meet a deadline, the
whole chain misses it. Too many apps could also cause excessive scheduling
overhead (but I'm confident that with 10 running audio threads there are
no major scheduling problems).
The idiotic case would be a plugin chain made up of 10 plugins,
5 of them running in app1 (the ones in the even positions) and the other 5
running in app2 (those in the odd positions).
That would cause 10 context switches per processed block.
Run a few of these idiotic chains and the scheduling overhead will ruin
the whole CPU performance.
 
I think the ideal solution is still to have point-to-point communication
between the clients (audio applications) and the audio server.

There could be two classes of clients:
1) the client processes the audio/event data and hosts plugins in its own
thread, and sends only the final result (audio data / MIDI events) to the
audio server, which mixes together/routes the data from/to the clients.

2) the client delegates most of its functions to the engine;
the plugins are hosted in the engine thread. That means
audio data from the client is sent to the server, processed by the
engine's plugin net, and then mixed/routed as with the clients of the 1)
case.

Now an additional question:
David said "sensor threads" should run at the highest priority (so they
can preempt the engine) in order to get accurate timestamping.

IMHO it would be better to run all timestamp-sensitive sensor tasks (MIDI
input etc.) in one single thread, to reduce scheduling overhead.

Or are there good reasons for not going this way?

So what would be a typical figure for the number of running threads in an
environment with MIDI in/out, PCM in/out, the server engine, and one
client app?

Threads sorted by priority, highest priority first:

MIDI i/o thread: select()s on the MIDI fds and reads/writes data to the
event system
server engine: read()/write() of PCM data, syncs with the clients,
processes plugins and mixes the results together
client app: reads/writes PCM data/event data from/to the engine
(and of course does some processing on the data).

Note that with this priority scheme, a flawed MIDI out like the one on the
SB AWE64 could cause audio dropouts in the audio engine, since its
MIDI out driver does busy-waiting.
But we assume that we run our engine on non-flawed hardware.

regards,
Benno.



This archive was generated by hypermail 2b28 : Fri Mar 10 2000 - 07:27:59 EST