Re: [linux-audio-dev] VST link (open?)


Subject: Re: [linux-audio-dev] VST link (open?)
From: Paul Davis (pbd_AT_Op.Net)
Date: Wed Jan 23 2002 - 03:28:50 EET


>Dustin Barlow wrote:
>
>> I agree. So let me rephrase the question. Is there any movement or
>> interest in having a multi-node audio system that can be controlled from a
>> master host that uses a communication protocol that will provide more than
>> just sync info? For example, it would be really sweet to have a node that
>> just does effects processing and has no need to waste any of its cycles
>> driving a gui. It could just be a "server" in the same sense that a
>> black-box commercial effects unit would be.
>
>This is exactly what DMIDI is designed to do: separate the processing
>from the control surface.

anything based on MIDI is clearly completely unacceptable for a
general control protocol. the data resolution is too coarse if you
stick with 7 bit values; with 14 bit values, there are some
*hellacious* parsing problems that were clearly not thought out at the
time that Sequential, Yamaha and others were defining MIDI. this has
long been accepted in the "academic" music world, where the
limitations of MIDI have been clear for a long time.

if you look at any of the existing plugin APIs, they all use 32 bit
floating point values (some of them normalize the values to a -1..+1
range) for control data. MIDI can't handle this well at all.

MIDI works reasonably well for distributing relatively sparse
sequences of control messages, and traditional twelve-tone musical
performance data. But that's about it. No existing control system that
I know of prefers to use MIDI as its control protocol, though several
of them do as a second best when USB or direct serial interfaces are
not available.

>I've not looked at any SMPTE sync'ing, only MIDI data.

To repeat something I seem to have to say over and over again
(paraphrasing the ProTools manual here):

there are 2 kinds of synchronization:

 a) how fast are we going?
 b) where are we and what direction are we moving in?

a digital audio system cannot operate with only one of these and be
said to be "synchronized". (a) is typically satisfied with a
distributed word clock signal (or superclock in ProTools systems),
which is a clock signal typically running at either the sample rate
(on the order of tens of kHz) or, for superclock, tens of MHz. (b) is
satisfied by SMPTE, MTC or the proprietary ADAT sync signal, and
typically has low temporal resolution (on the order of 1/30 second). If
you have only (b), you cannot ensure that the different audio
interfaces in a system are running at the same speed; if you have only
(a), you cannot have one node in a network follow another from place
to place in "time".

if you try to answer (a) with the system used for (b), you have to
build a phase-locked loop, and the system is intrinsically unstable
when viewed at the kinds of real-time latency numbers we like to talk
about on this list. the resolution of the clock is too coarse, and
departures from a uniform signal too random and often erratic (such as
when transport controls are active).

--p



This archive was generated by hypermail 2b28 : Wed Jan 23 2002 - 03:20:01 EET