[linux-audio-dev] Re: Timed Event Editor Framework


Subject: [linux-audio-dev] Re: Timed Event Editor Framework
From: David Slomin (dgslomin_AT_CS.Princeton.EDU)
Date: Fri Aug 20 1999 - 23:24:51 EDT


On Sat, 21 Aug 1999, Andreas Voss wrote:

> Also the end user wants to edit e.g. the modulation wheel value and does
> not want to worry about its MIDI representation (LSB/MSB in different
> controller messages).

s/the end user/some end users/

I hate software that thinks it is always smarter than the user. There are
perfectly valid situations where a user may want to deal with raw messages
instead of compound pseudo messages. As a simple example, most of the
time, you would want note-on and note-off events to be combined into
note-with-duration pseudo events, but if you're addressing a drum machine,
it is perfectly valid to have note-ons without corresponding note-offs.
 
That said, I think it is equally important to support the pseudo messages
for common cases, as long as they can be toggled on and off by the user
(and by plugins... I already have an example that needs that capability).
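To make the pairing concrete, here's a minimal sketch (hypothetical names, not actual SongPad code) of combining raw note-ons and note-offs into note-with-duration pseudo events while tolerating unmatched note-ons, drum-machine style:

```java
import java.util.*;

// Hypothetical sketch: pair raw note-on/note-off events into
// note-with-duration pseudo events. Unmatched note-ons (e.g. drum
// machine hits) are kept rather than rejected.
class NotePairing {
    record RawEvent(int time, boolean on, int pitch) {}
    record Note(int start, int duration, int pitch) {}

    // Pair each note-on with the next matching note-off on the same
    // pitch; leftover note-ons become zero-duration notes.
    static List<Note> pair(List<RawEvent> raw) {
        List<Note> notes = new ArrayList<>();
        Map<Integer, Integer> pending = new HashMap<>(); // pitch -> start time
        for (RawEvent e : raw) {
            if (e.on()) {
                pending.put(e.pitch(), e.time());
            } else {
                Integer start = pending.remove(e.pitch());
                if (start != null) notes.add(new Note(start, e.time() - start, e.pitch()));
            }
        }
        // Unmatched note-ons are perfectly valid; keep them.
        pending.forEach((pitch, start) -> notes.add(new Note(start, 0, pitch)));
        notes.sort(Comparator.comparingInt(Note::start));
        return notes;
    }
}
```

A toggle is then just a matter of calling (or not calling) the pairing pass before handing the list to the editor.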

> What I would really like to see/build is a highly configurable, universal
> editor for abstract timed events. I'm thinking of an editor, that is aware
> of timed events that have (mostly) numeric parameters and allows the
> user to edit them. This editor should not know about the nature of the parameters,
> e.g. it should not be hardcoded, how to present the pitch-parameter of a
> midi note event to the user. This could be described by metadata
> (configuration file, java-beans etc). So this thing could be seen as a
> framework or library, from which you can build specialized editors for
> different tasks.

I love the concept and am all for it from a backend point of view.
However, I worry that it puts the user interface at risk, because it is
infinitely harder to make a general interface optimal than it is to make
a specific one. This view is backed up by the abundance of terrible
interfaces on supposedly professional software that tries to be general
purpose.

For example, Cakewalk was a fairly decent sequencer when all it
did was MIDI. When they tried to mix in audio editing features, their
MIDI support languished (they have not addressed any of the deficiencies
or problems with their MIDI interface in numerous versions) while their
audio support has never progressed beyond mediocre. Someone who valued
the quality of the software would use a better dedicated MIDI sequencer
and a better, separate, dedicated audio sequencer/mixer (like Pro Tools).

I'm all for sharing the backend; Java makes code reuse trivial, and even
in C, writing shared libraries is not a difficult task at all. I think
that the user interface, like with the Unix shell tools, should be small
and optimized to each task so that it can do it well.

<FLAME BAIT>Yes, I use vi, not emacs.</FLAME BAIT>

The reason I'm writing this bugger is because I'm fed up with suboptimal
interfaces specifically for MIDI editing, not because I want to create a
half-hearted attempt to handle everything. I really am shooting for the
truly perfect UI, to the best of my ability.

In any event, I'm done with the initial implementation of the core
backend, i.e., the event list structure. Java's object orientation made it
natural to write the backend in the generalized manner you suggested, even
if I wasn't shooting for a generalized frontend. Check out the class
derivation:

Object
   |
   +--Doubly Linked List (of List Elements)
   | |
   | +--Sequence (of Events)
   |
   +--List Element
          |
          +--Event
                |
                +--Event in a Track
                      |
                      +--MIDI Channel Event
                      | |
                      | +--Note On
                      | |
                      | +--Note Off
                      | |
                      | +--Note (Pseudo Event)
                      | |
                      | +--Controller
                      | | |
                      | | +--Raw Controller
                      | | |
                      | | +--Compound Controller (RPN, etc)
                      | |
                      | +--(etc)
                      |
                      +--Raw MIDI Sysex
                      |
                      +--SMF Lyric
                      |
                      +--CSound Note
                      | |
                      | +--Note with Pitch
                      | |
                      | +--Note for "foo" inst of "bar.orc"
                      |
                      +--Audio Clip
                      |
                      +--(etc)

True, I did not write an XML "SongPad event type description file" DTD and
parser, but you can easily add to the hierarchy in Java. (Oh yeah, THAT'S
what all this open source fuss is about.)
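The derivation above might translate into Java skeletons along these lines (illustrative class names and fields only, not the actual source):

```java
// Illustrative skeleton of the class derivation; names and fields
// are invented for the sketch, not taken from SongPad.
abstract class ListElement {}
abstract class Event extends ListElement { long time; }
abstract class TrackEvent extends Event { int track; }
abstract class MidiChannelEvent extends TrackEvent { int channel; }
class NoteOn extends MidiChannelEvent { int pitch, velocity; }
class NoteOff extends MidiChannelEvent { int pitch; }
class Note extends MidiChannelEvent { int pitch, velocity; long duration; } // pseudo event
abstract class Controller extends MidiChannelEvent { int value; }
class RawController extends Controller { int number; }
class CompoundController extends Controller { int parameter; } // RPN, etc.
```

Adding a new event type is then a single subclass, which is the point: no description-file parser needed to extend the tree.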

> Editing a csound .sco file and a MIDI file is very similar.

I take issue with this. In CSound you specify all parameters for a note at
the time that it is fired. In MIDI, you add or modify parameters over
time after the note has been fired. (This is done in a poorly scoped
manner... try changing the modulation amount for only one note out of a
chord played on a single channel.) This makes a big difference, at least
to me, in how they should be presented in a visual interface.
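A tiny model of that scoping problem, with invented names: a channel-wide control change necessarily hits every sounding note on the channel.

```java
import java.util.*;

// Minimal sketch of MIDI's per-channel (not per-note) controller
// scoping. Names are illustrative, not any real API: a CC on the
// channel applies to EVERY sounding note, so one note of a chord
// cannot get its own modulation value.
class ChannelScope {
    Map<Integer, Integer> modByPitch = new HashMap<>(); // sounding pitch -> mod value

    void noteOn(int pitch) { modByPitch.put(pitch, 0); }

    void controlChange(int modValue) {
        // Channel-wide: every sounding note gets the new value.
        modByPitch.replaceAll((pitch, old) -> modValue);
    }
}
```

A CSound score line, by contrast, carries all of a note's parameters in its p-fields at fire time, so the question of later, channel-scoped modification never arises.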

> Another application would be a mixer for audio files (e.g. mixdown
> singular drum instruments to a complete drum loop). Here you place the
> 'audio events' on time/instrument axis.

Well put, although unintentionally. The same backend could be used in
another SEPARATE application that would do that. This is Unix; let's do a
little multitasking. If you need to use the drum machine together with
the MIDI sequencer, they can be synchronized via MTC/SMPTE or any of a
number of other ways.

> Another useful event type in both worlds (midi + csound) would be a
> phrase event, an event that contains other events. This may be used
> - to structure a song
> - to use predefined patterns or song fragments out of a library
> - to invent new event types like a chord event (you place an Em7 somewhere
> instead of its single notes)
>
> There could be more esoteric event types, e.g. algorithm events that may
> generate new events or modify existing events.

I thought about this a lot, actually. What it boiled down to, for me, was
that this was best handled in a separate app or scripting language,
because if you integrated it, you ended up with things like MOD trackers
which were over-optimized for a single style of music.

Basically, to do this right, you'd want a complete set of scripting
functionality: conditionals, loops, functions with parameters, etc. These
don't fit well into the linear event list model that's at the core of the
sequencer.

Common Music (non-interactive) and JMix (interactive) are exactly the type
of separate apps I'm talking about which would cooperate beautifully with
the sequencer to fill this niche. I could reinvent the wheel here (which
I'm not, as a rule, averse to doing), but I have neither the interest nor
the expertise in algorithmic composition to do it properly.

> Think of a piano roll editor, a window where you can edit small rectangles
> (note events). x-axis is midi clock, y-axis is pitch. You could use this
> editor to edit any two dimensional data, e.g. you could use this editor to
> arrange a song by replacing the y-axis with the track-no and edit phrase
> events instead of note events. Or you can place MIDI keysignature events
> by replacing y with the 12 possible keys. Or edit the value of a
> controller over time, or note-velocity over note-pitch (e.g. to make high
> notes louder), or ...

I think you only got one of my older emails to the list. My piano roll
does have two types of graphs... one is a very highly optimized note
editor which handles only Note-on, Note-off, and Note-with-duration
events. This has to be handled carefully, because notes are the most
fundamental and critical part of the music (after time/rhythm itself).

The other is a general purpose graph for other controllers, where
the x-axis is always time, but the y-axis and color-axis are each
assignable to different editable parameters. I'm not yet sure if there
will need to be special cases here for certain types of controllers; I
personally tend to input and edit controllers from a list view not a piano
roll view.
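One way such assignable axes might look in Java (invented names): each axis is bound to an extractor function, so the graph itself stays ignorant of what parameter it is displaying.

```java
import java.util.function.ToIntFunction;

// Sketch of a graph whose y-axis and color-axis are each bound to an
// editable parameter via an extractor. Hypothetical names; the x-axis
// is always time and is omitted here.
class ParamGraph<E> {
    private final ToIntFunction<E> yAxis;
    private final ToIntFunction<E> colorAxis;

    ParamGraph(ToIntFunction<E> y, ToIntFunction<E> c) {
        yAxis = y;
        colorAxis = c;
    }

    int yOf(E event) { return yAxis.applyAsInt(event); }
    int colorOf(E event) { return colorAxis.applyAsInt(event); }
}
```

Reassigning an axis is then just swapping in a different extractor, with no special-casing per event type.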

At any rate, this arbitrary controller graph covers many of the cases you
have in mind. What it does not cover is those not tied to time. I have
not decided what to do about those yet, or if they're really in the domain
of what a sequencer should provide. After all, a sequencer is something
that handles sequences... lists of events that are tied to times.

> There should be a general property editor, that allows to access all
> properties of a selection. The metadata should provide a specialized GUI
> component for each parameter type, e.g. the value of a midi program change
> could be picked out of a list of instrument names, the value of the midi
> clock could be typed in as 'bar:step:tick' and so on. This property editor
> is independent of any specific event type and can be used for standard MIDI
> events (like note), esoteric events (like chord or algorithm), csound
> events (that may have many parameters) etc.

This sort of thing is commonly done in existing piano rolls: right-click
on a note or event to get a pop-up editor for that event.
My plan was to make it even easier to do this multiple times in succession
by having it presented in a stay-up window rather than a pop-up. But I
agree, this is important to have whether or not your editor is
generalized.

> The framework should provide services like
> - general GUI like undo/redo, selections, cut/copy/paste, etc
> - interfaces for different backends (like midi, csound)

Agreed.

> - metadata handling
> - GUI components like axis beans for different editors (pitch axis, midi
> clock axis, table axis for track/phrase arrangement window)
> - complete editors like rectangle-editor (piano roll, arrangement window),
> property sheet, event list, curve painter

Not agreed. Although the term "axis beans" is intriguing, I think that
it would be nearly impossible to make a bean specification that would
handle all cases optimally. Rather than try and fail, I'd rather do them
separately and get each one right.

> - simple algorithms (linear interpolation from start value to end value)

Algorithms tend to come under the category of plugins. This is another
issue I've thought a lot about. Keeping plugin-writing simple and
powerful is the main reason I'm so adamant about keeping the fundamental
data structure a single event list for the whole song (not even separate
lists for separate tracks). The algorithms are far simpler when you're
dealing with linear streams. I've lost a lot of hair because of CAL.
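As a sketch of how simple such a plugin can stay against a linear stream (hypothetical names; assumes the end time is strictly after the start time):

```java
import java.util.*;

// Sketch of the "linear interpolation" plugin idea over a single
// linear event list: one pass, emitting interpolated controller
// values between a start and end point. Names are invented.
class InterpolatePlugin {
    record ControllerEvent(long time, int value) {}

    // Assumes t1 > t0; emits a value every `step` ticks, inclusive.
    static List<ControllerEvent> interpolate(long t0, int v0, long t1, int v1, long step) {
        List<ControllerEvent> out = new ArrayList<>();
        for (long t = t0; t <= t1; t += step) {
            int v = v0 + (int) ((v1 - v0) * (t - t0) / (t1 - t0));
            out.add(new ControllerEvent(t, v));
        }
        return out;
    }
}
```

Because the whole song is one list, the plugin never has to merge per-track streams before or after the pass; it just splices the result in at the right times.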

> I have been thinking of writing such a thing in java, but progress is very
> slow. All I have so far is a few UML diagrams and some experimental GUI
> stuff ...

I know all about the slow progress stuff. This whole thing started for me
about four years ago when I thought I wanted a better notation program not
a better sequencer. I've been drawing mockups, writing specs, and
throwing away semifunctional coding attempts for it all this time.
That's part of the reason I sound so stubborn on some of the issues here
-- it's not really all about ego. :-) I've just distilled my list of
complaints with existing sequencers down to a plan which I think is
feasible to implement properly, rather than ending up with more
complaints.

[ Going back and editing this message before pressing "send", I realize
that I wanted a quick conclusion here, but actually started a whole new
message. Foo. :-) ]

Anyway, I think that once I get this done, we'll be well on our way to
having a really kick-ass music production system from the LAD community.
My personal needs for such would be:

1. A good MIDI sequencer. This is the most important for me, and as the
expression goes, if you want something done right, you have to do it
yourself. (Especially when nobody seems to agree with your definition of
"done right"!) That doesn't mean that I expect everyone to agree, but
Jazz and KuBase aren't about to disappear either.

2. A decent notation program. Although this is pretty important for me,
I'm afraid that it will be the one piece that remains missing from the LAD
music production system for the longest. For a while I had hoped that
Guido would fill the niche, but they've slow-rolled their promised release
of the source code for too long to rely on them.

And no, please don't mention Lilypond. Even if you have the patience to
learn the language (which I might if sufficiently motivated), that fool
thing is right near impossible to successfully install. This goes deeper
than just lacking automake; it has too many dependencies, and referring to
its internal architecture as spaghetti code is a serious understatement.

3. A solid, non-sequencing audio sample editor/recorder. It looks like
Snd is starting to become mature enough to fill this niche. I rank audio
effects as much less important than the core editing functions here.

4. A good non-realtime audio sequencer/mixer/compositor. The recent
progress on Mix sounds very promising here, although the nine track limit
sounds very artificial to me. Then again, I'm personally biased towards
Rt (also ported to Linux from SGI, with similar Motif troubles) which was
written by my prof.

To respond to another thread, I'd not be averse to writing a perl script
that would turn CSound into a compositor of this sort, reading a Mix, Rt,
or custom format script file. However, I'm a little busy with the
sequencer right now, and there are other people here who are much better
at CSound than me (most of you, in fact!). This might also fill the niche
sufficiently, especially if you then added an optional GUI (not
prohibitively hard in Tk, for instance).

5. A powerful synthesis engine. Here, the LAD community rules... we have
CSound, a mature program which can do nearly everything you'd want a
synthesis engine to do, but we also have quite a few other powerful apps
in this category, some mature and some under active development!

6. Lastly, a good realtime algorithm engine. This is not as essential as
the others for me, but it is still a very useful thing to have. Although
I haven't learned how to work it yet, JMax sounds like it fills this niche
rather nicely.

Once we have robust apps to fill these 6 niches (and assuming that they
all play well together with IPC and drivers and such), Linux will truly be
a formidable music production system!

Div.



This archive was generated by hypermail 2b28 : Fri Mar 10 2000 - 07:25:52 EST