If you don't know Python, you get to learn some. This is unlikely to hurt much. If you only want to accomplish simple tasks, you can probably cut-and-paste them together from examples in this document. (If you want to do something complicated, you would have had to learn some programming or scripting language anyway.)
Python cares about indentation. (This is just about the only stupid thing about Python, and I suffer it willingly, but I don't pass up any opportunity to complain either. :-) If you're writing or editing a Python source file, you must keep the indentation consistent. If two lines are indented the same distance in an example, they'd better be indented the same distance in your code.
You can use either the tab key or the space bar to indent lines, but don't mix them in the same file. That leads to pain.
(Note: these rules are overly conservative. I am simplifying for the sake of the Python newbie. If you know how to mix tab indentation and space indentation safely, for heaven's sake, go ahead.)
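If you're curious what the "pain" looks like, here it is in plain Python (nothing Boodler-specific): Python 3 flatly refuses source whose indentation mixes tabs and spaces ambiguously.

```python
# Two lines that "look" equally indented -- one with a tab, one
# with eight spaces. Python 3 raises TabError rather than guess.
mixed = "if True:\n\tx = 1\n        y = 2\n"

try:
    compile(mixed, "<example>", "exec")
    result = "compiled"
except TabError:
    result = "TabError"

print(result)  # TabError
```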
Back to Boodler.
A note is a particular sound file -- an AIFF, WAV, or AU -- played at a particular time. You can also specify a pitch, a volume, and some other parameters; or you can leave these at defaults.
An agent is a piece of Python code that schedules notes and other agents. The agent decides which notes to play when, and which agents to start up when. An agent can also create channels.
A channel is a collection of notes, agents, and channels. When Boodler starts up, there is exactly one channel, the "root channel". An agent can create new channels inside the root channel -- and then go on to create channels inside those channels, and so on. The channels thus form a tree, with a single root.
The initial agent -- the one you specify first on the command line -- runs inside the root channel. Normally, when an agent schedules notes (or other agents), the notes are played (and the agents run) in the same channel as the original agent. However, an agent can create a new channel, and then place notes and other agents inside the new channel.
Why all this foofaraw with channels? Because you can make a change to a channel, and that change will affect every note inside that channel -- including notes in channels inside channels inside it. (And so on.) For example, you can mute a channel (set its volume to zero) and thereby silence it and all its contents. Or you could set its volume to 50%, which would halve the volume of all its contents. Or you could simply kill the channel; this would stop (and destroy) all notes and agents that are running inside it.
(The root channel, of course, ultimately contains every note, agent, and channel. So if you kill the root channel, Boodler shuts down.)
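To make the arithmetic concrete, here is a toy model of the channel-tree idea. This is not Boodler's actual implementation or API -- just a sketch of how a channel's volume scales everything nested inside it.

```python
# A toy model of the channel tree (illustrative only): each
# channel scales the volume of everything inside it, including
# nested channels, all the way down.

class Channel:
    def __init__(self, volume=1.0, parent=None):
        self.volume = volume
        self.parent = parent

    def effective_volume(self):
        # A note's final volume is the product of the volumes of
        # every channel above it, up to the root.
        vol = self.volume
        if self.parent is not None:
            vol *= self.parent.effective_volume()
        return vol

root = Channel()
rain = Channel(volume=0.5, parent=root)   # a channel inside the root
drips = Channel(volume=0.5, parent=rain)  # a channel inside that

print(drips.effective_volume())  # 0.25: both ancestors halve it
root.volume = 0.0                # mute the root...
print(drips.effective_volume())  # 0.0: ...which silences everything
```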
Notes are always scheduled by agents, and agents run at a particular time. Therefore, when you specify a time, you always specify a relative time; 1.0 means "one second from now". The "now" is the time at which the agent runs.
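A sketch of the bookkeeping, in plain Python (the function name here is illustrative, not Boodler's API):

```python
# Delays are relative: 1.0 means "one second after the moment
# this agent runs". Converting to absolute time is just addition.

def schedule(agent_run_time, delay):
    # The "now" for a scheduled event is the scheduling
    # agent's own run time.
    return agent_run_time + delay

# An agent running at t=10.0 schedules a note at delay 1.0:
note_time = schedule(10.0, 1.0)  # t=11.0

# It also schedules a child agent at delay 2.0. That agent's
# "now" is t=12.0, so a delay of 1.0 from *it* lands at t=13.0:
child_time = schedule(schedule(10.0, 2.0), 1.0)
```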
Sadly for the musically trained in our audience, pitch is not given in familiar musical notation -- you do not specify "D" or "A flat". (This might not even be meaningful; what does it mean to play a rainstorm in B minor?) Instead, you specify pitch as a frequency multiplier. The value 1.0 means to play the sound in its original pitch -- the same pitch that is heard in the sound file it is taken from. The value 2.0 means to play the sound at twice the frequency. The value 0.5 means half the frequency, and so on.
The physics of musical sound indicate that for every doubling of frequency, the pitch rises by one octave. Pitch 0.5 is an octave down; pitch 2.0 is an octave up; pitch 4.0 is two octaves up; pitch 8.0 is three octaves up.
Since the entire waveform of the sound is sped up or slowed down, the duration of the sound changes. A sound played at pitch 2.0 will last half as long as the same sound played at 1.0; the same sound at pitch 0.25 will last four times as long. (This corresponds with your intuition that slowed-down sounds last longer. Just like on a record player that's running down. Um, remember record players? Sigh.)
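The pitch arithmetic is easy to check in plain Python (just math, no Boodler API involved):

```python
import math

# Pitch is a frequency multiplier: each doubling raises the
# sound one octave, and resampling also rescales the duration.

def octaves(pitch):
    # How many octaves a pitch multiplier shifts the sound.
    return math.log2(pitch)

def played_duration(original_seconds, pitch):
    # Speeding a sound up by some factor shortens it by
    # the same factor.
    return original_seconds / pitch

print(octaves(2.0))    # 1.0: one octave up
print(octaves(0.5))    # -1.0: one octave down
print(octaves(8.0))    # 3.0: three octaves up

print(played_duration(4.0, 2.0))   # 2.0: half as long
print(played_duration(4.0, 0.25))  # 16.0: four times as long
```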
You can specify any pitch greater than (but not equal to) zero. However, a pitch very close to zero, or very high, will be inaudible.
It is legal to play sounds at a volume greater than 1, but you risk overdriving the digital sound stream, which produces horrible static (clipping noise). In general, you should not do this.
For this to work, the sound must have looping information embedded in it. (At this time, Python's sound-loading modules can only extract looping information from AIFF files. And not all AIFF files have loop parameters.)
Basically, the looping info specifies a segment of the sound which can be repeated over and over to extend its duration. This works better for some sounds than for others. For example, a trumpet blare can easily be looped -- the beginning isn't much different from the end, so looping a segment isn't noticeable. A long roll of thunder can probably be looped, but the grumbles might start to sound repetitive if the sound is extended too far. A plucked guitar string would not loop very well at all; the sound dies off smoothly, so looping any segment would lead to an audible "seam".
Some of the sounds in the Boodler sound library are looped, and some are not. It depends on how suited they are to loop-extension.
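The arithmetic of loop-extension can be sketched like so (an illustration of the idea, not Boodler's actual loader; the trumpet numbers are made up):

```python
# A looped sound has an intro, a repeatable middle segment, and
# a tail; the middle repeats to fill out the requested duration.
# Lengths are in sample frames to keep the arithmetic exact.

def loop_repeats(total_frames, intro_frames, loop_frames, tail_frames):
    # How many times the loop segment must repeat so that
    # intro + repeats * loop + tail covers the requested length.
    body = total_frames - intro_frames - tail_frames
    return max(0, -(-body // loop_frames))  # ceiling division

# A trumpet blare at 44100 frames/sec: 0.5s intro, 0.4s loopable
# middle, 0.5s tail, extended to fill 5 seconds:
print(loop_repeats(220500, 22050, 17640, 22050))  # 10
```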
(Note that these changes are done by muting one speaker or the other. A mono sound played at pan 1 will sound quieter to the listener than the same sound at pan 0: the left speaker has been silenced, but the right has been left unchanged. It might make more sense to double the volume of the right speaker, but that would run into clipping problems, so we don't bother.)
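One plausible way to write down this pan-by-muting rule (the exact gain function is my assumption, not lifted from Boodler's source):

```python
def pan_gains(pan):
    # Pan a mono sound by muting one speaker; pan runs from
    # -1.0 (full left) through 0.0 (center) to 1.0 (full right).
    # Moving right attenuates the left speaker and leaves the
    # right speaker untouched, and vice versa.
    left = 1.0 - max(0.0, pan)
    right = 1.0 + min(0.0, pan)
    return (left, right)

print(pan_gains(0.0))   # (1.0, 1.0): centered, both speakers full
print(pan_gains(1.0))   # (0.0, 1.0): left silenced, right unchanged
print(pan_gains(-0.5))  # (1.0, 0.5): partway left
```

Note that this matches the quieter-at-the-edges behavior described above: at pan 1 the total output is half what it is at pan 0, because nothing is ever boosted, only muted.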
A stereo sound is treated as two separate mono sounds: one at pan -1, one at pan 1. This makes intuitive sense; we hear the left channel of the sound only through the left speaker, and the right channel only through the right.
Unfortunately the picture gets more complicated when you try to pan a stereo sound. If you play a stereo sound at pan 1, the left channel is now centered (pan 0), while the right channel is at 2, which is treated the same as 1, meaning all the way right. The sound will appear to have shifted right, but only partially.
To make things worse, you can contract a stereo sound as well as shift it. For example, you could take the original sound and play the left channel at -0.5, and the right channel at 0.5. The sound is still centered, but the two channels are closer together than before. You could even contract it completely, merging both channels at pan position 0.
Or you can combine a contraction with a shift. If you contract a stereo sound by a factor of 0.5, and then shift it -0.25, you'll wind up with the left channel at -0.75 and the right channel at 0.25.
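The whole transform fits in a few lines of plain Python (a sketch of the arithmetic, not Boodler's code):

```python
def transform_pan(pan, shift=0.0, scale=1.0):
    # A one-dimensional affine transform on pan position:
    # contract around the center by `scale`, then shift.
    new_pan = shift + scale * pan
    # Positions beyond the speakers are treated as fully
    # left or fully right.
    return max(-1.0, min(1.0, new_pan))

# A stereo sound is a left channel at pan -1 plus a right
# channel at pan 1. Shift it right by 1: the left channel
# lands at 0, and the right channel (at 2) clamps to 1.
print(transform_pan(-1.0, shift=1.0))               # 0.0
print(transform_pan(1.0, shift=1.0))                # 1.0

# Contract by 0.5 and shift by -0.25, as in the example above:
print(transform_pan(-1.0, shift=-0.25, scale=0.5))  # -0.75
print(transform_pan(1.0, shift=-0.25, scale=0.5))   # 0.25
```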
The especially geeky will recognize this as a one-dimensional affine transform, similar to the 2D transforms used in PostScript or the 3D transforms used in computer graphics modelling. Is this not overkill for playing sounds in stereo?
Well, maybe. The advantage is this: you can treat an entire soundscape -- a Boodler channel -- as a very complex stereo sound. It may have a mixture of mono and stereo sounds, each played at a different pan position. But this entire range of positions can still be shifted or compressed, just like a single sound.
This is, in a nutshell, the power of Boodler. You can build up a complex effect in a channel, and then use it as a single component in another complex effect.