To move visual objects on the screen, or to trigger audio objects and synchronize them with one another, Alambik provides two approaches:
· 1. Duration-based synchronization
· 2. Event-based synchronization
|Alambik's Duration-Based sequencer|
Alambik's sequencer facilitates the creation of ordered audiovisual
projects, usually linear, called "clips".
|1.1. The basic principle|
A timeline (shown here in green) represents the progression
of time; that is to say, the overall duration of a clip (".alg"
or ".hdv").
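The timeline idea can be sketched in Python. This is an illustrative stand-in, not Alambik's actual ".alg" clip format: the `Clip` class, its `at` method, and the event labels are all hypothetical, and a real clip would hold visual, audio, and other tracks rather than plain strings.

```python
# Hypothetical sketch of a duration-based timeline: every event is
# scheduled at an absolute time (in seconds) on one shared timeline.
from dataclasses import dataclass, field

@dataclass
class Clip:
    duration: float                       # overall clip length in seconds
    events: list = field(default_factory=list)

    def at(self, t, action):
        """Schedule an action at time t (seconds) on the timeline."""
        if not 0 <= t <= self.duration:
            raise ValueError("event falls outside the clip's timeline")
        self.events.append((t, action))

    def play(self):
        """Run events in timeline order; here we just return their labels."""
        return [action for t, action in sorted(self.events)]

clip = Clip(duration=10.0)
clip.at(2.0, "show character")
clip.at(5.0, "knock sound")
clip.at(1.0, "fade in")
```

Whatever order events are declared in, playback follows the timeline: `clip.play()` yields the fade-in first, then the character, then the knock.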
|1.2 Selecting a Timeline|
For optimal flexibility, Alambik allows you to partition your Timeline
according to the kind of production you want to create. You can therefore
divide it either in seconds (internal_timer) or in audio markers (audio,
music, or speech_timer).

Partitioning in seconds is best used when the clip you're building relies
on animating images which move in relation to time, measured here in
seconds. This works well for projects in which visual events are assigned
sounds after the visual part of the animation has already been built.

Partitioning in audio markers is best adapted to projects consisting of
sequences in which you want to manipulate visuals according to the audio
signal.
To create talking characters, you can partition the Timeline according to the position of each spoken word in time. The animation of your character's arms, for example, or of his body, can be defined according to the Timeline and thereby take place in precise synchronization with the words he speaks. In addition, facial animations such as mouth movement and chin waggling, as well as emotive facial expressions, can be automatically synchronized by Alambik according to the phonemes pronounced and their amplitude, as well as by expression markers inserted into the text in the case of Text-to-Speech (feature to be released).
|1.3 Automatically-triggered Events|
The Alambik sequencer lets you automatically trigger an ordered series
of events.
Let's imagine you're creating a simple project in which a character
steps up to the front door of a house and knocks. To design this scene
using a traditional approach, the project creator would have to carefully
consider elapsed time (in seconds) to determine the exact duration between
the appearance of the character on screen and the instant he begins
knocking. At that precise instant, the sound resulting from the character's
hand striking the door would then be recorded by the sound recordist
with the aid of a microphone.
As in real life, any sound effects arising from movement will automatically
follow any changes to the movements' speed. Indeed, we can consider
the usefulness of this feature from the point of view of a movie director.
Let's say this director often likes to compare the same scene played
by his actors at various rhythms, in order to choose the take
which best conveys the emotion he wants to get across. Imagine, in this
case, that all sound effects had to be positioned on a Timeline measured
in seconds: our director would be forced to manually reposition each
effect every time he changed the rhythm of a scene. Even if he could
somehow modify the speed of his "internal timer" to adjust
the pace of sound effects, he would be constrained to make only linear
changes in speed - he could not vary the rhythm, for example, to take
on different speeds at different points. Furthermore, once he changes
rhythm he would lose the reference value of his Timeline, which was
originally broken down into seconds. Through triggering with images,
Alambik solves all these problems.
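The contrast between the two approaches can be sketched in Python. The functions and frame numbers below are hypothetical, but they show the key property: an effect bound to an image (frame) keeps its place when the rhythm changes, whereas one bound to a second does not.

```python
# Illustrative sketch (not Alambik's API): a sound effect is attached to
# a frame number rather than to an absolute time in seconds, so changing
# the playback rhythm never desynchronizes it from the image.

def effect_time(frame, fps):
    """Time in seconds at which an effect bound to `frame` fires."""
    return frame / fps

# The knock is bound to frame 120, not to a hard-coded second.
knock_frame = 120
normal = effect_time(knock_frame, fps=24)   # scene at its normal rhythm
fast = effect_time(knock_frame, fps=48)     # same scene played twice as fast
```

At 24 frames per second the knock fires at 5.0 s; at 48 it fires at 2.5 s, still exactly when the hand meets the door, with no manual repositioning.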
1.3.1. The basic principle:
The Alambik Editor includes a utility called the Audio Synchronization Tool which generates a unique marker corresponding to every line of a realtime-rendered audio file.
1.3.2. Synchronizing with pre-rendered audio
files (".MP3", ".WAV", ".OGG"):
This mode functions by digitally analyzing the sound signal in real time.
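One common way such analysis can work is to place a marker wherever the signal's amplitude rises past a threshold. The sketch below is a hedged illustration of that general technique; Alambik's actual analysis method is not documented here.

```python
# Illustrative marker detection: scan amplitude samples and emit a
# marker index at every upward crossing of the threshold (a new audio
# event). This is a generic technique, not Alambik's implementation.

def detect_markers(samples, threshold=0.5):
    markers = []
    above = False
    for i, s in enumerate(samples):
        if abs(s) >= threshold and not above:
            markers.append(i)      # rising edge: a new audio event begins
            above = True
        elif abs(s) < threshold:
            above = False
    return markers

signal = [0.0, 0.1, 0.8, 0.9, 0.2, 0.0, -0.7, -0.1]
```

For this signal, `detect_markers(signal)` finds the two rising edges, at sample indices 2 and 6.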
Another mode lets you manually set the markers you want, by means of three different kinds of peripherals:
· The computer keyboard, certain keys of which can be assigned
to precise events,
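Manual marker capture can be sketched as follows. The key names, event names, and `capture_markers` function are all hypothetical stand-ins for the Alambik Editor's actual bindings: each simulated key press records the current playback time as a marker for its assigned event.

```python
# Illustrative sketch: keys are bound to event names, and each press
# records the playback time at which it occurred as a marker.

def capture_markers(key_events, bindings):
    """key_events: list of (time_in_seconds, key).
    bindings: maps a key to the event name it is assigned to."""
    markers = {}
    for t, key in key_events:
        if key in bindings:
            markers.setdefault(bindings[key], []).append(t)
    return markers

presses = [(1.2, "a"), (3.4, "b"), (5.0, "a"), (6.1, "x")]   # "x" unbound
bindings = {"a": "drum_hit", "b": "door_knock"}
result = capture_markers(presses, bindings)
```

Here `result` maps "drum_hit" to the marker times [1.2, 5.0] and "door_knock" to [3.4]; the unassigned key is ignored.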
After looking at how different "series of events" can be set in a
Timeline partitioned either in seconds (internal_timer) or in audio
markers (audio, music, or speech_timer), we are now going to discuss
how to call procedures based on different kinds of trigger events:
1.3.3. Audio events:
1.3.4. Events triggered by the collision of
visuals (to be implemented):
|1.4 Chaptering a clip (".Alg" or ".Hdv" files)|
Chaptering a clip means placing markers in a chapter track
which let the user move to precise marked locations during the playback
of a clip.
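A chapter track can be pictured as a sorted list of (time, name) markers. The sketch below is a hypothetical model, not the internal ".alg"/".hdv" structure: it shows how markers support both jumping to a chapter and reporting which chapter playback is currently in.

```python
# Illustrative chapter track: markers are (start_time, name) pairs.
import bisect

def chapter_start(chapters, name):
    """Return the seek time for a named chapter marker."""
    for t, n in chapters:
        if n == name:
            return t
    raise KeyError(name)

def current_chapter(chapters, t):
    """Return the name of the chapter containing playback time t."""
    times = [c[0] for c in chapters]
    return chapters[bisect.bisect_right(times, t) - 1][1]

chapters = [(0.0, "intro"), (12.5, "scene 1"), (40.0, "credits")]
```

With these markers, seeking to "scene 1" returns 12.5 s, and a playback position of 20.0 s reports "scene 1" as the current chapter.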
|1.5 Practical Example|
Let's create an audiovisual clip. It will include all the different
tracks which control the behavior of their assigned elements. This clip
is based on a "unique Timeline," which will "synchronously"
give rhythm to all of its tracks (i.e., using duration-based synchronization).
As mentioned above, a clip can include four different kinds of tracks:
After having defined the basic form of our clip, we can now assign specific objects to each track.
as well as, coming soon,
Conventions for the track control string variable:
$ALL_TRACKS = "111111" plays all tracks.
"Selected Play" mode for a clip containing six tracks, three of which are non-controllable:
$SELECTED_TRACKS = "11X1XX" plays tracks 1, 2, and 4; the three tracks marked "X" are non-controllable.
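The interpretation of the control string can be sketched in Python. This is an assumption-laden illustration: the exact semantics Alambik gives to characters other than "1" (e.g. "0" versus "X") are taken here as "not played".

```python
# Illustrative parse of a track-control string such as "11X1XX":
# '1' marks a track to play; any other character ('X', '0', ...) is
# treated here as not played. (Assumed semantics, not Alambik's spec.)

def tracks_to_play(control):
    """Return the 1-based indices of tracks enabled in the string."""
    return [i + 1 for i, flag in enumerate(control) if flag == "1"]

all_tracks = tracks_to_play("111111")      # the $ALL_TRACKS case
selected = tracks_to_play("11X1XX")        # the $SELECTED_TRACKS case
```

For "111111" all six tracks play; for "11X1XX" only tracks 1, 2, and 4 do.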
Let's take as an example a clip, in this case an Alamgram, in which the
synchronization is duration-based. The diagram above illustrates the
script that follows from our example. The Timeline will be partitioned
in seconds, by selecting "internal_timer."
|Alambik's Event-based Synchronization|
Event-based synchronization is non-linear; that is to say, it is useful
for productions containing animated objects or events which are triggered
by outside actions (such as the keyboard, mouse, etc.). When you want
to synchronize your script based on events, but you don't know precisely
when they will occur, Alambik lets you use:
|2. 1. The Event-based sequencer|
Generally speaking, the event-based sequencer is preferable for creating
projects in which objects are animated (or events are triggered) by
pre-defined exterior events (keyboard, mouse, etc.) and when you don't
know precisely when they will occur.
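The general shape of an event-based sequencer can be sketched in Python. The `EventSequencer` class and its event names are illustrative stand-ins, not Alambik syntax: procedures are bound to external events and run whenever the event fires, at a moment that cannot be known in advance.

```python
# Illustrative event-based sequencer: actions are bound to named
# external events (keyboard, mouse, ...) rather than to timeline
# positions, and run only when their event actually fires.

class EventSequencer:
    def __init__(self):
        self.handlers = {}
        self.log = []          # records actions in the order they ran

    def on(self, event, action):
        """Bind an action to an external event."""
        self.handlers.setdefault(event, []).append(action)

    def fire(self, event):
        """Simulate the event occurring; run its bound actions."""
        for action in self.handlers.get(event, []):
            self.log.append(action)

seq = EventSequencer()
seq.on("mouse_click", "open door")
seq.on("key_space", "character jumps")
seq.fire("key_space")      # order of firing is decided by the user,
seq.fire("mouse_click")    # not by the script
```

The resulting log, ["character jumps", "open door"], reflects the order in which the user acted, not the order in which the bindings were declared.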
Here is an example:
|2. 2. "Classic" programming-based animation|
This mode, a classic approach to animation, makes use of logical statements
and mathematical operators. It should be reserved for cases in which
use of the Alambik sequencer is not appropriate. For example:
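A typical instance of this classic approach is computing an object's position each frame with a mathematical formula rather than placing it on a sequencer track. The function below is an illustrative sketch in Python, not Alambik code.

```python
# Classic programming-based animation: the position is a pure function
# of time, computed with mathematical operators every frame.
import math

def bounce_y(t, floor=0.0, height=100.0, period=2.0):
    """Vertical position of a bouncing object at time t (seconds)."""
    return floor + height * abs(math.sin(math.pi * t / period))

start = bounce_y(0.0)    # object on the floor
peak = bounce_y(1.0)     # object at the top of its bounce
```

At t = 0 the object sits on the floor (0.0); halfway through the period (t = 1.0 s) it reaches its full height of 100.0. Logic like this is written directly in code, with no timeline or marker involved.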