Why Use UST?
By Chris Pirazzi. Some material stolen from
Say you want to play audio and video in sync to within ±7ms. You
may be wondering why you cannot just do this:
    /* Best Effort Approach */
    start_audio();    /* need to do these two quickly */
    start_video();
    while (more_data) {
      /* send audio and video that should go out together */
      send_audio_data(); send_video_data();
    }
The idea is that you open up an audio port and a video path at the
same time, and then you loop around sending corresponding audio and
video data to each device. Assume there are plenty of system
resources for you to send all the data without dropping anything.
This technique will still fail to sync up the audio and video, because:
- Your process may execute start_audio(), and then the kernel may
choose to preempt your process to run some other process or handle
some interrupt. Some time later (possibly more than 7ms later), your
process will again run and execute start_video(). You will have
started audio and video out of sync.
More generally, any attempt to do two operations "at the same time"
from an IRIX process will fail because your process may be preempted
between the operations. Any attempt to do something "just as soon as"
your process wakes up, or "just as soon as" some status you are
polling changes, will fail for the same reason.
The only exception to this is if you are using the REACT/Pro real-time
support (see Seizing Higher Scheduling Priority), and even that only
provides an upper bound on the amount of time for which your code can
be preempted.
- Say you could somehow guarantee that your process will execute
start_audio() and start_video() with no intervening preemption. These
routines must set up the device, allocate memory, and perform other
operations which could easily vary by 7ms in duration depending on the
device and the state of your system. For example, many VL devices
take a whole frame time to start up. This will again throw your audio
and video out of sync.
- Say you could somehow fix that. You still have a problem: the
delay between when you send data to a device and when that data
actually hits the output jack is variable and device-dependent. So is
the delay between when some new data arrives at an input jack and when
your program hears about it (either by being unblocked or by polling
the device). For example, D/A and A/D converters in audio subsystems
have fixed delays, and audio kernel software may add other variable
delays. Many of SGI's video subsystems have on-board field buffers
which affect the input and output delay you will see. The variability
in delay between different VL devices, or between configurations of
the same VL device, can be multiple video field times.
- Say you manage to finesse that somehow. Now you face the most
subtle problem of all: clock drift. Assume the audio and video you
are playing come from a movie file. The movie file stores a fixed
number of audio frames (say 44100) for a fixed number of video fields
(say 50). But your audio and video devices may not be operating at
exactly the ratio (44100/50). Check out Clocking and Clock Drift:
Relating Multiple MSC or UST Clocks for exact details on the problem
and its solution.
The UST support in SGI's libraries addresses all of these issues:
- The pseudocode above failed because it required an IRIX process to
perform two tasks simultaneously (or atomically). SGI's UST support
gives you a toolkit of basic atomic operations:
- Input Timestamping
- Input some data at a jack of the machine, and
- snap the value of the UST clock.
- Output Timestamping
- Output some data at a jack of the machine, and
- snap the value of the UST clock.
- Output Scheduling
- At the instant when the UST reaches some value,
- output some data at a jack of the machine.
In each case, the library pairs the data with its UST in memory, so
your program can manipulate them together. Essentially, SGI's UST
support transforms hardware operations that are atomic in time into
software structures that are paired in memory. You can use these
primitives to perform a wide range of synchronization tasks (some of
which are listed in Introduction to
UST and UST/MSC).
- The pseudocode above could only successfully synchronize audio and
video if it knew exactly how long start_audio() and start_video()
took. The UST support lets you measure exactly when your data came in
or will go out, so you can synchronize recording or playback without
having to know those delays in advance.
- The USTs which SGI libraries pair with data are always USTs at the
jack of the machine. If a given subsystem adds an internal processing
delay to input or output (A/D or D/A delay, field buffers, etc.), that
delay will be factored into the returned UST.
- Finally, the UST support lets you measure the exact rate ratio of
any two sampled devices, whether their clocks are locked or not. In
the example above, you need only compare the UST difference of two
sufficiently spaced audio frames with the UST difference of two
sufficiently spaced video fields. This is the information you need in
order to tweak your data to fit the playback devices. See Clocking and Clock Drift: Relating
Multiple MSC or UST Clocks for more information.