Like so many other facets of the broadcast industry, the role of the
contract engineer continues to evolve in step with technological advancement.
One has only to look back 20 years or so to recall a time when
computers of any type were likely to be found only in a radio station's
sales or business departments. In those days, technical upgrades to the
broadcast facility were likely to appear as a new piece of equipment
that was simply plugged in to an outlet and wired into the audio chain
using XLR audio connectors or barrier strips. Today's realities are
different. Now the issue is not so much where to put the new box, but
how to best integrate the new “solution.” Instead of
pondering simple questions like balanced or unbalanced audio and remote
control requirements, engineers now face multiple issues when
introducing new types of digital audio systems into the broadcast
facility.
Integration at the hardware level is the first step.
There are three major areas of concern when it comes to integrating
new hardware/software platforms into an existing facility. The first is
connectivity at the hardware level. Most studio and production tools
are available with both digital and analog I/O, but interfacing them is
sometimes problematic. While AES and S/PDIF digital inputs are more or
less the standard interchange, consumer-oriented equipment with optical
interfaces is also encountered. The picture gets even more complicated
when considering sample rates. Existing AES equipment usually employs
48kHz, while CD players most commonly employ 44.1kHz. If you're
connecting everything through a digital console with selectable input
types and sample rates, great. Yet the equipment or the desired
configuration often doesn't allow for this, and where dissimilar
digital I/Os meet, interface and sample-rate converters are required.
Because the cost of these black boxes adds up and sample-rate
conversions are to be avoided whenever possible, you need to pay close
attention from the start. Switching of inputs, outputs and studios may
also require digital routers and a master reference (sync) clock.
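The 44.1kHz/48kHz mismatch is a good illustration of why sample-rate conversion is best avoided: the two rates reduce to an awkward 160:147 ratio, so a converter must interpolate and decimate by large factors rather than simply dropping or doubling samples. A quick sketch (Python here, purely for illustration; the helper name is my own):

```python
from math import gcd

def resample_ratio(src_hz: int, dst_hz: int) -> tuple[int, int]:
    """Reduce a sample-rate conversion to its simplest up/down ratio."""
    g = gcd(src_hz, dst_hz)
    return dst_hz // g, src_hz // g  # (interpolate by, decimate by)

# Converting CD audio (44.1kHz) to the AES-standard 48kHz:
up, down = resample_ratio(44_100, 48_000)
print(f"interpolate by {up}, decimate by {down}")  # interpolate by 160, decimate by 147
```

Those large factors are why quality sample-rate converters are neither trivial nor free, and why each pass through one is a chance to degrade the audio.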
Achieving connectivity at the network level can also be a challenge.
For example, many popular PC- and Mac-based production packages can
communicate at the network level, but careful attention must be paid to
what network topologies are in use. Some older equipment was designed
around thin Ethernet, while 10Base-T and 100Base-T later became popular,
though economics frequently dictate that systems of these differing
vintages must be adapted to communicate with one another. Likewise,
many popular on-air digital delivery systems claim to be compatible
with existing traffic and accounting systems - but beware: If more than
one vendor is involved, getting this to work in practice is seldom as
easy as it sounds. If your IT skills are not quite up to snuff, you may
need some help sorting out cost-effective ways of tying these various
systems together.
The final factors in this equation are the digital storage media
and the sound file formats themselves. While CD burners have made direct
audio archiving easy and inexpensive, it has to be done at 44.1kHz and
in compliance with “Orange Book” standards. More often than
not, however, it is necessary to store complex production projects,
news actualities, and even music, as mass-stored data. As a result, the
engineer has to deal with the complex issue of sound file
interchangeability. Unfortunately, the PC world is still stuck with the
nebulous WAV format, with its variable word length and sample rates.
Further complications arise from the potpourri of compression
algorithms currently in use, which are sometimes layered haphazardly on
top of one another as files move between systems.
So far, we've only mentioned the inside of the
studio facility, but digital STL and transmission chains are also
considerations, particularly with respect to sampling rate. Ditto for
remote broadcasts and feeds, including satellite and codecs, which all
employ some form of digital compression. The situation is greatly
exacerbated by the reliance on LANs, WANs, and the Internet by
broadcasters, which sometimes results in digital sound files being
copied (and converted) dozens of times.
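Given WAV's variable word lengths and sample rates, it pays to verify a file's actual parameters before it enters the air chain. A minimal sketch using Python's standard-library wave module (the file name and helper function are illustrative, not taken from any particular automation system):

```python
import wave

def describe_wav(path: str) -> dict:
    """Read the header fields that vary from one WAV file to the next."""
    with wave.open(path, "rb") as w:
        return {
            "channels": w.getnchannels(),
            "sample_rate_hz": w.getframerate(),
            "bits_per_sample": w.getsampwidth() * 8,
        }

# Demo: write one second of 16-bit stereo silence at CD rate, then inspect it.
with wave.open("demo.wav", "wb") as w:
    w.setnchannels(2)
    w.setsampwidth(2)          # 2 bytes per sample = 16-bit words
    w.setframerate(44_100)
    w.writeframes(b"\x00\x00" * 2 * 44_100)

print(describe_wav("demo.wav"))
# {'channels': 2, 'sample_rate_hz': 44100, 'bits_per_sample': 16}
```

A check like this at ingest can flag files that would otherwise force a silent format or rate conversion somewhere downstream.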
Differences in protocols, topologies and data formats all must be
considered when mixing various systems.
Unfortunately, a lack of sophistication and awareness regarding the
negative side effects of compression overlay and file/sample rate
conversion is prevalent in the radio industry, and sometimes results in
an inferior on-air product. Thus, it ultimately falls to the engineer
to see that the quality, productivity and flexibility of any new system
are optimally balanced.
This can be done in three phases. First, you must be proactive in
understanding your client's needs and expectations before the selection
and purchase of new hardware and software. Don't be afraid to speak up
if you realize that another product or approach will better accomplish
a specific task. This requires fully educating yourself about the
systems under consideration, as well as the mission of those expected
to use them.
Second, take the time to thoroughly read and understand the nuances
of the system or application before you install it. Because time is
money and today's broadcast hardware and software are complex, this is
an area where one may be tempted to cut corners. Often, the
documentation supplied lacks the detail necessary to achieve the best
results, and thus requires additional research on your part.
Nonetheless, it's an essential step.
Finally, put together a training plan that not only prepares key
personnel as users, but also increases their ability to make decisions
that will enhance, not degrade, the final product.
The role of today's contract engineer has indeed changed. Even so,
we can effectively deal with the challenges accompanying that evolution
by embracing a comprehensive approach to system integration.
Krieger, BE Radio's consultant on contract engineering, is
based in Cleveland and can be reached at firstname.lastname@example.org.