Live OB TV: It’s A Magic Show - Part 1

Most live remote outside broadcasts are thoroughly planned by producers and directors who are often too busy to consider potential equipment problems. Technology is an engineering responsibility. Engineers must be ready for any circumstances that threaten to take the show off-script or off-air, from dead wireless mic batteries to unexpected foul weather. In live TV, anything can happen and probably will, usually at the worst possible time.

To paraphrase a 1786 Robert Burns poem and the later John Steinbeck book title, 'The best laid plans of mice and men often go awry.' I've heard that phrase echoed in headsets, control rooms, hallways and TV conference rooms because, despite extensive preparation for success, it happens more often than it should in TV. Some call it live broadcasting karma, and it is a TV engineer's job to avoid it.

Producers and directors make a plan, stick to it and maintain control. An engineer's job is to monitor and adjust electronic systems through every dynamic situation, from camera shading to catastrophic equipment failures, maintaining seamless, technically correct program flow at all times.

In the case of live OB sports production in public venues, variables like bad weather, equipment and broadband service reliability, scheduling, and even parking can make a live outside broadcast go awry if not identified, assessed and addressed before showtime.

They Won’t Know Unless We Tell Them
Most surveys confirm TV viewers want to lean back and be entertained. Live outside TV broadcasting is an on-stage magic show, complete with digital smoke, mirrors, and occasional deceptions. News anchors in coats and ties sometimes wear shorts and sandals under the desk, don't they? It's also a rock concert that requires improvisation and faking it when necessary. The undercurrent is a real-life version of 'Beat the Clock.'

You'll never get a second chance for an on-time, on-air premiere. The key to successful magic shows, concerts or live video production is to be fully prepared on time and not reveal what the audience can’t see. It’s a TV show, not a 'making of' documentary.

On-air discretion is everything. Mum's the word about behind-the-scenes live TV and magic tricks. The live TV directors I learned from always reminded talent and crews that no matter what happens on the air, carry on as if we planned it that way. Never hint that there is any issue behind the scenes, because most viewers won't notice. News story not ready for air yet? Viewers can't see the rundown. Tease it again or ignore it. Only the station knows. It's a magic show that broadcasters make look simple.

Tricking Viewers
I recently engineered and directed a two-day World Championship Offshore Powerboat racing broadcast that was infested with more than the usual number of live TV gremlins. There are always a couple of gremlins. This show had 12. You may have found yourself similarly challenged by live OB TV circumstances. Your reaction could change the show and your career, because in showbiz you're never any better than your last show.

Live TV is a magic show, but TV engineers aren’t magicians. Engineers must prepare for anything and everything to go wrong. The trick is to persuade the production team and managers when it is time to abandon the original plan and move on with an alternative that saves the show and that viewers won’t notice.

To do so successfully requires a ready Plan B and the diplomacy of a skilled politician to sell it to the crew and producers. Nobody likes the disappointment of changing production plans in the middle of a live TV show. Some people must first be sold on the idea.

One powerboat racing gremlin popped in about 6 p.m. the night before the broadcast was scheduled to start at 11 a.m. the next day. That's when race organizers called to tell us they had rescheduled the race to begin at 10 a.m. instead of 11. We called the production crew members and informed them of the time change, but it was pointless to call any broadcasters expecting the broadcast to start at 11. Logs were printed, spots were sold, and other programs were scheduled for broadcast at 10 a.m. Instead, we explained the situation to our broadcasters and faked it by recording the first hour and playing it back without fanfare over the last hour of the moved-up broadcast. Most viewers probably didn't notice.

The window next to the switcher provided the view we needed to cover for the offline camera.

Not Carved In Stone
While directing the live powerboat race TV show and operating the production switcher, unforeseen changing circumstances suddenly required me to set up and operate, with my other hand, a camera aimed down the racecourse out a nearby window. During the show the wind twice blew over our microwave receiving antenna on the studio roof, shattering its plastic case the second time. When erected again, the link still worked but the video quality was unusable. That microwave carried three IP cameras, including a primary racecourse camera on a scaffold on the beach.

The studio window camera covered for the offline scaffold camera angle, which was about one mile down the racecourse. The other IP cameras were beauty shots from a PTZ on the boom lift and a link to a roll-around camera for interviews in the pits. It wasn't the plan, but nobody outside the studio knew we were on Plan B. Thank goodness for a 22x lens, a good fluid head, and SD cards for "looks-live" interviews.

Part of the engineering plan is planning to fail. What happens if the production switcher fails? That thought is always in the back of my mind, and we've been fortunate not to need my plan yet. The plan is to switch the entire show on our Ensemble Designs Bright Eye router with an iPad controller, as sketched below. It wouldn't look slick, but not as ugly as the time years back when I switched a live 5-minute TV news insert with a patchcord because the production switcher failed at 6 a.m. The TV transmitter didn't like the glitches of patchcord switching, but an engineer must do everything that can be done to keep the show on the air with what's on hand. The show went on, we aired all the spots, and some viewers probably thought there was something wrong with their TV.
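
If the switcher ever does die, the fallback amounts to sending crosspoint takes to the router from whatever controller still works. Here is a minimal sketch of the idea, assuming a hypothetical ASCII-over-TCP command format and router address (this is NOT Ensemble Designs' actual Bright Eye control protocol, just a stand-in for illustration):

```python
import socket

# Hypothetical router address and ASCII command format -- NOT the actual
# Bright Eye control protocol, just a stand-in for illustration.
ROUTER_ADDR = ("192.168.1.50", 9999)

def take(source: int, destination: int) -> None:
    """Send one crosspoint take command to the router."""
    cmd = f"TAKE {source} {destination}\r\n"
    with socket.create_connection(ROUTER_ADDR, timeout=2.0) as sock:
        sock.sendall(cmd.encode("ascii"))

# 'Switching the show' becomes cutting the program output (destination 1)
# between camera sources: cuts only, no transitions or effects.
take(source=2, destination=1)  # cut to camera 2
take(source=3, destination=1)  # cut to camera 3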

This damaged Ubiquiti Rocket antenna limped along until it rained.

Default IP Addresses
Another gremlin appeared when the local cable company installed a cable modem for our internet service. We were relying entirely on streaming video to feed the show to our network affiliates, YouTube, and Vimeo. The first thing the new cable modem did was change the address of our private router from 192.168.1.xxx to 10.1.10.xxx. What?

We had locked in all our gear with static IP addresses, but the sudden 10.1.10.xxx router address change overwrote the static addresses in some of our production gear. That problem lasted a couple of hours while the installation tech called his boss for help resetting the cable modem's IP address to 192.168.1.xxx. Of course, installers aren't allowed to get into customers' gear, so we verified and made all the necessary local static IP address corrections ourselves.
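
A pre-show sanity check can catch this kind of silent overwrite before air. Here is a minimal sketch, assuming a hypothetical device inventory and Linux-style ping options, that pings every expected static address and flags anything that doesn't answer:

```python
import subprocess

# Hypothetical device inventory on the 192.168.1.xxx production subnet.
DEVICES = {
    "production switcher": "192.168.1.10",
    "scaffold camera": "192.168.1.21",
    "PTZ beauty camera": "192.168.1.22",
    "pit camera link": "192.168.1.23",
}

def is_reachable(ip: str) -> bool:
    """Return True if the host answers one ping (Linux ping options)."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", ip],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

for name, ip in DEVICES.items():
    status = "OK" if is_reachable(ip) else "UNREACHABLE - check its static IP"
    print(f"{name} ({ip}): {status}")
```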

The cable service speed test showed the bandwidth we paid for, but the service turned out to have an occasional glitch. We backed up the cable broadband service with a pair of Verizon portable modem/routers that provided rock-solid broadband service throughout the broadcast.
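
Proving that an 'occasional glitch' is real, and pinning down when it happens, is easier with a timestamped log. A minimal sketch, assuming a short outbound TCP connection to a public DNS server is an acceptable proxy for link health:

```python
import socket
import time
from datetime import datetime

def link_up(host: str = "8.8.8.8", port: int = 53, timeout: float = 3.0) -> bool:
    """Try a short TCP connection to a public DNS server as a link test."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Poll every 5 seconds and timestamp every failure for later review.
while True:
    if not link_up():
        print(f"{datetime.now().isoformat()} LINK DROP detected")
    time.sleep(5)
```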

We use 5GHz Wi-Fi for camera links, but we don't use Wi-Fi connections in our temporary studio; all our studio production gear is hardwired. We've learned that a random smartphone entering the studio with its Wi-Fi on can grab an apparently unused IP address and later conflict with a studio device assigned that same static address when the device is needed online.
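
One way to spot such an interloper is to check which hardware MAC address currently answers for a static IP while the legitimate device is known to be powered off. A minimal sketch, assuming Linux's ip neigh tool and a hypothetical camera address:

```python
import re
import subprocess

def mac_holding(ip: str) -> str | None:
    """Ping once to populate the ARP cache, then report which MAC
    currently answers for the address (Linux 'ip neigh' assumed)."""
    subprocess.run(["ping", "-c", "1", "-W", "1", ip],
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    out = subprocess.run(["ip", "neigh", "show", ip],
                         capture_output=True, text=True).stdout
    match = re.search(r"([0-9a-f]{2}(?::[0-9a-f]{2}){5})", out)
    return match.group(1) if match else None

# With the camera known to be powered off, any MAC answering for its
# static address belongs to an interloper, e.g. a phone that grabbed it.
mac = mac_holding("192.168.1.21")  # hypothetical camera address
print(f"192.168.1.21 is held by {mac}" if mac else "192.168.1.21 is free")
```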

Never Assume
We rented a LiveU Solo to stream to our primary broadcast affiliate, which happened to be a LiveU station. The Solo seemed to be a great match, but it turned out to be just the opposite. During a phone call to LiveU tech support, I learned that the TV station's LiveU Enterprise system was incompatible with a LiveU Solo. I had assumed otherwise.

Fortunately, the TV station had a computer with VLC Media Player available to decode our RTMP stream for broadcast. It worked, but VLC certainly wasn't in the original plan. That's also when we discovered our cable modem service was not as stable as it should have been: every couple of hours the video stream to the station dropped and the Solo disconnected from VLC. Who knows, but I'm blaming the occasional cable modem glitch. We made it work because everybody was flexible, thank you very much. What's the one word that makes challenging live OB productions look good on TV? Flexibility.
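
A simple watchdog would have saved a few scrambles to the keyboard. Here is a minimal sketch, assuming a hypothetical RTMP URL and VLC's console launcher cvlc on Linux, that restarts the decoder whenever the stream drops and the player exits:

```python
import subprocess
import time

STREAM_URL = "rtmp://example.com/live/show"  # hypothetical URL

while True:
    print("Starting VLC decoder...")
    # cvlc is VLC's console interface; run() blocks until the player exits,
    # which is what happens when the incoming stream drops.
    proc = subprocess.run(["cvlc", STREAM_URL])
    print(f"VLC exited (code {proc.returncode}); restarting in 5 seconds")
    time.sleep(5)
```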

There is more to this story than space available. Please stand by for Part 2, about drones, boom lifts, field cameras and lens filters.
