LOG 01 : THE PHYSICAL LAYER
WHEN PRO AV MEETS BROADCAST
WHAT BROADCAST INFRASTRUCTURE TAUGHT ME ABOUT HUMILITY (AND PHYSICAL LAYERS)
I've spent most of my career in Pro AV. I've designed systems, run cable, spec'd rooms, and sat across the table from engineers who needed solutions that actually worked. I thought I understood infrastructure.
Then I started working with broadcast and streaming customers.
The learning curve has been humbling — and I mean that in the most literal sense.
A different kind of engineer
The first thing you notice when you walk into a broadcast operation is how differently these teams think. They're not approaching infrastructure the way a corporate AV team would. They're thinking like data center operators.
Rack density. Fiber counts. Power draw per cabinet. Cooling capacity. Redundancy at every single layer. These conversations happen before anyone talks about the application layer. Before software, before workflows, before anything you'd see on a show floor demo.
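To make that concrete, here's the kind of quick per-cabinet arithmetic those conversations start from. This is a minimal sketch; the device types, wattages, and feed size below are hypothetical placeholders I made up for illustration, not figures from any real facility.

```python
# Back-of-envelope per-cabinet planning math (illustrative numbers only).
# Converts an estimated IT load into cooling demand and checks power-feed headroom.

WATTS_PER_DEVICE = {          # hypothetical nameplate draws
    "media_node": 900,        # e.g. a 2RU COTS server with media NICs
    "core_switch": 650,
    "sync_gateway": 150,
}

def cabinet_load_watts(inventory: dict) -> float:
    """Total estimated IT load for one rack, in watts."""
    return sum(WATTS_PER_DEVICE[kind] * count for kind, count in inventory.items())

def cooling_btu_per_hour(watts: float) -> float:
    """1 W of IT load is roughly 3.412 BTU/hr of heat to remove."""
    return watts * 3.412

def feed_fraction(watts: float, feed_amps: float, volts: float = 208.0) -> float:
    """Fraction of a single power feed consumed. With redundant A/B feeds,
    either feed alone has to carry the whole rack after a failure."""
    return watts / (feed_amps * volts)

rack = {"media_node": 8, "core_switch": 2, "sync_gateway": 1}
load = cabinet_load_watts(rack)
print(f"IT load:   {load / 1000:.1f} kW")
print(f"Cooling:   {cooling_btu_per_hour(load):,.0f} BTU/hr")
print(f"Feed use:  {feed_fraction(load, feed_amps=50):.0%} of one 50 A / 208 V feed")
```

None of that is exotic math. The point is that broadcast teams run it before the first workflow conversation, not after.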
That discipline exists for a reason. Modern stations avoid dead air at all costs: lost airtime means lost revenue and, over time, lost viewers. In live broadcast, a failed termination, a misconfigured switch, or a cable run that was never properly tested doesn't manifest as a glitchy conference room that gets a help desk ticket. It manifests as a blank screen with 108 million people watching. Ask anyone who was on the technical side of Super Bowl XLVII in 2013.
The average annual cost of downtime for communications and media organizations is $143 million.
That number puts physical infrastructure reliability in a very different light than it gets in most Pro AV conversations. (Splunk)
The industry talks software. The problem is physical.
If you spend any time reading the trade press, attending IBC or NAB, or sitting in vendor briefings, you'll notice a pattern: the conversation is almost entirely about software. Standards. Protocols. IP workflows. Cloud. The assumption seems to be that once you solve the software problem, you've solved the hard part.
I don't think that's true.
The SMPTE ST 2110 suite of standards replaces SDI cabling with IP networking and splits video, audio, and ancillary data into separate, synchronized streams for live broadcast production. It's a significant and genuinely impressive standard. But here's the thing nobody puts on a slide: the physical infrastructure requirements that sit underneath ST 2110 are serious. (The Broadcast Bridge)
A single uncompressed 1080p60 video stream can demand over 3 Gbps. Scale that up to a full live production with multiple camera feeds, program outputs, redundant paths, and monitoring streams, and you're not talking about plugging into a standard office network. ST 2110 network switches must be managed, high-performance, low-latency switches with 10 Gbps or higher ports, capable of handling PTP, IGMP/PIM, QoS, and seamless protection and redundancy. (Nevion)
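To give a rough sense of where those numbers come from, here's a back-of-envelope sketch of the payload math: active pixels times frame rate times bits per pixel, plus a loose allowance for packet overhead. The camera counts, the overhead factor, and the redundant-path assumption are mine, for illustration only.

```python
# Rough payload math for uncompressed video over IP (illustrative).
# ST 2110-20 carries active picture only; the ~5% factor below is a loose
# allowance for RTP/UDP/IP packet overhead, not a measured figure.
# Deeper bit depths, or counting blanking the way SDI framing does,
# push a 1080p60 feed up toward the 3 Gbps mark.

def uncompressed_gbps(width: int, height: int, fps: float,
                      bits_per_pixel: int = 20,     # 10-bit 4:2:2 sampling
                      overhead: float = 1.05) -> float:
    return width * height * fps * bits_per_pixel * overhead / 1e9

per_cam = uncompressed_gbps(1920, 1080, 60)          # roughly 2.6 Gbps per feed
print(f"1080p60 feed:      {per_cam:.2f} Gbps")

# Hypothetical small production: 12 cameras, 4 program/monitoring outputs,
# everything duplicated on a redundant (ST 2110-7 style) second path.
feeds, outputs, paths = 12, 4, 2
aggregate = (feeds + outputs) * per_cam * paths
print(f"Aggregate demand:  {aggregate:.0f} Gbps across {paths} paths")
print(f"Fits office 1 GbE? {'no' if per_cam > 1 else 'yes'}")
```

Even this toy version lands in the tens of gigabits for a modest show, which is why the switch spec reads like a data center bill of materials.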
That is a data center problem. It always has been. It just lives inside facilities that most people in IT wouldn't immediately recognize as data centers.
The SDI-to-IP transition is real, but it's not complete
The industry loves to talk about IP as if the transition is done. It isn't.
As of late 2024, 50% of broadcasters were using a hybrid video infrastructure combining SDI, IP, and cloud technologies, while 31% still relied solely on SDI, and only 14% had moved fully to all-IP. That means the majority of facilities are operating in a complicated hybrid state, managing legacy SDI infrastructure alongside IP-based workflows, often on the same physical plant. (TV Tech)
Hybrid operation doesn't simplify the physical layer question. It makes it harder. You have more to manage, more points of failure, and more places where the physical layer can let the whole system down.
IP transitions that have been planned for years are now moving to implementation, driven by cost, aging equipment, and competitive pressure. That acceleration means the physical infrastructure decisions being made right now are going to define how these facilities perform for the next decade. (NewscastStudio)
What "Good Physical Infrastructure" Actually Means Here
In Pro AV, "good physical infrastructure" usually means clean cable management, properly terminated connections, adequate power, and maybe a backup signal path for a critical room. That's the baseline. And if you hit that baseline, you're in good shape most of the time.
In broadcast, that baseline is the floor, not the ceiling.
Every component of the media network must be seamlessly integrated: audio and video endpoints, broadcast controllers, and management devices operating within a unified IP-based media fabric. Integration isn't just a software configuration task. It starts with physically sound infrastructure that has been properly designed, properly installed, and properly validated. (Cisco)
The 2024 Paris Olympics is a useful reference point. France Télévisions executed a large-scale media production on IP infrastructure, an architecture built on a spine-leaf topology providing a non-blocking, high-bandwidth backplane for transporting thousands of simultaneous uncompressed media streams. That doesn't happen because someone specced good software. It happens because someone built a physical plant that could actually support it. (The Broadcast Bridge)
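For a rough sense of what "non-blocking" means at the physical layer, here's a small sketch of the leaf oversubscription check: as long as each leaf switch's uplink capacity toward the spine matches or exceeds its endpoint-facing capacity, traffic between any two leaves never queues behind a shortage of fabric bandwidth. The port counts and speeds below are made up for illustration.

```python
# Leaf/spine oversubscription check (illustrative port counts and speeds).
from dataclasses import dataclass

@dataclass
class Leaf:
    access_ports: int    # ports facing cameras, gateways, servers
    access_gbps: int     # speed of each access port
    uplinks: int         # ports facing the spine layer
    uplink_gbps: int     # speed of each uplink

    def oversubscription(self) -> float:
        """Downlink capacity divided by uplink capacity.
        A ratio <= 1.0 means the leaf is non-blocking toward the spine."""
        down = self.access_ports * self.access_gbps
        up = self.uplinks * self.uplink_gbps
        return down / up

# Hypothetical leaf: 24 x 25 GbE toward endpoints, 6 x 100 GbE toward the spine.
leaf = Leaf(access_ports=24, access_gbps=25, uplinks=6, uplink_gbps=100)
ratio = leaf.oversubscription()
print(f"Oversubscription: {ratio:.2f}:1 "
      f"({'non-blocking' if ratio <= 1.0 else 'blocking under full load'})")
```

That ratio is decided by how many fibers get pulled and how many switch ports get bought, long before any software sees a packet.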
Where My Background Translates, and Where It Doesn't
I want to be honest about what I bring to these conversations, because I think it matters.
I don't inherently know broadcast. I'm still learning the application layer, the workflows, the standards nuances, the way these teams operate under pressure. Every conversation I have in this space teaches me something, and I'm only scratching the surface.
What I do know is what good physical infrastructure looks like. I know what it looks like when power is properly distributed and when it isn't. I know the difference between a fiber run that was installed right and one that's going to cause intermittent problems nobody can trace. I know what happens when cooling is an afterthought, because eventually it becomes the only thought.
Commercial off-the-shelf (COTS) hardware is not a free lunch; it's a different set of trade-offs. By the time you add the specialized network cards, the high-end switches, and the software licensing that actually makes it work, you're sometimes not saving the money you thought you were. That's a point more people in this industry are starting to reckon with, and it's a point where infrastructure experience matters enormously. (NewscastStudio)
The Teams Who Win the Next Five Years
The traditional broadcast model relied on significant capital investments in dedicated hardware designed to operate for years. The emerging model favors flexible, software-defined systems where costs can be scaled with actual production requirements. That shift is real, and it's accelerating. (NewscastStudio)
But software-defined doesn't mean physically simple. The teams I see positioned well for the next wave of media infrastructure build-out share a common characteristic: they treat the physical layer with the same rigor they apply to everything else. They don't assume it's solved just because the cabling got installed. They don't commodity-shop the connectors on a system where dead air costs six figures a minute. They don't skip the physical validation step because the software demo looked clean.
I am still learning this space every day. There's no shortage of things I don't know yet. But I'm increasingly convinced that the gap in this industry, the real gap, the one that creates the most expensive problems, isn't on the software side. Plenty of people are working that problem.
The gap is physical. And the teams who close it first are the ones who'll be standing when the build-out settles.
Sources: Splunk, The Hidden Costs of Downtime in Communications and Media (2024) | Haivision Annual Broadcast Survey (2024) | Nevion, What is SMPTE ST 2110 | Cisco / The Broadcast Bridge, Building a Media Fabric | NewscastStudio Industry Outlook (2025/2026) | TV Technology / NAB Show 2026 Analysis

