Ice into infrastructure: Broadcasting martial arts tournament Oktagon 28 from an ice skating rink for DAZN
By Heather McLean
Thursday, July 31, 2025 - 10:38
SVG Europe caught up with BeeLab, an Italian technical services provider, to find out how it transformed a working ice rink in Courmayeur, Italy, into a fully redundant, broadcast-grade live production environment using Blackmagic Design workflows for broadcaster DAZN.
The 28th edition of Oktagon, one of Europe's most established martial arts events, took place in a venue with no permanent infrastructure: the Courmayeur Sport Centre, an ice rink in the Italian Alps. BeeLab was tasked with delivering full 4K multicamera coverage for DAZN and streaming undercard fights, all without interrupting the venue's day-to-day use. Technical director Enrico Beltramo explains how BeeLab made it work.
What made Oktagon 28 a difficult production from an engineering standpoint?
Enrico Beltramo: The location. Courmayeur Sport Centre is an active skating rink in the Italian Alps. We didn't have the luxury of permanent rigging points, controlled temperature, or built-in infrastructure. The base of our setup was actual ice, with only a thin insulating carpet. That meant we had to design a temporary, fully redundant OB system that could survive condensation, temperature deltas of 15°C, and surface instability, while keeping everything elevated, sealed, and safe.
What were the biggest environmental risks?
EB: Condensation was our primary threat. With the air temp fluctuating rapidly between indoors and out, connectors and optical modules were susceptible to moisture buildup. That can lead to signal loss, shorts, or dangerous contact. We lifted every signal and power cable off the floor or sealed them in IP-rated tubing. Even then, we split signal and power paths wherever possible to reduce cumulative failure risk.
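The condensation risk Beltramo describes can be reasoned about with the dew point: any connector or optical module colder than the dew point of the surrounding air will collect moisture. A rough sketch using the standard Magnus approximation; the temperature and humidity values below are illustrative, not measurements from the event:

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point via the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients for water, valid roughly -45..60 C
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def condensation_risk(surface_temp_c: float, air_temp_c: float, rh_pct: float) -> bool:
    """Gear colder than the dew point of the surrounding air will fog up."""
    return surface_temp_c <= dew_point_c(air_temp_c, rh_pct)

# A connector chilled near the ice (2 C) meeting 18 C rink air at 60% RH:
# the dew point is about 10 C, so condensation is expected.
print(condensation_risk(2.0, 18.0, 60.0))  # -> True
```

This is why lifting cabling off the ice and sealing it in IP-rated tubing matters: it keeps surfaces closer to air temperature and away from the dew point.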
How did you structure the OB deployment to manage that complexity?
EB: We ran a dual-van setup. The main unit, Kenobi, managed acquisition, switching, camera control, audio, and routing. The second van handled graphics, replays, playout, and streaming. This separation allowed us to keep each domain fully focused and prevent any interdependent failures. For example, if replay or graphics needed reboots or tweaks, it never interfered with the core acquisition.
What core switcher infrastructure did you rely on?
EB: The switcher was a 4 M/E ATEM Constellation. We chose it because of the high input count, clean multiview outputs, and its ability to handle stingers and layered graphics natively. It was paired with an ATEM 1 M/E Advanced Panel for tactile control. We routed four multiview outputs to the director, TD, replay, and lighting designer, each with different layout profiles. The Constellation gave us complete flexibility in how we mapped those.
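A role-keyed multiview configuration like the one Beltramo describes can be modelled as plain data. A minimal sketch; the output numbers, layout names, and pinned sources below are hypothetical placeholders, not BeeLab's actual profiles:

```python
# Hypothetical role-to-multiview mapping; values are illustrative only.
MULTIVIEW_PROFILES = {
    "director": {"mv_output": 1, "layout": "16-window", "pinned": ["PGM", "PVW"]},
    "td":       {"mv_output": 2, "layout": "10-window", "pinned": ["PGM", "PVW", "KEY"]},
    "replay":   {"mv_output": 3, "layout": "4-window",  "pinned": ["PGM"]},
    "lighting": {"mv_output": 4, "layout": "4-window",  "pinned": ["PGM"]},
}

def profile_for(role: str) -> dict:
    """Look up which multiview output and layout a crew role receives."""
    try:
        return MULTIVIEW_PROFILES[role]
    except KeyError:
        raise ValueError(f"no multiview profile for role {role!r}")
```

Keeping the mapping in one place makes it easy to audit who sees what, and to repatch a role to a spare output if a multiview feed fails.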
How was power redundancy handled?
EB: The main OB van had rack-mounted UPS systems with 30 minutes of failover capacity. For all remote nodes (interview areas, PTZs, mobile cams), we used Bluetti portable battery packs. They gave us several hours of autonomy with silent operation, which is a huge plus in venue environments.
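Autonomy for a battery-fed node comes down to capacity over draw, derated for conversion losses. A back-of-envelope sketch; the wattage, capacity, and efficiency figures are illustrative assumptions, not the production's actual numbers:

```python
def runtime_hours(capacity_wh: float, load_w: float, inverter_eff: float = 0.9) -> float:
    """Hours a battery pack can feed a load, derated for inverter efficiency."""
    return (capacity_wh * inverter_eff) / load_w

# e.g. a 2000 Wh portable pack feeding a ~200 W remote camera position:
print(round(runtime_hours(2000, 200), 1))  # -> 9.0
```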
Let's talk about cameras and signal transport. Tell me more!
EB: Primary coverage came from three URSA Broadcast G2 cameras on SMPTE fibre, giving us SDI, power, tally, and intercom on a single line. Secondary angles used Studio Camera 4K Pros, a Pocket Cinema Camera 6K G2 on a gimbal with wireless video, plus a Micro Studio Camera 4K G2 and two PTZs. We also flew a DJI drone for aerials.
Every cable run was re-evaluated for safety and failure tolerance. All SMPTE runs were routed overhead or along elevated rigging structures. We used Blackmagic Camera Fibre Converters and Studio Converters to break out power and SDI locally. Our rule was: nothing touches the ice unless it absolutely has to.
How did you manage routing and system-level redundancy?
EB: The heart of the routing system was a Smart Videohub 40x40 12G, with a secondary 20x20 used for local loops and redundancy. We implemented physically separate fibre paths for every primary signal. If one line went down, we could switch to the backup instantly. We also deployed Blackmagic 2110 IP converters on select sources, allowing us to send bidirectional 3G-SDI over IP as a tertiary path.
This gave us SDI baseband reliability, with IP redundancy as a safety net. All signal paths were continuously monitored, and our SOPs included failover switching by role.
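The tiered redundancy described here (primary fibre, a physically separate backup fibre, and 2110 IP as tertiary) amounts to an ordered failover list per signal. A minimal sketch of that logic, assuming health flags are fed by whatever path monitoring is in place; the route names are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class SignalPath:
    """One source with an ordered list of transport routes, best first."""
    name: str
    routes: list                      # e.g. ["fibre-A", "fibre-B", "2110-ip"]
    down: set = field(default_factory=set)

    def mark_down(self, route: str) -> None:
        self.down.add(route)

    def mark_up(self, route: str) -> None:
        self.down.discard(route)

    def active_route(self):
        """Highest-priority route still healthy, or None if all have failed."""
        for route in self.routes:
            if route not in self.down:
                return route
        return None

cam1 = SignalPath("CAM1", ["fibre-A", "fibre-B", "2110-ip"])
cam1.mark_down("fibre-A")
print(cam1.active_route())  # -> fibre-B
```

The point of the ordering is that baseband SDI stays preferred whenever it is available, with the IP path only ever carrying traffic as a safety net.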
Did you handle transmission in-house? If so, how?
EB: Yes. We delivered to DAZN using a LiveU LU800 as primary, with an LU600 as backup. Both were bonded cellular and routed through Eitower servers to DAZN's MCR in Ireland. We actively monitored compression and latency on site. We weren't willing to rely on 'best effort' connections; we needed clean, stable contribution feeds.
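Actively monitoring a bonded contribution feed largely means checking measured latency and bitrate against agreed thresholds and alarming on any breach. A hedged sketch of that check; the threshold values and field names are assumptions, not DAZN's actual delivery spec:

```python
def check_feed(stats: dict, max_latency_ms: float = 1500, min_bitrate_mbps: float = 8.0) -> list:
    """Return human-readable alarms for one contribution-feed telemetry sample."""
    alarms = []
    if stats["latency_ms"] > max_latency_ms:
        alarms.append(f"latency {stats['latency_ms']} ms exceeds {max_latency_ms} ms")
    if stats["bitrate_mbps"] < min_bitrate_mbps:
        alarms.append(f"bitrate {stats['bitrate_mbps']} Mbps below {min_bitrate_mbps} Mbps")
    return alarms

print(check_feed({"latency_ms": 900, "bitrate_mbps": 12.0}))  # -> []
```

With a primary and backup encoder, the same check run against both units tells the operator whether a failover would land on a healthy path.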
How were replay and graphics integrated?
EB: Replay was handled by our internal BLT system. It's a modular SDI ingest tool that lets us clip and cue replays on the fly between rounds. We ingested four SDI feeds and used macro-triggered transitions to drop them into the live mix.
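Clipping and cueing replays between rounds is essentially marking in/out points against a continuously ingested timeline and holding the resulting clips in a cue queue. A simplified sketch of that pattern; the class and method names are illustrative, not the BLT system's actual interface:

```python
from collections import deque

class ReplayQueue:
    """Mark in/out points on an ingest timeline and cue the resulting clips."""
    def __init__(self):
        self._cued = deque()
        self._in_point = None

    def mark_in(self, t: float) -> None:
        self._in_point = t

    def mark_out(self, t: float, feed: str) -> None:
        if self._in_point is None or t <= self._in_point:
            raise ValueError("mark_in must precede mark_out")
        self._cued.append({"feed": feed, "start": self._in_point, "end": t})
        self._in_point = None

    def take_next(self):
        """Pop the oldest cued clip for playout, or None if nothing is cued."""
        return self._cued.popleft() if self._cued else None
```

In a live mix, `take_next()` would be bound to a macro-triggered transition so the operator drops the cued clip straight into program between rounds.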
Graphics and LED content ran from Resolume Arena, output through an ATEM Mini Extreme ISO for playout flexibility. Some of the static elements, such as a commercial spot, were played from a HyperDeck Shuttle HD.