BT Sport on its vision for augmented live sports broadcasting becoming reality with 5G
By Heather McLean, Editor
Monday, September 14, 2020 - 17:05
Boxing with a 3D AR version of the match on your coffee table could become a reality with BT Sport
BT Sport recently became one of the winners of the UK government's Department for Digital, Culture, Media & Sport (DCMS) funding boost for 5G, under the 5G Create competition.
Named 5G Edge-XR, the consortium led by BT's Media and Research teams alongside TheGridFactory, Condense Reality, Bristol University, Dance East, and Salsa Sound, will develop virtual and augmented reality experiences for live sport, working closely with BT Sport.
5G Edge-XR will enable people to view immersive sporting events from all angles, across a broader range of devices including smartphones, tablets, AR and VR headsets and TVs.
The project is worth a total of £2,558,494, with £1,486,004 of that figure coming from the DCMS 5G Create competition fund. It will run over 20 months with 15 full-time employees.
Immersive sporting experiences
Speaking to SVG Europe, Matt Stagg, BT Sport's director of mobile strategy, explained the idea behind 5G Edge-XR: "We've been looking for a long time at how we can provide more immersive experiences as technology evolves. We had a vision for how we can use 5G to do exactly that: how we can take the value we have as a broadcaster for people within stadiums and then use the technology and our capabilities [to work out] how we can bring that stadium experience into the home for people who can't go."
"Already, we've got a lot of interactive and immersive things on the BT Sport app, but it's [about] how do we take that to the next level, and that's all around volumetric capture, augmented reality and virtual reality," he notes.
This is the most exciting area of sports technology I've ever been involved in; the vision for 5G and augmented experiences for sport is coming to life
BT Sport is pushing everything within this vision under the mixed or cross reality (XR) banner, because XR "really changes your actual reality to increase the experience, either through immersive means or access to content and experiences you wouldn't have had otherwise, with the ability to turn that on and off, to be the curator of your own experience in the home and outside of the home," notes Stagg.
Stagg wrote the vision for this project at the beginning of 2019.
He says the question now is, how do we bring this to life? He goes on: "That's why I'm so excited about it: we put a visionary video out there and everybody said it looked amazing, but people were sceptical that any of it could actually be done. We were already working on it, so we knew it could be done, but we didn't know how to do it at scale, what the correct architecture was, or how that worked out from a commercial viewpoint, from a technical viewpoint; you know, can we do these things?"
"Realistically we have to do a proof of concept to make sure this works, but the learnings from that will come out [as we work out how to] do something commercially: is there a model for this? Is it too processor intensive, or is it unrealistic now that we know how much [processing power] it takes to do it?"
MotoGP is working with the 5G Edge-XR consortium to bring the millions of data points captured in each race to your living room
Creating and testing
Making this work in a live sports environment for distribution to wearable devices involves volumetric capture. That may include a setup using 16 stereo pairs in four vertical clusters: capture with 32 4K cameras at 120fps, eight capture PCs, one fusion PC to encode and contribute, cloud processing and encoding, cloud-based editorial tools, and delivery to apps at 30Mbps.
Ingest will go over fibre or 5G, depending on what is available, while distribution will go out over the 5G network. Location and tracking from the wearable devices will need to feed back into the system with 10ms to 30ms latency to create a realistic, enhanced XR experience for the end user.
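Those capture figures hint at why cloud processing and encoding sit at the heart of the pipeline. A rough back-of-envelope sketch, assuming uncompressed 4K frames in 8-bit 4:2:0 (1.5 bytes per pixel) — an assumption for illustration, not a figure from the project — shows the gulf between raw capture and the 30Mbps delivery target:

```python
# Back-of-envelope: raw data rate of the 32-camera volumetric rig described
# above vs. the 30Mbps app delivery target. Pixel format is an assumption
# (8-bit 4:2:0, i.e. 1.5 bytes per pixel), not a project specification.

CAMERAS = 32           # 16 stereo pairs in four vertical clusters
WIDTH, HEIGHT = 3840, 2160  # 4K UHD
FPS = 120
BYTES_PER_PIXEL = 1.5  # assumed 8-bit 4:2:0 chroma subsampling

bytes_per_camera_per_s = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
total_bits_per_s = CAMERAS * bytes_per_camera_per_s * 8

DELIVERY_BPS = 30e6    # 30Mbps delivery to apps

print(f"Raw capture rate: {total_bits_per_s / 1e9:.1f} Gbit/s")
print(f"Reduction needed for delivery: {total_bits_per_s / DELIVERY_BPS:,.0f}x")
```

Under these assumptions the rig produces on the order of 380 Gbit/s of raw pixels, so the cloud stage has to fuse, encode and compress by roughly four orders of magnitude before the stream reaches a phone.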
The project will explore a number of use cases, including one with the data-rich sport of MotoGP: a fan experience delivered over 5G and powered by edge cloud GPU compute, enabling a photoreal, real-time MotoGP Viewing Suite to be rendered and streamed to viewers' smartphones, tablets and headsets.
I'm really, really passionate about this because it's limitless. Some of these experiences will change the way things are done in the future
The MotoGP demonstrator will focus on a 5G-delivered, cloud-rendered XR/AR experience, using high-polygon (5M) 3D models, multiple video streams, immersive audio and data integration. Among other functionality, it is intended to deliver a video wall of multiple live race video streams with corresponding immersive audio, a 3D track view of the riders, immersive halos for 8K 360 video, and rider and team profiles, stats and a replay area.
Comments Stagg: "We've been working on a MotoGP demonstration. MotoGP is a very good way of showing what augmented experiences can bring, because you have so many data points and so much content coming from it, and then you look at how you create your own experiences from that data."
Another demonstrator will be built around a boxing event, allowing a real-time volumetric hologram to be interacted with on the viewer's coffee table at home, so they can watch live action and replays from all angles alongside BT Sport's live feed, in an editorially driven experience including commentary.
Yet another demonstrat