
Friday, March 10, 2023 - 8:52 am
Story Highlights
At the PLAYERS Championship, fans with an AR-capable Apple iPhone are in for a treat. On Holes 16, 17, and 18, the PGA TOUR has deployed an AR platform that lets fans point their phone at a hole and see virtual flight paths of shots overlaid on the live view. They can catch up on action they might have missed, compare the longest drives, and see how players played the hole.
The AR efforts are built on Quintar's platform, and SVG sat down with Quintar co-founder Jeff Jonas - who was EVP at Sportvision (1999-2016) and managing director, global business development, at Intel Sports (2016-21) - to discuss this year's deployment and the future of AR and Quintar.
Quintar's Jeff Jonas says AR can make the live event more compelling for fans in the stands.
Walk us through how you got to this point, with an AR service launched on Holes 16, 17, and 18 here at the PLAYERS Championship.
We started about two years ago, when [Quintar] co-founder Jay Jayaram and I were looking to do something new. We read an article about the coming hardware wars in AR between Microsoft, Apple, Google, and Snap and how they were all going to be butting heads. The article didn't discuss content. When I was at Sportvision doing the 3D 1st & Ten line for ESPN's 3D television, it was fantastic. But there wasn't enough 3D content, and no one bought into 3D. This is kind of the same thing: someone is going to need to build this platform and develop compelling content for AR.
What kind of content do you think needs to be created?
The things that work for technology, especially in sports, either have utility or [provide] entertainment. I go back to the 1st & Ten line because that had both: it had utility, and it made the game more entertaining because you could see data in the video and know where the team was trying to get the ball. In AR, we didn't see anything like that, so we felt, "Let's build this platform that will enable people to create content in AR that's meaningful."
What's at the core of the platform from a production, a programming, and a technology standpoint?
We figured out how to do what we call registering large places: when you bring your camera or your phone out, it knows exactly where you're looking at all times. If you can do that, then you can start putting accurate features in that video.
I like to say that people have tried to do AR before with GPS, and GPS is not that accurate: something could be placed 5 ft. this way or 10 ft. that way, so whatever I try to put in AR could be off by a number of feet. Someone had to solve that, and that's what we do at the venue.
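To see why a few feet of positioning error matters so much, here is a minimal pinhole-projection sketch. It illustrates the general principle, not Quintar's actual pipeline, and all names and numbers are assumptions: once the phone's pose is known precisely, a world-anchored point maps to a stable pixel, while a GPS-sized position error visibly drags the overlay off target.

```python
import numpy as np

def project(point_world, cam_pos, R, f_px=1500.0, cx=960.0, cy=540.0):
    """Project a 3-D world point to pixel coordinates with a basic
    pinhole camera model (illustrative only, not Quintar's math)."""
    p_cam = R @ (point_world - cam_pos)   # world frame -> camera frame
    if p_cam[2] <= 0:
        return None                       # point is behind the camera
    u = f_px * p_cam[0] / p_cam[2] + cx   # perspective divide + principal point
    v = f_px * p_cam[1] / p_cam[2] + cy
    return np.array([u, v])

# Camera looking straight down +Z at a pin 30 m away (assumed geometry).
R = np.eye(3)
true_cam = np.array([0.0, 0.0, 0.0])
pin = np.array([0.0, 0.0, 30.0])

good = project(pin, true_cam, R)
# The same overlay if the assumed camera position is off by ~3 m (about
# 10 ft), the kind of error consumer GPS can introduce:
bad = project(pin, true_cam + np.array([3.0, 0.0, 0.0]), R)
print(good, bad, "pixel drift:", np.linalg.norm(good - bad))  # ~150 px
```

With these illustrative numbers, a 10 ft. position error shifts the graphic roughly 150 pixels on a 1080p screen, which is why venue-level registration rather than GPS is the crux of the platform.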
Fans at TPC Sawgrass can use AR to get much more information about a shot on the 17th hole.
Does that require placing technology on the course?
No, this is all machine learning, computer vision, and AI. It keeps learning and keeps getting better, which is kind of the exciting part that wasn't there when I was doing other stuff. Things are being done without human intervention, which is pretty cool. Once you're able to do that, you enable a lot of things. For example, we're seeing the ShotLink data, which is content. But then, we can borrow from other AR platforms.
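For a sense of how ShotLink-style data could become renderable AR content, here is a hedged sketch that turns a shot's start and landing points into a simple arc. The parabolic interpolation and the example numbers are assumptions for illustration, not the TOUR's actual trajectory model:

```python
import numpy as np

def flight_arc(start, end, apex_height, n=32):
    """Interpolate a simple parabolic arc between a shot's start and
    landing points -- a stand-in for a real ballistic model."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    t = np.linspace(0.0, 1.0, n)[:, None]
    pts = (1 - t) * start + t * end                 # straight-line base
    pts[:, 2] += apex_height * 4 * t[:, 0] * (1 - t[:, 0])  # parabolic lift
    return pts                                      # n x 3 polyline to render

# Illustrative tee shot on 17: ~125 m carry with a ~30 m apex.
arc = flight_arc([0.0, 0.0, 0.0], [125.0, 5.0, 0.0], apex_height=30.0)
```

Once the venue is registered, a polyline like this can be drawn in the phone's camera view at the hole's real-world coordinates, which is essentially what fans see as a shot's flight path.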
I like to talk about Pokémon Go. It's a free-to-play game that's generating $900 million of revenue from advertising, merchandising, and customization. In sports, if we can enable venues to do AR, you have avid fans who can take part in that same type of thing. You can imagine a future where there's merchandising, there's advertising, there's customization, there's sports betting. But you need that platform to be there first for that to work.
Walk me through the machine learning. If you go to a new course, how does the process of learning the course begin?
We have a fairly simple survey process: we go and take some pictures. That's the easy part. Once we have that survey, the magic starts happening. The actual work begins with the engineers: they create a 3D map of the area and do all kinds of things that I'm not privy to because I'm not an engineer.
The cool thing about what we're doing is that, when people use the app, they're actually taking pictures the whole time. We can crowdsource some of that, so our surveys get better and better and better over time. Here at TPC, there can be 30,000 people on 17 using the app, and we're getting 30,000 pictures per second. That helps our system learn.
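One plausible mechanism for that crowdsourced improvement (a toy sketch under assumed details, not Quintar's method) is to keep a running estimate of each surveyed landmark's position, which every new fan photo nudges toward the truth:

```python
import numpy as np

class LandmarkEstimate:
    """Running mean of a landmark's triangulated position, refined as
    crowdsourced observations arrive (toy stand-in for map refinement)."""
    def __init__(self, initial_xyz):
        self.mean = np.asarray(initial_xyz, float)
        self.n = 1

    def update(self, observed_xyz):
        self.n += 1
        # Incremental mean: each new observation nudges the estimate.
        self.mean += (np.asarray(observed_xyz, float) - self.mean) / self.n
        return self.mean

# Hypothetical pin position from the initial survey, refined by fan photos.
pin = LandmarkEstimate([10.0, 2.0, 0.5])
for obs in ([10.2, 1.9, 0.5], [9.9, 2.1, 0.4], [10.1, 2.0, 0.5]):
    pin.update(obs)
print(pin.mean, pin.n)
```

The appeal of this kind of scheme is that individual noisy photos average out, so the map gets tighter the more the app is used, which matches the "better and better over time" behavior Jonas describes.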
Over the years, efforts like this have involved apps. Do you have an app?
We have no desire to be an app company. What we want to do is take our SDK [software-development kit] and license that to people. Our theory is, there are people that know how to create content already - like a lot of the people in this compound - but no one's producing golf content in AR because there's no platform. What we want to do is, basically, give that platform to content creators and let them go nuts.
This week, we're being integrated into the PGA TOUR app for the first time; the AR app used to be a standalone app. That's big because of the audience [the PGA TOUR has].
We were talking with Tom Sahara, your SVP of production technologies, about whether this is meant to be used live, like holding up your phone while the players are hitting, or as a replay device. He said the focus is the latter. Can you explain why that is a more compelling experience?
We think it's most effective as a second-screen experience. A lot of le