Here at The Alliance, we aim to push the boundaries of sports media consumption by providing interactive and dynamic experiences in both our mobile apps and on our website. Among these is The Alliance App's real-time play prediction game.
This game lets you see the players' formation, follow the situation on the scoreboard, view historical outcome percentages, and predict what will happen next, all in real time. No other company has developed a tech stack that enables such possibilities, and we’re not stopping there. We aim to provide a similar experience, presenting data and stats in real time as an overlay on streaming video. In this article, I want to explore the problems we faced and how we solved them.
Let’s take a moment to talk about how we provide game data in real time for our play prediction game.
Our backend provides a GraphQL-based API, including support for WebSocket-based subscriptions. The data for the scoreboard, for example, comes to us from several subscriptions: one for the game clock, one for the play clock, and one for the general game status (score, timeouts, yards to go, and so on). In the app, we assemble this data and present it in the UI as soon as it arrives. This approach works great for real time, but what about replaying games? What if we need to account for the typical latencies that occur in video production pipelines? Or what if we’d like to go back and see what happened at a particular time in a game? For this, a different approach is needed.
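The assembly step can be sketched as a simple merge of subscription payloads into one scoreboard state. The field names below are hypothetical, not our actual schema; the point is that each subscription updates only its own slice of the state, and the UI re-renders from the merged object whenever any stream fires:

```typescript
// Hypothetical payload shapes for the three subscriptions; the real
// schema's fields will differ.
interface GameClockEvent { gameClock: string }   // e.g. "12:47"
interface PlayClockEvent { playClock: number }   // seconds remaining
interface GameStatusEvent {
  homeScore: number;
  awayScore: number;
  down: number;
  yardsToGo: number;
}

// The scoreboard UI renders from a single merged state object.
type ScoreboardState = Partial<GameClockEvent & PlayClockEvent & GameStatusEvent>;

let scoreboard: ScoreboardState = {};

// Each subscription's data handler calls this as payloads arrive.
// Whichever stream fires, we merge its fields and re-render immediately.
function onSubscriptionData(payload: ScoreboardState): ScoreboardState {
  scoreboard = { ...scoreboard, ...payload };
  return scoreboard;
}
```

In practice each `onSubscriptionData` call would also trigger a UI update, but the data flow is just this merge.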
One of the more popular features in our apps and on the website is streaming raw unfiltered video from the Low SkyCam. We want to find ways of taking the immersive experience video provides and building on it, enhancing it, and creating an experience that will dazzle our audience. So, wouldn’t it be great if we could provide all this data about a game while you’re watching it? We found a way to take this video and overlay a dynamic scoreboard that is decoupled from the video feed.
To stream our games live in our app, we use HTTP Live Streaming (HLS). HLS typically incurs a delay from video processing, and because the data for our games is available in real time, we can query our backend for the data and sync it up with the video. The result is that we can show the game's state in a dynamic UI on top of the video, perfectly synced.
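Conceptually, syncing comes down to translating the player's current playback position into a wall-clock time window for the data query. As a minimal sketch, assuming we know the wall-clock time corresponding to position zero of the stream (HLS can carry this via `EXT-X-PROGRAM-DATE-TIME` tags) and the lookahead described later in this article:

```typescript
// Translate a video playback position into a wall-clock window for
// querying game data. `streamStartMs` is the epoch time of video
// position 0 (assumed known, e.g. from EXT-X-PROGRAM-DATE-TIME).
function dataWindowFor(
  streamStartMs: number,
  playerPositionSec: number,
  lookaheadSec: number = 30
): { fromMs: number; toMs: number } {
  const fromMs = streamStartMs + playerPositionSec * 1000;
  return { fromMs, toMs: fromMs + lookaheadSec * 1000 };
}
```

Because the window is derived from the video's own position, the same mechanism handles live viewing (where the position lags real time by the HLS delay) and replays (where it may lag by hours or days).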
The videos are time-coded, and our video players provide a callback to notify us of the current playback time. Our backend provides an API that lets us query for data within a particular time window, so we query for the data from the time the video is currently at to about 30 seconds ahead. This data comes in from various parts of our API but is all timestamped. For example, the game clock comes back as an array of timestamped clock values, while the play clock data is available separately and comes back in its own array. So, with many separate sources of data that need to be played back in chronological order, we use a priority queue sorted by timestamp as a buffer. We add all of the results from our query to the buffer, then play back the events in sync with the video player. When the buffer size gets too small, we query for more data and fill it up again. This allows us to seamlessly display the game state synchronized with the video.
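The buffer described above can be sketched as a binary min-heap keyed on event timestamp. This is an illustrative implementation, not our production code: on every player time callback we drain whatever events are now due and apply them to the UI, and when the size drops below a threshold we issue the next query to refill it.

```typescript
interface TimedEvent {
  timestampMs: number;  // wall-clock time the event occurred
  payload: unknown;     // a clock tick, play-clock tick, status update, ...
}

// A minimal binary min-heap keyed on timestampMs, used as the playback buffer.
class EventBuffer {
  private heap: TimedEvent[] = [];

  get size(): number { return this.heap.length; }

  push(e: TimedEvent): void {
    this.heap.push(e);
    let i = this.heap.length - 1;
    while (i > 0) {
      const parent = (i - 1) >> 1;
      if (this.heap[parent].timestampMs <= this.heap[i].timestampMs) break;
      [this.heap[parent], this.heap[i]] = [this.heap[i], this.heap[parent]];
      i = parent;
    }
  }

  // Remove and return every event due at or before `nowMs`, oldest first.
  // Called from the video player's time callback.
  drainUpTo(nowMs: number): TimedEvent[] {
    const due: TimedEvent[] = [];
    while (this.heap.length > 0 && this.heap[0].timestampMs <= nowMs) {
      due.push(this.pop());
    }
    return due;
  }

  private pop(): TimedEvent {
    const top = this.heap[0];
    const last = this.heap.pop()!;
    if (this.heap.length > 0) {
      this.heap[0] = last;
      let i = 0;
      for (;;) {
        const l = 2 * i + 1, r = 2 * i + 2;
        let smallest = i;
        if (l < this.heap.length && this.heap[l].timestampMs < this.heap[smallest].timestampMs) smallest = l;
        if (r < this.heap.length && this.heap[r].timestampMs < this.heap[smallest].timestampMs) smallest = r;
        if (smallest === i) break;
        [this.heap[smallest], this.heap[i]] = [this.heap[i], this.heap[smallest]];
        i = smallest;
      }
    }
    return top;
  }
}
```

The heap means the query results for the game clock, play clock, and game status can simply be pushed in whatever order they arrive; `drainUpTo` always hands them back in chronological order.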
This is really just the beginning of our ideas for how sports media consumption can be enhanced through new technology. We aim to go beyond just presenting data to the user to creating interactive experiences no one has seen before.