“HP wanted to do something different with big LED video walls, AR and MR tricks that would look good to the live and broadcast audiences,” notes Technical Producer Scott Millar, who worked alongside Pixel Artworks to power video for the event using disguise. Real-time MR content was built in Notch, with live data from the game driving the infographics. The final composite was assembled in disguise and output to the LED video walls.
Mixed Reality was used in two ways during the show, Scott explains. The first was a Notch-generated studio environment that transported the casters to a different world in-camera; the same world could also be rendered from the game engine to place the casters directly into the map. The second was letting players and interviewers “step into” the game world to replay the biggest moments. Using a Steadicam and rendering the game engine into both the LED and virtual worlds, the players could see themselves in the game and talk through their best moves.
“Working closely with the developer of the OMEN game and amateur map designers, we built a custom solution to make the real-world set, game data and content align with the digital equivalent. We essentially created a virtual studio and OMEN set which, through camera, would seem as though a real person was in the game,” explains Oliver Ellmers, Interactive Developer with Pixel Artworks. This was achieved by using xR to extend and overlay content in-camera, while MR camera tracking kept the live broadcast cameras aligned with the digital cameras in the game.
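The alignment Ellmers describes comes down to expressing the tracked physical camera's pose in the game engine's coordinate frame, so the virtual camera sees the map from exactly the same viewpoint. A minimal sketch of that idea, with entirely hypothetical names and calibration values (this is not the production pipeline, just the underlying coordinate-frame mapping):

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """A simplified camera pose: position in metres plus a yaw in degrees."""
    x: float
    y: float
    z: float
    yaw: float

def stage_to_engine(tracked: Pose, origin: Pose) -> Pose:
    """Map a tracked broadcast-camera pose from stage coordinates into
    engine coordinates. `origin` is the engine-world pose of the stage
    origin, a calibration value measured once per setup."""
    rad = math.radians(origin.yaw)
    cos_a, sin_a = math.cos(rad), math.sin(rad)
    # Rotate the stage-frame position by the calibration yaw, then translate.
    ex = origin.x + tracked.x * cos_a - tracked.z * sin_a
    ez = origin.z + tracked.x * sin_a + tracked.z * cos_a
    ey = origin.y + tracked.y
    return Pose(ex, ey, ez, (origin.yaw + tracked.yaw) % 360.0)

# Hypothetical calibration: the stage origin sits at (10, 0, 5) in the
# engine world, rotated 90 degrees about the vertical axis.
engine_cam = stage_to_engine(Pose(1.0, 1.7, 0.0, 0.0),
                             Pose(10.0, 0.0, 5.0, 90.0))
```

Real xR systems solve a harder version of this (full 3D rotation, lens distortion, latency compensation), but the principle is the same: one calibrated transform ties the physical and digital camera worlds together.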