Stage Precision On Managing Position and Camera Tracking Data with Pinpoint Accuracy
For live event, installation, and broadcast AR workflows, managing large volumes of data from different protocols and technology manufacturers has long been an unwieldy task and one of the biggest workflow hurdles for creative and technical companies. Stage Precision was born out of a partnership between members of bright! studios and Creative Director and Software Developer Michael Hantelmann to bring that data under control. AJA recently sat down with Consultant Sarah Cox to discuss how Stage Precision software helps teams manage position-control and camera tracking data, with accurate data synchronization enabled by support for AJA capture cards. Here are highlights from the conversation:
Provide an overview of Stage Precision.
In today’s digital world, data is everywhere; it’s the foundation of everything we interact with. Stage Precision is a software technology company that connects virtual and digital worlds by unifying all production data in one 3D space. Currently in beta, our software lets teams interconnect data from every device in a live event or installation workflow into a centralized hub, increasing usability and accessibility. Our background is in producing live shows as a creative and technological agency, so we have firsthand experience with the complex challenges of working with data dispersed across different locations.
What different types of data do you specialize in?
We offer an extensive input/output connections library that supports all position-control and camera tracking data. Camera tracking is currently in high demand for AR workflows in the broadcast and live event space, and we support a wide range of protocols for brands including NCAM, Spidercam, Mo-Sys StarTracker, stYpe, Technocrane, and TrackMen, among others. Position-control compatibility includes support for real-time 3D tracking solutions like PosiStageNet, OptiTrack, or BlackTrax, and 2D tracking approaches such as blob tracking, which derives position data through image analysis. We also support a wide range of other industry output device connections, including audio, sensors, controllers, and many more. For a production using four or five different protocols for camera and positioning, our software unifies everything into a single 3D data set that can be analyzed, controlled, and sent out to a real-time engine such as Unreal Engine or Unity.
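The idea of normalizing several device protocols into one shared 3D data set can be sketched as follows. This is a minimal illustration, not Stage Precision's actual data model: the class and function names, unit conventions, and sample values are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical unified pose record; field names and units are
# illustrative, not Stage Precision's real schema.
@dataclass
class TrackedPose:
    source: str                             # originating protocol/device
    position: tuple[float, float, float]    # meters, shared stage origin
    rotation: tuple[float, float, float]    # degrees (pan, tilt, roll)

def normalize_mm(source: str, pos_mm, rot_deg) -> TrackedPose:
    """Convert a millimeter-based protocol sample into the shared 3D space."""
    return TrackedPose(source, tuple(v / 1000.0 for v in pos_mm), tuple(rot_deg))

# Samples from two imaginary devices, assumed already time-aligned:
camera = normalize_mm("camera-tracker", (1200.0, 0.0, 2500.0), (15.0, -3.0, 0.0))
performer = normalize_mm("position-tracker", (-500.0, 0.0, 1800.0), (0.0, 0.0, 0.0))

scene = [camera, performer]   # one data set, ready to hand to a real-time engine
print(scene[0].position)      # → (1.2, 0.0, 2.5)
```

Once every source is expressed in the same coordinate space and units, the combined set can be analyzed or forwarded to an engine as a single stream.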
What are the benefits of using Stage Precision software?
Because our software wrangles data and shares it as a single source of truth to real-time engines and other devices in complex media systems, users no longer need to worry about blueprinting. We handle input, output, and setting up connection profiles for all the different protocols, a process that traditionally requires scripting and coding. This increases accessibility when working with real-time engines and allows teams to focus more on the creative aspects of production. Our software further helps unify creative and technical pipelines, enabling artists and technologists to work in Unreal Engine to solve different parts of the puzzle and ultimately resulting in a smoother production process.
Why was the software initially developed?
Like most modern technology, it came about as a result of solving a problem. bright! studios, the creative studio behind the concept, and Michael Hantelmann, one of the owners of Stage Precision and the lead developer of SP, have together worked on live events and broadcast projects that incorporate AR graphics for the past 15 years. The team found it cumbersome to work with so many different types of protocols and technical languages, so they began looking for a solution to unify all data during production. Stage Precision is the result of their efforts and has evolved into a Swiss Army Knife for production that can handle the most complex AR and virtual production workflows.
What types of beta users are working with Stage Precision software?
Our core customers are from the live event community, including AV and systems integrators developing workflows with performance and object tracking, projection mapping, or lighting desk integration with real-time engines and media servers. As Stage Precision is capable of bringing powerful data connection points into a media server, some of the projects our clients have delivered are impressive. We’re also seeing strong growth in the broadcast AR market, where camera tracking for the shoot and recording data for post-production are critical. Stage Precision records all data to the timeline and it’s instantly available for playback or post-production needs. This allows teams to easily make adjustments and match all content changes to the camera or lens position at the time of the shoot.
Can you tell us about Stage Precision’s Shield plugin?
Shield is an Unreal Engine plugin that offers users more control of broadcast AR workflows. The plugin removes all the complexities of tweaking blueprints and facilitates a drag-and-drop workflow in Unreal, providing native access over lenses, calibration, and other production variables.
Where does AJA technology come into play?
We recommend clients use AJA capture and output cards to synchronize the protocol data that drives control devices and real-time engines, providing frame lock and genlock. Locking our system is critical, because it needs to be accurate. AJA cards allow customers to align data points from different manufacturers and protocols, shifting data within a frame with sub-0.1-millisecond accuracy. AJA cards also receive images for Stage Precision’s lens calibration workflows and for reading the embedded timecode.
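The alignment problem described here, shifting asynchronous protocol samples onto genlocked frame boundaries, can be sketched in a few lines. This is a simplified illustration of the general technique, not AJA's or Stage Precision's implementation; the 50 fps rate, function name, and tolerance check are assumptions for the example.

```python
FPS = 50.0
FRAME = 1.0 / FPS   # frame duration in seconds (20 ms at 50 fps)

def offset_within_frame(sample_time: float) -> float:
    """Signed offset (seconds) of a sample from the nearest frame boundary."""
    nearest_boundary = round(sample_time / FRAME) * FRAME
    return sample_time - nearest_boundary

# A tracking sample arriving 0.05 ms after the tenth frame boundary:
t = 10 * FRAME + 0.00005
err = offset_within_frame(t)
print(abs(err) < 0.0001)   # True: within the 0.1 ms window mentioned above
```

In a genlocked system the hardware supplies the frame boundaries; measuring each sample's offset against them is what lets data from different protocols be nudged into the same frame.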
Our software supports most AJA I/O cards, but we’ve tested and recommend AJA Corvid 88, Corvid 44, Corvid 24, Corvid LHi, Corvid IP, KONA 5, KONA 4, and KONA 1. Our customers lean on us heavily for recommendations when building their own hardware solutions, and we always suggest AJA, the Rolls-Royce of I/O cards, for its accuracy and economical pricing. As new protocols and more complex live event technologies come to market, being able to offer our clients the most precise data system, with support for AJA, ensures the most accurate data synchronization, which is key.
What industry trends are you following?
The big trend right now is the explosion of the immersive experience economy. AR is just the beginning; we’re seeing a whole new way of entertaining and delivering messaging to clients and end users. We’re on the cusp of connecting physical and digital worlds with Stage Precision technology, so we’re well-positioned to play a key role in the metaverse, the evolution of haptic and wearable technologies, and next-generation immersive experiences.