In 2024, RIT hosted an AR-enhanced play of Todd Rundgren’s “Singring and the Glass Guitar”. I was part of the team that developed the effects and synchronized them with the live performance. The project was built in Unreal Engine 4 for the Magic Leap and required working closely with a team of graphic designers and choreographers.
The primary challenge of an AR-enhanced play is ensuring that the effects appear in the correct locations. More advanced AR headsets can map their environment, but the Magic Leap lacked this capability. Instead, we placed a QR code on the stage and calculated its size and orientation from the headset’s camera input. Because the marker’s physical dimensions were known, its apparent size gave us its distance and its orientation gave us the stage’s pose. Together, these let us determine each viewer’s position and orientation relative to the stage and render every animation and effect in the correct place for that viewer.
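To make the math concrete, here is a minimal sketch of the idea in UE4-style C++. This is not our production code: `MarkerInDeviceSpace` stands in for whatever pose the marker detector reports, and `MarkerInStageSpace` is the marker’s known, authored placement on the stage.

```cpp
#include "CoreMinimal.h"

// Given the QR marker's detected pose in this headset's tracking space and
// its known pose in stage space, build the transform that maps stage-authored
// content into this viewer's tracking space.
FTransform ComputeStageToDevice(const FTransform& MarkerInDeviceSpace,
                                const FTransform& MarkerInStageSpace)
{
    // Inverse() maps stage space -> marker space; composing with the detected
    // marker pose then maps marker space -> device space. In Unreal, A * B
    // applies A first, then B.
    return MarkerInStageSpace.Inverse() * MarkerInDeviceSpace;
}

// Effects are authored once, relative to the stage; each headset applies its
// own StageToDevice so the content lines up with the physical set.
FTransform PlaceEffect(const FTransform& EffectInStageSpace,
                       const FTransform& StageToDevice)
{
    return EffectInStageSpace * StageToDevice;
}
```

The useful property here is that every device solves for its own `StageToDevice`, so viewers in different seats all see the effects anchored to the same physical stage.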
Each effect also had to fire for every viewer at exactly the same time. We set up a web server that sent each device a timestamp at which to start. In our tests, server messages took a varying few seconds to reach and be processed by each Magic Leap, so a simple “start now” message would have left the devices out of sync; scheduling against a shared future timestamp absorbed that latency and ensured every device started together (sketched below).

This project was scheduled to take a full year, but I was only available for a single semester, so I had to ensure these systems were modular and well documented. The team of developers that followed was able to expand on these systems and complete the full performance.
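Here is a hedged sketch of that scheduling idea, again in UE4-style C++. The class and function names (`AShowSyncController`, `StartEffects`, `OnStartTimestampReceived`) are hypothetical, and the sketch assumes the device clocks are reasonably synchronized (e.g., via NTP).

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "TimerManager.h"
#include "ShowSyncController.generated.h"

UCLASS()
class AShowSyncController : public AActor
{
    GENERATED_BODY()

public:
    // The server broadcasts one UTC start time a few seconds in the future.
    // Each headset converts it to a local delay, so delivery latency only has
    // to be shorter than the lead time, not identical across devices.
    void OnStartTimestampReceived(const FDateTime& StartUtc)
    {
        const double SecondsUntilStart =
            (StartUtc - FDateTime::UtcNow()).GetTotalSeconds();

        if (SecondsUntilStart <= 0.0)
        {
            StartEffects(); // message arrived past the start time; fire now
            return;
        }

        GetWorldTimerManager().SetTimer(
            CueTimer, this, &AShowSyncController::StartEffects,
            static_cast<float>(SecondsUntilStart), /*bInLoop=*/false);
    }

private:
    void StartEffects()
    {
        // Kick off the cue: trigger the synchronized animations and effects.
    }

    FTimerHandle CueTimer;
};
```

In practice, the server’s lead time just needs to exceed the worst delivery delay observed in testing.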