Marvel live show reveals the future of stage tech

Exclusive: The tech team behind Marvel Universe LIVE explains how it pulls off the stage show's special effects.

An actor playing Loki performs in a rehearsal for Marvel Universe LIVE.

Feld Entertainment

July 25, 2014

Marvel Universe characters are taking the battle between good and evil on the road via a live stage show that’s likely to attract not only comic fans, but also those seeking the latest action-packed uses of infrared theater technology.

The new Marvel Universe LIVE show has an arsenal of innovative stage technology that includes 3-D projection mapping and effects that fire off with pinpoint accuracy thanks to an infrared tracking system incorporated into the production elements.

“Iron Man never misses. Ever,” says J. Vaught, Feld Entertainment’s vice president of ice and stage operations and creative development, who has ensured this by tricking out all the actors and set pieces with infrared tracking beacons. These beacons are picked up and tracked by infrared cameras mounted all around the set, and the camera data feeds into computers that create a 3-D projection map of the stage show. This way, Mr. Vaught and his crew know where every prop, every piece of scenery, and every performer is throughout the entire show.
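In software terms, the tracking side of that system amounts to keeping a live map of where every beacon was last seen. Here is a minimal sketch of the idea in Python; the beacon names, coordinate frame, and update format are illustrative assumptions, not details of Feld's actual system:

```python
from dataclasses import dataclass
import time

@dataclass
class BeaconFix:
    """One position report for an infrared beacon, in stage coordinates (feet)."""
    beacon_id: str
    x: float
    y: float
    z: float
    timestamp: float

class StageMap:
    """Keeps the latest known position of every tracked beacon on stage."""
    def __init__(self):
        self._latest = {}

    def update(self, fix: BeaconFix) -> None:
        # Only keep a fix if it is newer than the one already on record.
        current = self._latest.get(fix.beacon_id)
        if current is None or fix.timestamp > current.timestamp:
            self._latest[fix.beacon_id] = fix

    def position_of(self, beacon_id: str):
        fix = self._latest.get(beacon_id)
        return (fix.x, fix.y, fix.z) if fix else None

# Example: camera software reports where a hypothetical "iron_man_glove" beacon was seen.
stage = StageMap()
stage.update(BeaconFix("iron_man_glove", x=42.0, y=5.5, z=12.0, timestamp=time.time()))
print(stage.position_of("iron_man_glove"))
```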


Once the team has total awareness of the stage, it can start projecting images in precise locations. That means that when Iron Man shoots a bad guy, the 3-D projection of his repulsor ray beam finds its target every time, even if an actor or set piece misses its mark.
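Conceptually, that means the beam is drawn between two tracked positions rather than at a fixed spot on the canvas. A rough sketch of the conversion, assuming the back wall is treated as a flat canvas with the 110-by-36-foot dimensions cited later in this story and a hypothetical projector resolution:

```python
# Assumed: the back wall is treated as a flat canvas 110 ft wide by 36 ft tall
# (dimensions from this story); the projector resolution is a placeholder.
WALL_WIDTH_FT, WALL_HEIGHT_FT = 110.0, 36.0
CANVAS_W_PX, CANVAS_H_PX = 3840, 1260

def wall_to_pixels(x_ft: float, y_ft: float) -> tuple:
    """Convert a tracked stage position on the back wall (feet, origin at the
    bottom-left corner) into projector pixel coordinates (origin at top-left)."""
    px = round(x_ft / WALL_WIDTH_FT * (CANVAS_W_PX - 1))
    py = round((1.0 - y_ft / WALL_HEIGHT_FT) * (CANVAS_H_PX - 1))
    return px, py

# The beam sprite is drawn from the hero's tracked glove beacon to the villain's
# tracked chest beacon, so it lands on target wherever the actors actually stand.
# (These beacon positions are invented for the example.)
glove_px = wall_to_pixels(40.0, 6.0)
villain_px = wall_to_pixels(75.0, 5.0)
print(glove_px, villain_px)  # endpoints for the rendered repulsor-ray effect
```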

“There is really a lot to geek-out about when it comes to this show,” says Vaught. “Candidly, the way that everything transferred from the servers to the real world blew my mind the first time we flipped the switch and saw it come together on a live stage.”

The show is touring 85 cities, starting in Tampa, Fla., in late July. The first leg of its North American tour will include stops in Washington, Philadelphia, Nashville, Miami, and Atlanta, among other cities along the East Coast.

Projection mapping, also known as video mapping, turns objects into display surfaces for video projection. When a high school teacher uses a projector in class, any piece of furniture that sits between the projector's beam and the wall gets part of the image splashed across it.

With projection mapping, however, the infrared beacons and cameras tell the computer exactly which surfaces get the image and which don't.


If teachers used Vaught’s techniques, they could project the class work onto a single desk, and nothing else in the room would catch the image.
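In rendering terms, that selectivity comes down to masking: the frame is blacked out everywhere except over the surface the tracking system says should receive it. A toy illustration in Python with NumPy, where the frame size and desk rectangle are made-up values rather than numbers from the show:

```python
import numpy as np

# Hypothetical projector frame: height x width x RGB.
frame = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # the "class work" image

# Suppose the tracking system reports the desk occupying this pixel rectangle.
desk_top, desk_left, desk_bottom, desk_right = 600, 800, 900, 1200

# Build a mask that is 1 over the desk and 0 everywhere else, then black out
# every pixel outside the desk so only the desk surface receives the image.
mask = np.zeros((1080, 1920, 1), dtype=np.uint8)
mask[desk_top:desk_bottom, desk_left:desk_right] = 1
masked_frame = frame * mask

print(masked_frame[750, 1000], masked_frame[100, 100])  # [255 255 255] [0 0 0]
```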

Many TV viewers were introduced to projection mapping during the opening ceremony of the 2014 Winter Olympics in Sochi, Russia.

“But you see, we have taken this 3-D projection technology much farther than they did with the Russian Olympics,” Vaught says from his office in Ellenton, Fla.

For the Marvel stage show, the projection runs behind the live action of actors, motocross bikers, martial artists, and stuff that blows up, all while a musical soundtrack and sound effects are interspersed with the action.

The new tech allows directors to create the impression of movement on normally motionless stage objects.

Vaught says it took two years to create the projections and complete all the mapping, a coordinated effort among production houses, vendors, and engineers.

Projection surfaces for the traveling show include a back wall 110 feet across and 36 feet (about three stories) tall, plus a floor 70 feet wide by 140 feet long.

In addition to those flat surfaces, Marvel’s show uses a computer-aided design (CAD) program to create “malleable perspective” by wrapping images onto three-dimensional objects.

In this case, that means projecting onto spheres, ramps, flooring, walls, ceiling, and hexagonal spaces.

All those moving and stationary pieces had to be laser scanned and then loaded into the CAD program.
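The wrapping step is essentially projective texture mapping: each scanned vertex is pushed through the projector's view, and the result tells the renderer which part of the image lands on which part of the object. A bare-bones sketch in Python with NumPy, using an identity projector matrix and sample points as placeholders rather than data from the show's CAD files:

```python
import numpy as np

def project_uvs(vertices: np.ndarray, view_proj: np.ndarray) -> np.ndarray:
    """Compute texture coordinates for a scanned mesh by pushing its vertices
    through a projector's view-projection matrix (projective texture mapping).
    vertices: (N, 3) array of points in stage coordinates."""
    n = vertices.shape[0]
    homogeneous = np.hstack([vertices, np.ones((n, 1))])  # (N, 4)
    clip = homogeneous @ view_proj.T                      # (N, 4) clip space
    ndc = clip[:, :3] / clip[:, 3:4]                      # perspective divide
    return (ndc[:, :2] + 1.0) / 2.0                       # map [-1, 1] to [0, 1]

# Toy example: an identity "projector" and a few scanned points.
verts = np.array([[0.0, 0.0, 1.0], [0.5, 0.25, 1.0], [-0.5, -0.25, 1.0]])
print(project_uvs(verts, np.eye(4)))
```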

Google has designed its own technology, called Tango, for quickly mapping out 3-D environments. “I wish that technology had been around when we began doing this production," says Vaught. "We probably would have been done far sooner."

Another perk of the infrared beacons and sensor arrays, he says, is that audiences will not be yanked out of the illusion of the 3-D environment by projected effects getting washed out when spotlights or other lighting inadvertently crosses the projection stream.
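One way to picture that coordination in code, purely as an illustration and not as a description of Feld's lighting system, is a conflict check that flags any spotlight whose cone covers a tracked projection surface. All positions and angles below are invented:

```python
import math

def spotlight_hits_surface(light_pos, light_dir, cone_half_angle_deg,
                           surface_center) -> bool:
    """Rough check: does a spotlight's cone contain the center of a projection
    surface? If so, that light would wash out the projected effect there."""
    to_surface = [s - l for s, l in zip(surface_center, light_pos)]
    dist = math.sqrt(sum(c * c for c in to_surface))
    norm = math.sqrt(sum(c * c for c in light_dir))
    cos_angle = sum(a * b for a, b in zip(to_surface, light_dir)) / (dist * norm)
    angle_deg = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle_deg <= cone_half_angle_deg

# A spotlight aimed toward stage-left does not cover a projection at stage-right.
print(spotlight_hits_surface((0, 30, 0), (1, -1, 0), 12.0, (50, 5, 0)))  # False
```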

“My favorite moments in the show are the transitions,” Vaught says. “There’s a moment when we’re leaving New York City in the Quinjet and you’re saying to yourself, ‘Holy Cow! I’m flying between buildings!’”

The audience will, via 3-D projection mapping, fly around the world with a detailed overview of New York City while traveling to locations within the Marvel Universe: the Avengers’ Tower, AIM Lab, Hydra Island, and, of course, Loki’s Fortress.