14.07.2021 — digital humans, our work

Building the Stage — Live Motion Capture

Our first tech demo tested a concept to make virtual concerts feel more immediate and vital.

Tom Faber, Head of Content

Salif had just one request: he wanted Back to the Future shoes. The self-lacing Nike MAGs worn in the iconic sci-fi movies by time-travelling student Marty McFly had actually been produced and released in a limited run, but pairs were selling online for upwards of $20,000. For real-world Salif, they were out of reach.

As much as Salif wanted the shoes, we wanted him. We’d seen him moonwalk in front of the Eiffel Tower and remarked upon the huge online following for his dances. He was the perfect talent for our first tech demo.

The experiment was live motion capture. Big special-effects projects for film and gaming generally capture an actor's movements first and apply them to digital models in post-production; we wanted to see if it could be done in real time.

Salif Gueye's avatar dancing in a digital recreation of New York, wearing Nike Air Mags.

The idea came from the virtual concerts our founder Jonathan Belolo had watched over lockdown. It felt like there was something missing. “I believe the audience would feel more engaged if they know what’s happening is live rather than pre-recorded,” he says. “Between performances Salif can grab his phone and talk to the audience. Down the line the idea is to have audience interaction fully within the 3D space, but for now I just wanted to show it’s possible to do motion capture live.”

What better test for this than a dancer whose every flick of the wrist and spin of the ankle would test the precision of our technology? When we asked Salif, he was immediately into the idea. He just had that one demand—if he couldn’t have Nike MAGs on his real feet, he wanted them in the digital world, where anything is possible. So we crafted a pair out of pixels and put Salif’s virtual avatar into his dream shoes.

After months of preparation, rehearsals and mocap sessions, on January 31st 2021 Salif entered the 8m² capture space of Paris’ Mocaplab wearing a full-body suit with 50 mocap markers, special gloves to capture finger movements and a headpiece with cameras trained on his face. To capture and process the data live, each of these three inputs streamed to a separate computer; all three then fed into a fourth PC, which integrated them into a single body image and sent it into Unreal Engine for real-time rendering.
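To give a feel for what that fourth machine has to do, here is a minimal Python sketch of the fan-in step: three capture machines each push frames over the network, and an aggregator keeps only the latest frame from each and forwards the merged result to the render machine. The port numbers, host name and JSON packet format are assumptions for illustration only; the actual pipeline used dedicated mocap software feeding Unreal Engine.

```python
import json
import socket
import threading
import time

# Assumed UDP ports for the three capture machines (body suit, gloves, head rig).
STREAMS = {"body": 9001, "hands": 9002, "face": 9003}
# Hypothetical address of the PC running the Unreal Engine scene.
RENDER_HOST = ("render-pc.local", 9100)

latest = {}                 # most recent frame received from each stream
lock = threading.Lock()

def listen(name, port):
    """Receive frames from one capture machine and keep only the newest one."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(65535)
        with lock:
            latest[name] = json.loads(data)   # e.g. {"t": 0.016, "joints": [...]}

def forward(fps=60):
    """Merge the newest body, hand and face frames and push them to the renderer."""
    out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        with lock:
            ready = len(latest) == len(STREAMS)
            merged = json.dumps({"streams": latest}).encode() if ready else None
        if merged:
            out.sendto(merged, RENDER_HOST)
        time.sleep(1.0 / fps)

if __name__ == "__main__":
    for name, port in STREAMS.items():
        threading.Thread(target=listen, args=(name, port), daemon=True).start()
    forward()
```

Keeping only the newest frame from each stream, rather than queueing everything, is one way a setup like this can trade the occasional dropped frame for the kind of low-latency result described below.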

“Each gyration of his hips was replicated faithfully by his digital avatar with no perceptible lag”

The whole experiment was streamed live using deliberately simple means: a smartphone broadcasting to Instagram Live. “I wanted to use simple tech to prove beyond a doubt that what you were seeing was completely live,” explains Jonathan. From the first moments there were over 1,000 viewers online, and streams of heart emojis poured across the screen.

“What’s about to happen is incredible,” Salif told his fans, “it’s going to be revolutionary!” The home audience’s view was split across two scenes: above they could see Salif dancing in the mocap studio, and below they could see on a monitor how his movements were translated in real time into the virtual world, where his avatar wore a canary-yellow T-shirt and danced on the rain-slick rooftop of a New York City tower block.


Salif danced as if he was sliding across ice, each gyration of his hips and every spin replicated faithfully by his avatar with no perceptible lag. He performed three dance routines to music by Michael Jackson and Tyler, the Creator. The fans loved it. “Can I ask you for a dance lesson?” one commented at the end. After months of preparation, it was all over in 15 minutes. The team burst into applause and Salif said goodbye with a huge grin. 

Jonathan considered the experiment a success, proving how immediate and compelling a virtual show can be when the performance is live and the audience can interact. In this case that interaction was simply Salif talking to his fans in between sets, but in future the exchange between performer and crowd could be far more sophisticated: the artist could have a screen in their studio showing the avatars of audience members from all over the world, and could encourage them to raise their hands, sing along, or even talk to them in real time.

There were a few technical hiccups, naturally. The video recording of the event failed, meaning that the IG Live stream is the only evidence that remains. “Everything that was virtual and high-tech worked like a charm, and the traditional tech failed,” Jonathan comments wryly. The team also learned how important it is to fit in as many rehearsals as possible, and that more attention must be paid to the minute details of facial expressions to achieve believable emotion in the rendered avatar. “We promised that next year, on January 31st 2022, we’ll do the same thing with Salif and show how much progress we’ve made in a year,” Jonathan adds.

“It was really magical and extremely frightening. We spent so much time on this and so many things could have gone wrong, but everybody was so passionate and we were surprised by how smooth it was—a great test for building our future product. Let me tell you, I had barely slept for days before the tech demo, but on the night after the show, I slept like a baby.”

About Stage11

Informed by the voices of fans, gamers and digital creators, we are venturing into the unknown to establish a bold new digital home for music and culture.
