As the CTO at Stage11, I’m expected to wax lyrical about our technology and dive deeply into the minutiae of our stack. And I will. But not today.
Because I just attended my first stadium concert in almost two years, and seeing Martin Garrix perform an explosive set of EDM in Paris reminded me of the fundamental physicality of concerts. Sound isn’t just something you hear; it’s a vibration you feel with your entire body, punctuated by the kaleidoscopic play of lights and lasers. The feeling of being swallowed by a crowd, the sense of shared experience, of communion, of breathing the same air. It’s somewhere between a tribal and a religious experience.
Unfortunately, while technology is at a point where we can make a digital replica of a concert that looks almost identical to a physical one, I don’t think we’re any closer to replicating what attending one really feels like. So when we say at Stage11 that we’re “reimagining music for the metaverse”, we mean we’ll use the amazing potential offered by interactive technologies to rethink how a music performance could feel in the digital world—how it might be different.
Music is a core part of our human DNA. Some anthropologists suggest that early humans developed the ability to sing before they could speak, and the oldest instrument excavated by archaeologists is a flute more than 50,000 years old. Humans have always needed music, but they also need it to evolve alongside them. Throughout history, every technological revolution has changed how we play and experience music: amplification transformed live performance; radio let a single musician reach millions of listeners across continents; and the internet brought us streaming and algorithmic recommendation. Each revolution has shaped the way music is made, performed and experienced.
“We want to return to the full-throttle transportative power of sound” – Olivier Ozoux, CTO, Stage11
From a visual perspective, perhaps the most instructive precedent for what we’re doing is MTV. When music videos arrived, they announced that music isn’t just something you can listen to; it’s also something you can watch. At Stage11, our vision is to turn music into something you can play with, too. The challenge is to represent music with visuals and interactivity in a way that is just as engaging and profound as a live concert, using all the new potential offered by the metaverse: an experience where the fan is an active participant, not just a passive observer. Just as cinema is more than bringing a camera into a theatre to film a play, creating a space for music in the metaverse is more than simply hosting a concert inside a game.
Notice that we’ve barely used the word “metaverse” until this point? Many argue that the metaverse is a Ready Player One-style permanent virtual internet that you live in 24/7. That is not our view, at least not yet, and possibly not ever (see my earlier point about the feeling you get at a physical concert). As we see it, the metaverse already exists: it lives in communities of people coming together online through shared digital experiences. Our metaverse can be entirely virtual or overlaid on top of your physical surroundings using mixed reality.
It is not about taking you away from your real world, offering an alternate reality to escape a dying planet—we are not in the business of making dystopian fiction a reality. Instead it will be lightweight and flexible, a parallel space which intersects with your own, offering musical experiences which are powerful, but do not seek to replace or overwhelm. Our tagline says “every world’s a stage”, and by this we mean there are many different worlds, and we want to empower users to choose what they want to do and make that meaningful in their own way.
So, what will we actually be offering? What possibilities can the metaverse provide for music? Our technology will unleash the imaginations of both performers and fans. For artists and creators, this means creating spectacular experiences that exist beyond the bounds of physical reality, but it also means unparalleled intimacy: your favourite artist playing music just for you, on your own sofa, an experience we can replicate so that it’s accessible to a vast audience.
For fans, we will offer the opportunity to live inside the musical worlds of their favourite artists, entering an audiovisual space with more fireworks and detail than even the most ambitious concert. Gameplay and non-linear storytelling make them active participants and create strong replay value: different choices can lead to different sounds, visuals or stories.
What’s more, fans will be able to contribute and create collaboratively, rather than being simply a passive avatar lost in a virtual crowd. Ultimately, we will share our tools to allow users to make their own music metaverse experiences, and I’m sure they will be able to think of applications of this technology that we never even dreamed of.
Today, fewer people than ever just put on a song and truly listen to it. The development of streaming platforms means music has receded into the background, something we use to soundtrack the activities of daily life. Music videos and concerts are an antidote to this in a way, adding a visual element which allows people to concentrate wholly on the sounds themselves. We want to return to the full-throttle transportative power of sound, to give people permission to bring the music back into focus. The interactivity and digital pyrotechnics are simply a way to say: come closer to this music, listen deeper. Because ultimately, when you love a song, it becomes more than a piece of art made by someone far away. It becomes yours, too.
Olivier Ozoux is Stage11’s CTO. He is an artist turned technologist who has worked in the animation, games and visual effects industries with companies such as Softimage, Microsoft, Method Studios, Animal Logic and Digital Domain.