Virtual Music Concerts

The History of Virtual Concerts


K-pop and Gorillaz at the forefront

Some of the first innovators in virtual concerts came at the tail end of the 90s.

South Korean music company SM Entertainment first experimented with holographic performances for the boy band H.O.T in 1998. With increasingly busy schedules, a virtual tour would allow the band to appear in more places than physically possible for the members.

Unfortunately, the tour was a failure and it would take another decade before virtual tours properly took off in South Korea.

K-pop superstar Psy, the artist behind 'Gangnam Style', embraced virtual tours in 2013, performing electronically at over 20 venues in South Korea.

On the other side of the world, in the UK, ex-Blur frontman Damon Albarn launched his new band Gorillaz the same year as H.O.T's virtual tour, 1998.

Gorillaz’s band members are the virtual avatars of fictional musicians Murdoc Niccals, 2-D, Noodle, and Russel Hobbs.

Although Albarn and other real musicians are present at concerts, performances and music videos have presented the band as if the virtual members are in control.

Damon Albarn performs in front of a screen displaying the virtual members of Gorillaz. (Mark Allan/Invision/AP)

2Pac’s big appearance

One of the biggest moments pushing forward virtual elements of performance in western music was 2Pac’s appearance at Coachella 2012, 15 years after his death.

A hologram of 2Pac joined Snoop Dogg and Dr. Dre on stage to perform his songs 'Hail Mary' and '2 of Amerikaz Most Wanted'.

At the time, people were awestruck by the lifelike quality of the 2Pac hologram. And while 2Pac's mother was "positively thrilled", others felt it was "unnecessary".

2Pac’s surprise posthumous return to the stage ushered in a new era of holographic performances by beloved stars from beyond the grave.

One of the most recent posthumous returns to the stage came at a 2019 Frank Zappa concert in New York, which saw the guitar virtuoso perform again nearly three decades after his death.

A 1974 recording was used to recreate his exciting live performance. 

There have also been hit hologram tours featuring Roy Orbison, Buddy Holly, and Whitney Houston. An Amy Winehouse hologram tour was planned for 2019, but the idea was shelved after a backlash from fans.

Video game concerts

The most virtual of virtual concerts have taken place in the alternative reality spaces of video games.

Duran Duran made history as one of the first major bands to perform a concert in a virtual world when they hit the stage in 2006 in the game “Second Life”.

A screenshot of users wandering the virtual world of "Second Life". (Peter Zschunke/AP, 2007)

Although other games had featured recorded performances, Duran Duran’s presence in the game was unique as their performance was a permanent feature of the multiplayer virtual world.

Since Duran Duran’s debut in 'Second Life', other live video game concerts have taken place in Minecraft and Fortnite.

Minecraft has even hosted a virtual music festival called Fire Festival, while Fortnite has hosted interactive concerts by Travis Scott, BTS, Diplo and Ariana Grande.

Covid concerts

Many of these video game concerts took place during the coronavirus lockdowns that began in 2020.

Nick Cave of the rock band Nick Cave and the Bad Seeds performs during their concert at the Sziget (Island) Festival. (Balazs Mohai/AP)

With people sheltered at home, fans missed out on gigs and the music industry needed to find a way to keep its artists and production teams afloat.

One Direction’s Liam Payne went on a virtual concert tour that gave fans a personal experience of being on stage with the star. Billie Eilish also gave virtual concerts a try in the midst of the pandemic.

Purposefully not trying to replicate the concert environment, Nick Cave recorded an intimate piano performance in an empty Alexandra Palace in London for a virtual concert in 2020.

While the pandemic saw many stars embrace the virtual concert for the first time, ABBA’s incredible new show may prove that the format will outlive the last few years’ restrictions.

 

Are Virtual Concerts the Future? 


Before COVID-19, we never would have expected to see top music stars like The Weeknd or Travis Scott hold virtual concerts. But seeing your favorite musicians transformed into virtual humans has a unique appeal. With their new virtual avatars, you get to see them in an entirely new context. Here are some of our favorite virtual concerts featuring a human musician:

The Weeknd TikTok Concert

TikTok was one of the first major social media platforms to take the leap of faith with a virtual concert during the pandemic. The Weeknd’s virtual concert featured an animated version of the artist performing his hit song “Blinding Lights,” while virtual backup dancers performed the famous TikTok dance for the song. Everything from his shining sunglasses to the virtual background of neon signs resonated with the song’s mood to enhance the concert experience.

What’s special: Including people’s real-time comments on light-up signs and their usernames on fireworks is a great way to inspire audience participation. TikTok instructed viewers to "share their love in the chat to light up the sky!" and viewers did just that. People come for The Weeknd but stay to see their comments and names in lights next to their favorite artist.

Travis Scott in Fortnite

It’s impossible to discuss virtual concerts without praising the famous Travis Scott concert in the popular game Fortnite. Epic Games has a history of partnering with virtual influencers such as Guggimon to feature in their games. They took virtual concerts to the next level with the Travis Scott performance that captivated over 12 million live viewers. Scott’s avatar is the size of a skyscraper and directly interacts with the audience members’ avatars. At the 2:56 timestamp, you can see him grab two stars from the sky and use their power to levitate everyone’s avatar around him.

What’s special: Travis Scott used the setting of Fortnite to its fullest potential by making his concert mobile. His large avatar walks around the virtual world in Fortnite with the audience members following behind him as he ventures through outer space and even underwater at one point. This technique keeps the audience engaged and motivated to stay through the entire concert as they follow Scott around.

Ariana Grande in Fortnite

For their next virtual concert, Epic Games invited pop star Ariana Grande to perform in Fortnite. While Scott’s concert was a singular performance, Grande’s virtual event took place over the course of several days. Epic Games integrated Ariana Grande's virtual concert into the larger Fortnite narrative and used her performance as a world-building opportunity. The overall theme and message of her concert resonated with Season 7, fittingly titled “Moment of Togetherness.” What better way to join players together than in a live concert event?

In the beginning of Ariana's concert, fallen players awaken in a black landscape. Fellow players extend a hand to help lift each other up as golden orbs float into the sky above them. Tim Elek, the Live Events Art Director, explained that this touching moment was meant to represent Fortnite helping players throughout quarantine. As in Travis Scott's concert, Ariana Grande's giant avatar interacted with viewers during her performance by lifting them into the sky as she sang her hit single "7 Rings."

Virtual Influencers Holding Virtual Concerts 

Now that we understand virtual concerts’ appeal and have seen a human musician in one, it’s time to add the secret ingredient: virtual humans. They already live within the Metaverse, so it feels more like we’re stepping into their world to join them in their natural setting. Here are some of our favorite live performances online featuring virtual pop stars.

Kai Roblox Concert

Roblox is a popular online game that allows users to program and create their own games. People can create their own avatar (with a simple Lego or Minecraft aesthetic) and hang out in virtual worlds with their friends. Kai is a 16-year-old virtual influencer who created Splash, a game within Roblox that allows users to create and perform their music for live audiences. She’s not only a virtual celebrity on Roblox but also an inspiration to other young musicians who look up to her for building her own career online.

What’s special: Kai performed to over 100,000 fans in 7 different Roblox venues simultaneously every hour for over 48 hours. That is not only an impressive feat but a great way to use an expansive online venue like Roblox, where players gather in many different virtual rooms at once. Read our interview with Kai to learn more about her impressive virtual concert experience.

Kizuna AI VR Concert

Kizuna AI is a VTuber (or Virtual YouTuber) with over 4 million subscribers across her channels. On the website for her virtual concert, Kizuna AI wrote, “Online concerts have become a norm in the world we live in now, and because we live in a time like this, I want to deliver a new kind of live performance where the real world and virtual world can blend together!” To accomplish this, Kizuna’s team used a clever blend of VR and AR technologies on a physical stage to create a unique virtual concert that featured both humans and VTubers.

Kizuna AI’s concert is truly the first and only of its kind in the history of virtual concerts. As a virtual being, she bridged the gap between the real and virtual worlds by including both human and virtual guests such as Virtual Kaf and the music producer TeddyLoid. Some human guests were transformed into VTubers, while others remained in their standard form as they danced next to the virtual Kizuna AI.

In the end, the coronavirus has changed the way many of us experience live music by expanding our horizons to new virtual concerts we otherwise may never have imagined. Even the iconic Hatsune Miku, who usually performs in hologram concerts, was forced to switch to a VR virtual concert in 2020. One comment under her video preview explains:

“Miku is always coming to the real world to perform for us, now it’s our turn to get into the virtual world to see her 💙”

This YouTube commenter summarizes the appeal of virtual concerts well: Fans can truly step into the world of their favorite virtual artist or see a human artist reimagined in new dimensions. In a virtual concert, performers can instantly change their avatar’s appearance or transport fans into any place imaginable.

In the Metaverse, fans have infinite possibilities for virtual human interaction.

 

How to Produce a Virtual Concert



The Process of Creating a Virtual Concert

1. CREATING AVATARS

We scan the artist to create a digital avatar or build fictional ones.

2. BUILDING ENVIRONMENTS

Our production team creates customized, interactive stages in a virtualized environment.

3. CAPTURING THE ARTIST

Artists are captured in real time in our MoCap studio so performances feel more connected to the audience.

4. STREAMING THE SHOW

We broadcast the virtual concert to a VR or 2D audience on a global stage.
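
Conceptually, steps 3 and 4 meet in a single loop: captured motion comes in, a rendered frame goes out to the stream. The Python sketch below is a deliberately simplified illustration of that loop; the UDP frame format and the retarget/render/broadcast helpers are invented stand-ins, not any studio's actual pipeline.

```python
# A toy capture-to-stream loop: mocap frames in, broadcast frames out.
import json
import socket

def retarget(joints: dict) -> dict:
    """Placeholder: map the actor's captured skeleton onto the avatar rig."""
    return joints

def render(pose: dict) -> bytes:
    """Placeholder: draw the avatar inside the virtual venue."""
    return json.dumps(pose).encode()

def broadcast(frame: bytes) -> None:
    """Placeholder: hand the frame to the VR/2D streaming encoder."""
    print(f"streamed {len(frame)} bytes")

def run_show(listen_port: int = 9763) -> None:
    """Receive one mocap frame per UDP packet and push rendered frames out."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", listen_port))
    while True:
        datagram, _ = sock.recvfrom(65535)   # e.g. {"joints": {...}}
        frame = json.loads(datagram)
        broadcast(render(retarget(frame["joints"])))

if __name__ == "__main__":
    run_show()
```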

Unreal Engine

Unreal Engine is the standard software of choice for virtual event production due to its real-time capabilities, its advanced handling of every aspect of the production process (e.g. sound), and its smooth animation workflow.

If you’re familiar with creating games in Unreal Engine or building out previs for your virtual sets, then you’re already three steps ahead of the crowd. The process on Unreal is pretty similar to what you’re familiar with. 

How Unreal Engine is used by virtual events and shows

Studios use Unreal to build virtual environments that host fantastical performances in real time. The visual effects can be extremely advanced or understated, depending on the scope of the project.

Most commonly, Unreal Engine is used in three main ways: 

  • To build virtual sets for live production

  • To aid in creating accurate animatics and previs

  • To render virtual venues in real time

In some cases (like that of the Karate Combat show), Unreal Engine is used to record the virtual venue in real time. The recordings are sent to a studio for post-production to form a rock-solid virtual production pipeline.

Another popular use case is using the game engine to add 3D graphics to a live weather broadcast. This is an example of how real-time 3D is slowly creeping into even the most mundane aspects of content creation.

How the music industry pioneered Unreal Engine for virtual event production

Unlike their counterparts in film, celebrity musicians have been quick to pick up the concept of virtual venues. But they’re certainly not limited to the virtual world alone. Deadmau5 is most famous for his mind-blowingly complex events, which extensively use augmented reality technology to create mesmerizing visualizations that stun crowds.

As one of the keynote speakers for Unreal Engine’s 2021 virtual conference, Deadmau5 revealed the highly complex technical considerations that go into planning a show. He describes how he creates an Unreal build to develop a virtual set as a sort of previs for his performances. These builds give his production team a sneak peek so they can see exactly what they need to build to make the vision a reality. Deadmau5 can then focus on coding the accompanying light show with pinpoint accuracy. The game engine renders the elements in real time during live events, even pulling in a live feed of social media posts to make shows a truly “live” experience.
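
As a rough illustration of that last idea, the hedged Python sketch below polls a hypothetical endpoint for new posts and feeds them one at a time into a show loop. A real production would use the platform's actual API and the engine's own HTTP and UI systems; the URL, response shape, and display helper here are invented for illustration.

```python
# Sketch: pulling a live social feed into a show loop.
import json
import time
import urllib.request

FEED_URL = "https://example.com/api/show-posts"  # hypothetical endpoint

def fetch_posts(since_id: int) -> list[dict]:
    """Return posts newer than since_id, e.g. [{"id": 7, "text": "hi"}]."""
    with urllib.request.urlopen(f"{FEED_URL}?since={since_id}") as resp:
        return json.loads(resp.read())

def display_on_stage(text: str) -> None:
    print("ON SCREEN:", text)  # stand-in for a world-space text widget

def show_loop() -> None:
    last_id = 0
    queue: list[str] = []
    while True:
        for post in fetch_posts(last_id):
            last_id = max(last_id, post["id"])
            queue.append(post["text"])
        if queue:
            display_on_stage(queue.pop(0))  # one post per tick, not a flood
        time.sleep(1.0)  # poll once per second; real shows would stream

if __name__ == "__main__":
    show_loop()
```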

A great example of a successful virtual event which saw buy-in from mainstream artists was the Astronomical Fortnite event. As the name suggests, the virtual concert was held within Fortnite on the well-known Fortnite Island. Players could log on and interact with the surroundings, making their avatars dance and run around in the trippy landscape. In a twist, the landscape transformed into a series of immersive virtual worlds, building a better story than any real live performance could hope to achieve.

“Astronomical” on Fortnite in 2020 was one of the first entirely virtual music events, and by far the biggest of its kind at the time.

The real-time immersive 3D experience was a starting point for this kind of virtual show, and it showed significant promise: over 12 million players logged on to the game at the same time.

The concept of virtual concerts was first tested out by Epic Games in 2019 with the DJ Marshmello, bringing a respectable 10 million concurrent players. The 2019 virtual performance had stationary graphics more reminiscent of a real stage, highlighting the incredible advances made in just one year of development. Check it out below.

The technicalities of holding a virtual event through Unreal Engine 

Virtual event film production using Unreal Engine software requires significant investment in building 3D assets, creating a virtual studio and setting up real-world hardware. 

You can expect to invest heavily in LED walls, complex rigs, lighting systems, and cameras for live music events. 

Live stream events that are more post-production heavy, like Karate Combat, have more physical set requirements, such as green screens and matching virtual camera movement.

For virtual productions that exist entirely within the 3D world, expect to get deep into motion capture technology, environment design, and broadcast network requirements.

Digital artists walk you through the creation of the Astronomical Fortnite event. They used a basic environment with props and a suited mocap actor to map out the previs for animators.

If you want a more straightforward setup for a small project, short episodes, or a student experiment, check out how you can create a virtual event using Rokoko Studio. The National Film School of Denmark currently uses this workflow to speed up short-film production with its students.

Considering sound for virtual events

With the introduction of VR headsets, immersive experiences have been the focus of every new feature development. Unreal Engine has built advanced audio tools that model how the ambient sound of music behaves throughout an environment.

So, for example, if you walk into a nearby virtual room from the main concert hall, the volume, reverberance, and pitch of the music will adjust realistically. 
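
To make that concrete, here is a minimal Python sketch of the idea, assuming simple inverse-distance falloff and a crude per-wall occlusion penalty. Unreal's actual attenuation and reverb systems are far more sophisticated; this only shows the shape of the calculation.

```python
# Toy model: listener position drives volume and reverb send.
import math

def audio_params(listener, source, walls_between: int):
    """Return (gain, reverb_send) for a 2D listener relative to a source."""
    dx, dy = source[0] - listener[0], source[1] - listener[1]
    distance = max(1.0, math.hypot(dx, dy))
    gain = 1.0 / distance            # inverse-distance falloff
    gain *= 0.5 ** walls_between     # roughly -6 dB per occluding wall
    # Farther away (or behind walls), the reverberant field dominates:
    reverb_send = min(1.0, distance / 20.0 + 0.2 * walls_between)
    return gain, reverb_send

# Listener in the main hall, 5 m from the stage with line of sight,
# then stepping into a side room 8 m away with one wall in between:
print(audio_params((0, 0), (5, 0), walls_between=0))  # loud, mostly dry
print(audio_params((0, 0), (8, 0), walls_between=1))  # quieter, wetter
```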

See how Unreal Engine handles ambisonic sound field rendering:

It’s good to see that Aaron McLeran is still on the audio team. Watch this, and you can feel good that you already know a lot about audio.

Live control, live events

Working with live, real-time control opens up performance possibilities – even if the performance winds up being virtual.

DMX ins and outs? Yes! MIDI control? Sure! And here’s where this gets crazy – you might be rigging both real-world and virtual lights, together. It’s like being inside and outside the Matrix at the same time. (Just in case you decided to take both the red and blue pills. I shudder to think what you’d do with the Marshmallow Test.)

And yeah, I see an 808 and a Sensel Morph in there! Live external control:
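
If you want a taste of what driving real fixtures from a script involves, here is a hedged Python sketch that builds an ArtDmx packet – the standard way DMX travels over a network via Art-Net, on UDP port 6454. The node IP address and channel assignments are made up for illustration; wire it to whatever your rig actually expects.

```python
# Send one DMX universe over Art-Net (UDP port 6454).
import socket
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet: 'Art-Net' header, opcode 0x5000,
    protocol version 14, then up to 512 channel bytes."""
    data = channels.ljust(2, b"\x00")[:512]   # DMX payload is 2-512 bytes
    return (
        b"Art-Net\x00"
        + struct.pack("<H", 0x5000)           # ArtDmx opcode (little-endian)
        + struct.pack(">H", 14)               # protocol version (big-endian)
        + bytes([sequence, 0])                # sequence, physical port
        + struct.pack("<H", universe)         # sub-uni + net
        + struct.pack(">H", len(data))        # channel data length
        + data
    )

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
levels = bytes([255, 0, 128] + [0] * 509)     # ch1 full, ch3 at 50%
sock.sendto(artdmx_packet(universe=0, channels=levels), ("192.168.1.50", 6454))
```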

3D production and VFX

Working with Blender will be a big boon to artists who develop in that tool. Unreal of course isn’t open source, but combining Blender’s free-as-in-freedom license with Unreal’s free-as-in-beer pricing (for most uses) sure isn’t bad. (There are also other tutorials on importing material from Blender to Unreal, so you can make your models in Blender.)
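
For example, a minimal Blender-side export script might look like the sketch below. It runs inside Blender's own Python environment (not as a standalone script), and the file path and option choices are illustrative rather than a recommended pipeline – tune scale and axis settings to your own setup before importing the FBX into Unreal.

```python
# Export the selected Blender objects as FBX for import into Unreal.
import bpy

bpy.ops.export_scene.fbx(
    filepath="/tmp/stage_prop.fbx",  # example destination
    use_selection=True,              # export only what's selected
    apply_unit_scale=True,           # keep Blender/Unreal units consistent
    bake_space_transform=True,       # bake the axis conversion into meshes
)
```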

Volumetric effects are just spectacular in Unreal now, with its Niagara system:

And Niagara in general is worth a look – a serious VFX particle system in a game engine:
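
Niagara itself is a GPU-driven system with full artist control over every stage, but the toy Python sketch below shows the basic loop any particle system runs each frame – spawn, integrate forces, age out – just to ground what "a serious VFX particle system" is doing millions of times per second.

```python
# A toy CPU particle system: spawn, integrate, expire.
import random

GRAVITY = (0.0, -9.8, 0.0)

class Particle:
    def __init__(self):
        self.pos = [0.0, 0.0, 0.0]
        self.vel = [random.uniform(-1, 1),   # slight sideways spread
                    random.uniform(4, 8),    # launched upward
                    random.uniform(-1, 1)]
        self.life = random.uniform(1.0, 3.0)  # seconds until it dies

def step(particles: list, dt: float, spawn_rate: int) -> None:
    particles.extend(Particle() for _ in range(spawn_rate))
    for p in particles:
        for i in range(3):
            p.vel[i] += GRAVITY[i] * dt   # integrate forces
            p.pos[i] += p.vel[i] * dt     # integrate motion
        p.life -= dt
    particles[:] = [p for p in particles if p.life > 0]  # cull the dead

emitter: list = []
for _ in range(60):                 # simulate one second at 60 fps
    step(emitter, dt=1 / 60, spawn_rate=100)
print(len(emitter), "particles alive")
```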

This stuff is transforming in-camera effects even in TV production. Sure, you may not be HBO, but it also suggests a blurring of the line between real-time effects and what you watch on TV or in film:

Holding a virtual event is a potent engagement tool

The real-time tech that makes virtual events possible is powerful when used for both live shows and the previs of performances. It allows large teams of people to visualize a set or a sequence of events immediately without any space for miscommunication. And that’s one of the biggest benefits of using virtual production. 

 

MADISON BEER


Epic Records artist Madison Beer has released what might be the most photorealistic depiction of a musician yet. The Madison Beer Immersive Reality Concert Experience, a groundbreaking, effects-filled virtual performance that premiered on TikTok LIVE and is now coming broadly to YouTube, VR platforms and more, shows just how far an idea can go when artists set real-time rendering and virtual production loose on their vision.

An ultra-realistic digital avatar of Madison is the centerpiece of a boundary-pushing concert that would be impossible to recreate in real life. Sony Music Entertainment and Verizon worked with Madison to develop a full-scale recreation of New York’s Sony Hall and present a medley of her hits with all the production value you’d expect from a major artist. Only it’s completely virtual—except for the music and performance driving the experience.

For creatively adventurous artists seeking new and innovative ways to connect with audiences, that can be a good thing. While most concerts are limited by worldly constraints, a virtual concert can be whatever an artist wants it to be, giving them the power to shape fan experiences and realize fantastical concepts at a much higher level than is possible in real life. The Madison Beer Immersive Reality Concert Experience takes this idea and runs with it, turning one piece of content into the type of transmedia campaign that can thrill fans from YouTube to VR.

Keeping it real 

For all the leeway afforded to them by 3D, the production team—led by Sony Immersive Music Studios, Magnopus, Gauge Theory Creative, and Hyperreal—still saw value in maintaining a measure of realism.

“When we started with a blank canvas, our creative goal was to construct a virtual concert through photoreal recreations of a real venue and a real artist, but which also layered in enough magic to reimagine the concert experience itself,” says Brad Spahr, Head of Sony Immersive Music Studios.

“You start with things that are totally plausible in a physical setting, because that’s what’s going to make your fans get into it and accept the experience,” says Alex Henning, Co-Founder of Magnopus. “Once you’ve got them hooked with that kernel of truth, you start to build on top of that with the fantastical. And the more you can pull off the former, the more ‘wow’ you get out of the latter.”

For Magnopus, this meant the venue and the VFX packages. For Hyperreal, it meant Madison herself.

Hyperreal started by capturing Madison’s face and body with two separate arrays of high-resolution camera systems in Los Angeles. The first system produced a volume for her face, neck, and shoulders, as it recorded photometric data at the sub-pore level. By capturing the way she moved from every angle, Hyperreal was able to get enough data to construct an ultra-realistic avatar, or “HyperModel,” that steers clear of the Uncanny Valley.

With the help of 200 cameras, Madison’s body, muscles, and shape were then recorded in a range of biomechanical positions to ensure deformation accuracy in Hyperreal’s real-time HyperRig system. After adding Madison’s preferred performance gear—outfit, hairstyle, earrings—Hyperreal brought the avatar into Unreal Engine to experiment with movement before the live capture session at PlayStation Studios in LA.

While this was happening, Magnopus was hard at work on the venue and VFX systems. Like the HyperModel, the goal was to stay as real as possible to ground the event, so when things like star fields started appearing above Madison, they would seem magical and surprising.

After considering a full LiDAR scan, Sony Immersive Music Studios decided to construct the venue from scratch to allow them more control over the lighting. They started with the original CAD files, which were imported into Autodesk Maya and given the full artistic treatment, including all the nuances that make Sony Hall unique. Magnopus was then able to build upon that with lighting and VFX to achieve the overall goal of a reimagined concert experience.

“Sony Hall is an intimate venue with a lot of character, detail and beauty, which made it an ideal environment for the experience,” says Spahr.

“It is also great for VR, because of the scale. It’s not a giant, cavernous arena or a tiny hole-in-the-wall club,” says Henning. “It’s got almost the perfect amount of dimension.”

Since Unreal Engine would be used throughout the creation process, Magnopus made use of its built-in virtual scouting tools to get their cameras set up so they could test the lighting before diving into the special effects. But first, they needed the performance.

The benefits of virtual production for music 

Unlike most motion capture shoots, where everyone could be together, The Madison Beer Immersive Concert Experience was a remote affair driven by teams across the US. In LA, Madison Beer was in a mocap suit and head-mounted camera. In Philadelphia, Hyperreal CEO Remington Scott was directing her in real time, using a VR headset that allowed him not only to view Madison’s avatar face-to-face live within the virtual Sony Hall, but also to adhere to the COVID-19 restrictions that were keeping them apart.

Because Unreal Engine operates in real time, virtual productions can use its remote collaboration tools to stream 3D environments anywhere in the world, completely synced across locations. This allowed Madison’s performance to be recorded in one continuous take, with no cuts and no edits, which was important for a team who wanted the performance to feel as authentic as possible.

After the motion capture shoot was completed and the experience was polished, cameraman Tom Glynn was able to build out the shot selections for the final 9.5-minute performance.

“There are moments where you can’t believe this was done in a game engine,” says Tom Glynn, Managing Director at Gauge Theory Creative. “There’s a 3D world with a performance happening, and it’s happening at the same time that I’m moving a camera around. It’s hard to believe what I was seeing in the viewfinder while I was shooting it. I’m looking at an avatar of Madison Beer and it feels like a real person I’m standing in the room with. It kind of blows my mind.”

In two days, they recorded hundreds of takes, ensuring that they could get any shot they wanted.

“We were hitting the play button, Madison was performing, and Tom was getting his shots in real time. He was instantaneously watching his shots on the monitor directly in front of him. Then we would stop, readjust, hit play, and the concert would go again and he’d get another shot,” says Spahr. “That real-time feedback was huge. If there was one thing about this that was a challenge, it was, ‘I have so many good shots, I don’t know which one to use!’ It was an embarrassment of riches.”

Glynn was surprised by how easy a virtual production experience could be on a cameraman, especially for a “concert” shoot. Traditionally, a live performance would necessitate five to twelve different cameramen set up in strategic parts of the venue with a variety of tripods, dollies, Steadicams, and so on. The team would prepare, shoot it once, and get what they got. In this case, Glynn was able to use all the same equipment, including a handheld rig, but film within a virtual world that allowed for quick takes.

Using Unreal Engine, Glynn was also able to overcome some of the physical limitations of the real world with a few quick commands. For instance, sometimes the shot he wanted was a little above or below Madison’s eyeline. So the team just increased his size by a factor of 1.2 or 1.5 within the environment, and he was suddenly “tall” enough to get it. Other times, he wanted to keep up with her quick moves without introducing bumpiness into the take. So they increased the translation scale by 1.5–2x until one step equaled seven. Problem solved.
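
In other words, tracked operator motion is multiplied by tunable scale factors before it drives the virtual camera. The small Python sketch below illustrates the idea; the class and the numbers are invented for illustration, not the production's actual setup.

```python
# Scale a tracked operator's pose before driving the virtual camera.
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    height_scale: float = 1.2        # "grow" the operator to raise the eyeline
    translation_scale: float = 1.5   # stretch each real-world step

    def map_pose(self, tracker_pos, tracker_height):
        """Map a real (x, y) position and eye height into the virtual set."""
        x, y = (c * self.translation_scale for c in tracker_pos)
        z = tracker_height * self.height_scale
        return (x, y, z)

cam = VirtualCamera()
print(cam.map_pose(tracker_pos=(0.5, 0.0), tracker_height=1.7))
# -> (0.75, 0.0, 2.04): a half-metre step and a 1.7 m eyeline, amplified
```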

Moment makers 

Once the footage was “in the can,” it was up to Magnopus to sweeten it with effects that would not only catch the eye, but would be impossible in real life.

“There’s a sequence where there’s a ring of fire around Madison. There’s a moving experience where raindrops are falling around her. These are things that, due to safety issues, wouldn’t be allowed in a normal concert venue,” says Spahr. “So we are giving a fan the ability to see a concert in a new way, but then we dial it up, with cosmic star fields.”

Magnopus created all the special and lighting effects within Unreal Engine, using real-time ray tracing and the timeline tools in Sequencer, the engine’s built-in multi-track editor, to jump around as they edited different sections of a song. And with Pixel Streaming at its disposal, Magnopus was able to overcome the hardware limitations that box artists in. Pixel Streaming enables you to run an Unreal Engine application on a server in the cloud, and stream it to any browser on any device, just like a YouTube video.

“In real time, you’ve always got a render budget and you can’t go over it. It’s always locked to whatever the target device’s power is and you’ve got a certain amount of things you can render, at a certain level of quality, in order to get the screen to refresh at a refresh rate that you need it to,” says Henning. “Being able to exceed that and go well beyond it, and not make choices of ‘this or that,’ but to choose both—as in I want a photorealistic human render, and I want fire, and I want rain, and I want smoke, and I want atmospherics—is appealing for an artist.”
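
To show the shape of that idea (and only that), here is a toy Python stand-in: render on the server, push encoded frames to any connected client. Real Pixel Streaming uses WebRTC with hardware video encoding and a proper signalling layer; this sketch just ships placeholder bytes over a WebSocket, using the third-party websockets package, so nothing here should be read as the actual Pixel Streaming protocol.

```python
# Toy "render in the cloud, view anywhere" loop (pip install websockets).
import asyncio
import websockets

def render_frame(n: int) -> bytes:
    return f"frame-{n}".encode()            # placeholder for encoded pixels

async def stream_frames(websocket):
    frame_no = 0
    while True:
        frame = render_frame(frame_no)      # heavy GPU work stays server-side
        await websocket.send(frame)         # client just decodes and displays
        frame_no += 1
        await asyncio.sleep(1 / 30)         # ~30 fps

async def main():
    async with websockets.serve(stream_frames, "0.0.0.0", 8765):
        await asyncio.Future()              # run forever

if __name__ == "__main__":
    asyncio.run(main())
```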

One for all

While The Madison Beer Immersive Concert Experience premiered on TikTok LIVE, it was also meant for bigger things. And because it was created in a game engine, the project could be easily transformed for different channels, removing the need for the production teams to recreate the experience for video, VR, and mobile applications.

“If you’re natively working in real time from the beginning, and that’s at the core of all of your deliverables, if those can all be slices of the same pie, as opposed to needing to make an entirely new pie every time, it really opens up the door to unifying things in a more interesting way and doing a lot more with less total effort,” says Henning. “The more we can use the same setups to do the VR piece and the mobile AR/interactive version and the pixel stream version and the 2D video extract for YouTube or TikTok, the more you can focus all your energy and creativity on the world itself.”

But even with incredible experiences like this filtering out into the world, the question still remains: How far will virtual concerts go in the next few years? According to Spahr, really far—where he sees plenty of opportunities for new shows and new ways to use digital avatars to reimagine music.

“Anything an artist can dream up can be brought to life, no matter how fantastical it might be. We don’t have to operate within constraints,” he says. “We don’t have laws of physics. We don’t have fire code and safety [protocols] that we have to abide by. To be able to sit down with an artist and say, ‘Dream up your fantasy experience for your fan. If you could do anything, what would you want to do?’ and to know that the tools and the technology exists to make that a reality is the most exciting thing for artists and the music industry.”

 

FINAL EXAMPLES