10 Questions to Create Amazing XR

Whether you’re at the conceptual phase or midway through production, asking these 10 questions will bring clarity to your creative process. I use them with my clients to identify what’s most important and build a roadmap for success.

1) Who is your audience?

Before we get to all the epic cutting-edge toys and tech, this is the reason we’re making something at all. Getting really clear on who your audience is, is always the first step toward creating a revolutionary experience. Ask yourself: is there a target demographic? What are their needs, desires, fears, biases, past experiences, and future expectations? This is the bedrock of your decision making.

2) How do you want to affect your audience?

This is where we focus on three main areas – the brain, the heart, and the actions. The brain is the analytical part of the circuitry. It’s concerned with the plot of the story, the facts at hand, and the reason for the season. If you’re a destination travel agent showing clients a VR teaser of your latest island resort, then you might want them to think “That’s a gorgeous location. My family would love that.”

When I created Excision’s opening concert visuals, I wanted his audience to think, “Wow, we’re blasting off, and going to another dimension. The rest of this show won’t be like anything I’ve ever seen on earth.” The thought process is usually the easiest part, because it often comes first.

Next, how do we want them to feel? Everyone remembers the plot of the movie, but the subconscious element that moves them to tears is the music score playing in the background. I created a VR music video for Red Bull called Money, which simulated the experience of dropping through layers of dreamstates, starting with waking reality and ending in primal unconsciousness. The first gateway to this realm has us catapulting with our hero down into the bed, and into the dream. We broke a cardinal rule of 3DOF VR and changed the perspective of the viewer from looking straight ahead to looking straight down. Our actor Danny G is the focal point, which gives us familiarity and stabilizes us, but the background is the ground, and the further he falls away from us the more we realize that we’re not in Kansas anymore. And right as our conscious brain is connecting to this subconscious understanding, our hands wipe to Danny’s first-person perspective.

The result is like doing a big drop on a rollercoaster, and is designed to kick in a bit of adrenaline to start the show off. Together this matches with the music to stir the emotions. When thinking about emotions we usually have to consider them as a journey. Movies with happy endings always have a moment of “all is lost” right before the conclusion. The same holds true for emerging media.

Finally, what do you want your audience to do? What’s the call to action? With the travel example, do you want them to book a flight? Is your metaverse gala going to raise money for charity? Is your Instagram AR filter going to create shareable experiences for your fans to promote your product? These questions will all inform the next steps.

3) Is it active or passive?

Passive experiences are like books or TV shows. Active experiences are more like video games, where the audience is making decisions and has agency over the story. While a VR documentary might be great to watch passively, a VR dungeon adventure will ask you which door to enter, and measure the accuracy of your sword skills. Both approaches are powerful, and most experiences blend the two. Passive VR experiences play out differently based on which direction you look. And even the most open-world RPGs still have sections that are on rails.

Money VR (Left) and Axon VR (Right) are both live-captured experiences. However, only Axon is considered active because the player makes decisions that affect the branching narrative. Even though they’re both shot similarly, Axon has to be programmed inside of a game engine, while Money was created with traditional editing and 3D software.
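Mechanically, that’s the difference between a timeline and a graph: a linear piece is one sequence of shots, while a branching piece is a set of decision points the engine walks at runtime. Here’s a minimal Python sketch of that kind of structure; the scene names, clips, and choices are hypothetical, not from either project.

```python
# Minimal sketch of a branching-narrative graph, the structure a game engine
# walks at runtime. Scene names, clips, and choices are hypothetical.

SCENES = {
    "dungeon_entrance": {
        "clip": "entrance.mp4",
        "choices": {"left_door": "spider_lair", "right_door": "treasure_room"},
    },
    "spider_lair": {
        "clip": "spider_lair.mp4",
        "choices": {"fight": "victory", "flee": "dungeon_entrance"},
    },
    "treasure_room": {"clip": "treasure_room.mp4", "choices": {}},
    "victory": {"clip": "victory.mp4", "choices": {}},
}


def play(scene_id: str) -> None:
    """Walk the graph: play each clip, then ask the viewer to choose a branch."""
    while True:
        scene = SCENES[scene_id]
        print(f"Playing {scene['clip']}")
        if not scene["choices"]:
            return  # leaf node: the experience ends here
        choice = ""
        while choice not in scene["choices"]:
            choice = input(f"Choose {sorted(scene['choices'])}: ").strip()
        scene_id = scene["choices"][choice]


if __name__ == "__main__":
    play("dungeon_entrance")
```

A passive piece, by contrast, is a single timeline, which is why it can be finished in editing and 3D software without ever touching an engine.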

So think about your audience, what you want them to do, and what level of agency will make the most impact. This will all dictate what form of XR it is, and what tech is needed to pull it off.

4) What version of Extended Reality (XR) is it?

What device is your audience going to experience it through?

There are pros and cons to every medium, and the lines between them overlap. If you’re at the stage where you’re not sure which one is right for you, just circle back to this section after you’ve read the rest.

Quick list:

– PC VR (Virtual Reality) – Oculus Rift, HTC Vive, Valve Index

– Standalone VR – Oculus Quest, Vive Focus, Pico 4

– Headset-based AR (Augmented Reality) – HoloLens, Google Glass, Oculus Passthrough

– Smartphone AR – Instagram, TikTok, and Snapchat lenses and filters

– Metaverse – VR Chat, Horizons, Altspace, Sandbox Metaverse, Decentraland

– Web3.0 Application – NBA Top Shot, Sandbox Metaverse, CryptoKitties

– Mixed Reality – Axon VR, Industrial Training, Beat Saber on Twitch, I Expect You to Die (Game)

– Virtual Production – The Mandalorian, House of the Dragon, The Lion King

– Location Based – The Void, Sandbox VR, Hologate

– Amusement Park Ride – Transformers: The Ride 3D, The Simpsons Ride

– Broadcast Screens – Fox Sports NFL, Dancing with the Stars, Miss Universe

– Projection Mapped Concert –  Excision, Amon Tobin

– Planetarium Dome Show – Childish Gambino, MSG Sphere, Omnispace 360, Wisdome

Cat Sorter is a PC VR game, but we created a mixed reality version to show off the gameplay for the traditional flat-screen commercial.

5) What perspective are we in?

1st Person, 2nd Person, 3rd Person? All of the above?

One of the appeals of VR is that you’re not watching the game. You’re in it! When a person walks up to you and looks you in the eye, it has a powerful impact. In filmmaking this is called breaking the fourth wall. In VR it’s just a first-person experience, because there are no walls between you and the content.

The first police officer training simulation I directed for Axon VR included all three perspectives. It starts off in the first-person POV of an officer at a domestic disturbance call. You meet the husband and wife, who are trying to convince you there’s no problem. You must choose in real time how to respond to the situation. It’s brilliant as a training exercise for new officers, because they can work through what it’s actually like to be lied to by someone who is potentially dangerous.

When their decisions lead them down a failed path, we see the results from a third-person perspective. When they get it right, we go into the second-person POV of the victim. We hear her thoughts, and see the police officers entering her home, and ultimately offering to help her. In this way, we’re able to experience a positive police interaction from the victim’s viewpoint. It creates empathy between the officers and the communities they serve. It’s not always easy to switch gears when people are lying to you. But when you experience the inner world of the victim, it deepens the understanding of what brought them to this place, and how you can help.

When it comes to projection mapping, perspective is everything. Virtual productions use game engines to update LED walls 120 times per second based on the position of the camera. The result is a seamless integration between the actors and a digital environment in real time.

When creating the projection-mapped concert visuals for Excision, I had to calculate the geometry of their wrap-around LED stage in relation to the audience’s eye level. From there I was able to build the temple architecture so that it held up from every angle, even when it was moving and rotating. Every XR modality has its limitations, but if you can design with those in mind, while factoring in the perspective of your audience, you can make creative decisions that turn limitations into opportunities.
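Under the hood, camera-tracked LED walls and audience-level projection mapping lean on the same math: a perspective projection computed from a tracked eye or camera position relative to a physical screen. Here’s a rough numpy sketch of that “generalized perspective projection.” The wall dimensions and eye height are placeholder numbers, not measurements from the Excision stage.

```python
import numpy as np

def off_axis_projection(eye, lower_left, lower_right, upper_left, near=0.1, far=100.0):
    """Perspective frustum for a physical screen viewed from `eye`.

    The screen is defined by three of its corners in world space (meters).
    A full pipeline would pair this with a view matrix built from the
    screen basis and the eye position.
    """
    # Orthonormal basis of the screen plane
    right = lower_right - lower_left
    up = upper_left - lower_left
    vr, vu = right / np.linalg.norm(right), up / np.linalg.norm(up)
    vn = np.cross(vr, vu)                      # screen normal, toward the viewer

    # Vectors from the eye to the screen corners
    va, vb, vc = lower_left - eye, lower_right - eye, upper_left - eye
    d = -np.dot(va, vn)                        # eye-to-screen distance

    # Frustum extents at the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style frustum matrix
    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# Placeholder geometry: a 10 m x 4 m wall, viewer eye level 1.6 m, standing 8 m back
eye = np.array([0.0, 1.6, 8.0])
proj = off_axis_projection(
    eye,
    lower_left=np.array([-5.0, 0.0, 0.0]),
    lower_right=np.array([5.0, 0.0, 0.0]),
    upper_left=np.array([-5.0, 4.0, 0.0]),
)
print(proj.round(3))
```

In virtual production the eye point is the tracked film camera updated every frame; for a concert stage it’s the average audience eye level, which is why getting that measurement right matters so much.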

6) Is it single player or a shared experience?

Individual experiences have the big advantage that the creator has more control over all the variables, and generally they’re easier and cheaper to produce. Shared experiences bring people together in novel ways. It’s one of the most exciting aspects of emerging technology. Whether it’s competition, collaboration, or community building, enhancing human connection is one of the most life-affirming activities we can partake in as creators. This is the cornerstone of why people are excited about the metaverse (other than selling billion-dollar NFTs).

We created InstaQuest as a live Dungeons and Dragons Twitch stream, where armor-clad cosplayers battled their way through a horde of creepy crawlies, all inside a projection-mapped video game. We projected a digital fantasy world all around them, and they’d have to decide between themselves what the best course of action was. We took classic theater of the mind and brought it to life using virtual production magic.

The other layer was audience interaction. The dungeon master would ask the Twitch stream to vote on what would happen next. They’d vote for risk vs. reward, and collectively become the dungeon master themselves. We did this all at a time when most of America was under strict lockdown. It was a fun way to let people be involved remotely, and collaborate on a large scale.
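Mechanically, that kind of audience control can be as simple as tallying keywords from chat during a voting window. Here’s a rough Python sketch of the idea; the message list is a stand-in (a real version would read from the live Twitch chat connection), and the usernames are made up.

```python
from collections import Counter

VALID_VOTES = {"risk", "reward"}

def tally_votes(messages):
    """Count one vote per user from a batch of (username, text) chat messages."""
    first_vote = {}
    for user, text in messages:
        vote = text.strip().lower()
        if vote in VALID_VOTES and user not in first_vote:
            first_vote[user] = vote        # only a user's first vote counts
    return Counter(first_vote.values())

# Stand-in for messages collected during a 30-second voting window
window = [("sir_rolls_a_lot", "RISK"), ("gob1in", "reward"),
          ("dice_goblin", "risk"), ("gob1in", "risk")]
result = tally_votes(window)
print(result.most_common(1)[0])   # -> ('risk', 2)
```

The winning keyword then drives whichever branch the dungeon master (or the game logic) plays next.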

7) What is the technical core?

Unity, Unreal, Adobe, Maya, Blender?

If it’s interactive, you’re probably looking at using an established game engine. If it’s a passive experience you might be able to get away with Adobe and/or 3D software. Realistically, you’re going to be implementing a bunch of different software/hardware solutions, including some that are in beta or created by enthusiastic amateur programmers. My advice is to use tried-and-true software at the core of your project, and supplement it with exciting experimental programs. Maybe you just have to have that new Unity capability from the most recent beta, but if you don’t, there are real benefits to using last year’s stable build with long-term support. Namely, you’ll have way fewer bugs, crashes, and surprise roadblocks.

On the other hand, Maya has been an industry-standard tool for years, but Blender is becoming more powerful year over year, and it might give you everything you need for free. So start with a strong, reliable core and integrate the experimental pieces, knowing that you may need a backup plan if they aren’t cutting it.

In the beginning, all the VR tech was experimental. When I shot Money VR, we created our own helmet camera for an embodied first-person experience. We had to stabilize the footage at the point of the stitch, but then we were using After Effects to composite the back-mounted camera so you could look over your shoulder. The problem was that Mistika (the stitching software) couldn’t talk to After Effects. So we had to bring in a programmer to write a script that would import the tracking data from one piece of software into the other. Today it’s a lot easier to move between applications, but just know there are always new tools coming out that open up new possibilities for your production. You’ve just got to decide whether or not it’s worth the amount of time, energy, and money it’ll take to implement them.
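The glue script itself doesn’t need to be fancy. Here’s a hypothetical Python sketch of the kind of bridge described above: reading per-frame rotation data exported from the stitcher as a CSV and rewriting it as a keyframe list another tool can import. The column names, file names, and output format are made up for illustration; the real pipeline depended on the specific exports each application supported at the time.

```python
import csv
import json

def convert_tracking(csv_path, json_path, fps=29.97):
    """Convert per-frame stabilization data (frame, yaw, pitch, roll in degrees)
    into a simple keyframe list keyed by time in seconds."""
    keyframes = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            keyframes.append({
                "time": int(row["frame"]) / fps,
                "rotation": [float(row["yaw"]), float(row["pitch"]), float(row["roll"])],
            })
    with open(json_path, "w") as f:
        json.dump({"fps": fps, "keyframes": keyframes}, f, indent=2)

# Example usage with hypothetical file names:
# convert_tracking("stitch_export.csv", "compositing_keyframes.json")
```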

8) What are your peripherals?

Controllers, Haptic Feedback, Bass Plate, Lasers, Lights, Pyrotechnics, Chat

One of the exciting things about working in game engines is that the sky’s the limit for what you can add. You can design your VR experience to work with a haptic suit with leg and arm trackers, while running on an omnidirectional treadmill. With Excision’s projection-mapped concert visuals, I also wrote up art direction for the laser lights, pyrotechnics, and fog cues.

Finding the balance with all these toys really depends on how much control you’re going to have over the final user experience. If it’s just going to be added to a digital marketplace, you’ll want to weigh how many people own the peripherals versus how much it’s going to cost to program them. If you’re running a location-based VR experience, you get to decide which peripherals you use. For instance, Sandbox VR has mixed reality weapons that you use to blast your way through pirates, zombies, and aliens. One thing to note: these toys are fun, but only if the gameplay and storytelling are already on point. If you’re not solid in the foundations of what you’re creating, then adding other peripherals will feel like a gimmick.

In Kungfuscius VR we added a Subpac that sent deep bass vibrations through the viewer’s chest whenever they were struck by a ninja. It was an early form of haptic feedback, and could even simulate their heartbeat racing as they were hanging off the side of a building. The whole thing was programmed through sub-audible frequencies in the music track, and brought a new dimension to the experience. The lesson is that if you’re clever enough, you can make your peripherals perform well without a lot of extra programming.
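The trick works because the Subpac simply responds to low-frequency content in the audio mix, so haptic cues can be authored as part of the soundtrack itself. Here’s a small Python sketch of the idea, generating short sub-bass pulses that could be layered under the music wherever a hit lands; the frequencies, timings, and file name are illustrative, not taken from the Kungfuscius mix.

```python
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 48_000

def haptic_pulse(freq_hz=35.0, duration_s=0.25, amplitude=0.8):
    """A short low-frequency sine burst with a fast fade-out: felt in the
    chest on a bass transducer, nearly inaudible on small speakers."""
    t = np.linspace(0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    envelope = np.exp(-6.0 * t / duration_s)          # quick decay, like an impact
    return amplitude * envelope * np.sin(2 * np.pi * freq_hz * t)

# Lay a pulse onto a silent track at each hypothetical "ninja strike" timestamp
hit_times = [2.0, 3.4, 5.1]                           # seconds
track = np.zeros(int(SAMPLE_RATE * 8))
for hit in hit_times:
    start = int(hit * SAMPLE_RATE)
    pulse = haptic_pulse()
    track[start:start + len(pulse)] += pulse

wavfile.write("haptic_layer.wav", SAMPLE_RATE, track.astype(np.float32))
```

Mixed under the score, a layer like this drives the transducer without any dedicated haptics programming, which is the point of the lesson above.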

9) Where is it in relation to the cutting edge?

Do you need to pioneer new software, hardware, filming practices, editing techniques?

The cutting edge is exciting. For many companies in the XR space it’s a badge of honor. That said, you don’t have to reinvent the wheel every time you start a project. The truth is, a lot of emerging media has suffered because it focused on pioneering the bleeding edge. You can use the technological innovations created by others to tell your own unique stories without all the R&D. And believe me, there’s a lot of room to improve on what’s already been created.

Standing on the shoulders of giants is ancient wisdom. If you can tell a great story, it doesn’t have to be an engineering marvel. On the flip side, if you just have to engineer the most complex, cutting edge, transhuman, AI, web3.0, mixed reality omniverse, then there’s never been a better time. The secret is understanding the elements that go into it, and partnering with the engineering teams that want to see their tech in action.

On the Candy Crush TV show (left) we hold the Guinness World Record for creating the world’s largest touch screen. On Mental Samurai (right) we harnessed contestants to a giant robotic arm that swung them from screen to screen as they attempted to answer trivia questions. In both cases we were pioneering new ground, but it only works because it’s based in a tried-and-true format – TV game shows. A good piece of advice: if you want to push the technical edge, stay true to an existing format. If you want to push the format, use proven technology.

Sean “Stroke Hacker” Entin asked if VR could help him regain motor function in the paralyzed left half of his body. I took an existing technique known as “mirror boxing” and created a custom application for him. It’s a VR exercise regimen that allows him to perceive his hand as if it had all of its functionality back. After eight weeks of use, he called to tell me it had worked so well that he was now bench pressing a bar above his head.
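At its core, mirror boxing in VR means tracking the working hand and reflecting its pose across the body’s midline so the virtual paralyzed hand appears to move in kind. Here’s a simplified numpy sketch of that reflection, assuming the midline is the x = 0 plane and poses arrive as a position plus a w-x-y-z quaternion; the actual application handled far more than this, and the sample values are placeholders.

```python
import numpy as np

def mirror_pose(position, quat_wxyz):
    """Reflect a tracked hand pose across the x = 0 (sagittal) plane.

    Position: negate x. Rotation: for a reflection across the YZ plane,
    the mirrored quaternion keeps w and x and negates y and z.
    """
    px, py, pz = position
    w, x, y, z = quat_wxyz
    return np.array([-px, py, pz]), np.array([w, x, -y, -z])

# Each frame, the working hand's controller pose drives the virtual mirrored hand
right_pos = np.array([0.3, 1.2, -0.4])          # meters, headset-relative (placeholder)
right_rot = np.array([0.92, 0.0, 0.38, 0.0])    # w, x, y, z (placeholder)
left_pos, left_rot = mirror_pose(right_pos, right_rot)
print(left_pos, left_rot)
```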

10) What happens outside of the XR?

This is really a wide range of questions. What happens right before, and right after the experience? How are you publishing it? What platforms is it on? Is there a pre-existing community, or do you have to build one? Is it part of a larger piece of intellectual property?

Sometimes art exists in a vacuum, but more often it’s part of something bigger. A great experience leaves the audience better than it found them. Axon’s police training has teachers working side by side with VR simulations. They’re not trying to get rid of the human element. They’re augmenting it. Ask yourself: how will XR make an existing experience more impactful? What little external factors can you add that reinforce the message on a deeper level?

We used the 360 projection of the film to create a dreamy environment for the Money VR premiere party. Ultimately, I think of XR as a way of enhancing our stories and our connections. Both the connections between communities and the neurons in our brains. The more vividly we’re able to create these experiences, the more profound the impact on the audience. The same way a powerful dream changes your view of waking life, a powerful experience can change the way you dream.

Shoot us a line to connect with industry professionals

We’d love to produce, direct, or consult on your next big project.