To capture the unique aspect of the virtual reality gameplay, the creative agency behind the project (Ayzenberg/Space.camp) turned to virtual production to create a spot that placed live actors inside the in-game world.
Filmed at Soapbox Films in Burbank, California, the trailer was shot against a 30x12ft 1.5mm 6K Absen AX Pro LED video wall powered by ARwall's own ARFX Pro Plugin for Unreal Engine and the ARFX Pro Server System (check out our interview with ARwall at NAB), with an HTC Vive handling camera tracking.
A few production highlights:
- The project timeline was just a few weeks; production was only a handful of days.
- Location scouting was done virtually in the game's worlds.
- This was director Jonny Zeller's first experience with virtual production.
- The team used practical effects including synced lighting systems, a treadmill, and a turntable, along with dirt explosions and fog.
Led by VP veterans Rene Amador and Jocelyn Hsu, who served as producers and virtual production supervisors on the project, the ARwall team comprised Environment Artist Parnian Javid, Virtual Production Producer Ryan Arms, Virtual Production Manager Brandon Kakudo, Senior Software Engineer Danny Vargas, and Technical Artist CJ Galarza. Additional environment design work was led by Mark D. Allen of Allucinari.
Virtual Production Prep
What was the prep process like and how was it different from how you tackle a green screen or on-location shoot?
Zeller: One of my biggest pet peeves about green/blue screen is that wrapping on set is only half the game. There's so much post-work to actually finish the job, and that can be terrifying. I love working with VFX, so virtual production gives me the best of both worlds. I get to utilize the effects I want without the unknowns of dealing with them down the line. I also feel like I get better performances when the actors can "feel" the world they're working in.
Knowing how the talent was going to live in the space helped me design the blocking and camera positions. Then I thought about how to push it further and make the whole piece feel “bigger.”
We used every inch of our 30’ LED wall. ARwall was great about listening to my vision and then bringing their own sense of artistry to help the piece come to life.
Klassen: The preparation process for a greenscreen shoot and an on-location production varies significantly in workflow and involvement.
Typically with greenscreen, the majority of the work is handled in post-production, placing a significant creative burden on the editors and visual effects artists. In contrast, shooting on an LED wall offers a distinct advantage by allowing the cinematographer to play a more active role in shaping the final image. When executed effectively, this approach yields more concise and aesthetically pleasing visuals, while reducing both time and budget in the editing room.
One of the key differentiators lies in the creative discussions required in pre-production, before shoot day.
With greenscreen shoots, many creative choices are deferred to the editing phase, meaning certain elements may remain undecided until post-production.
In contrast, working with LED screens necessitates meticulous pre-production planning, leaving no significant aspects open to interpretation on the day of shooting. This shift towards enhanced collaboration fosters the development of highly innovative and imaginative solutions. The increased involvement of the entire team, including the cinematographer, propels the creative process forward, leading to a better final product.
What was the virtual scouting process like inside the game?
Zeller: I’ve always enjoyed physically scouting locations for their energy and real-time visualization. But virtual scouting offered a new way to explore the diverse settings in the game. I absolutely loved it. I could “fly” around the massive levels from the game and pick the locations with the right aesthetic. Seeing the digital world like that helped shape my shot list and opened up my mind to new ideas that I would not have thought about had I not been able to see them prior to shooting.
During the tech rehearsal day, I could work with DP David Klassen to pick the specific frame, lensing, etc. We moved the sun around, turned on virtual lights, knocked out walls, and plugged in a few explosions. Now I'm hooked and don't want to do it any other way.
Klassen: The virtual scouting process proved to be an engaging and productive experience, particularly for those with an appreciation for video games. Navigating through virtual environments and having the ability to select real-time camera angles provided invaluable support to the director and cinematographer, streamlining the vision-formation process significantly.
By eliminating uncertainties and variables, this approach ensured that the intended outcomes were achieved.
If this wasn’t a VP shoot how would you have gone about recreating the video game world?
Zeller: If this wasn’t a virtual production shoot, recreating the video game world and matching it after the fact to the footage we captured would have been so much more tedious and limiting.
Virtual production allowed me to treat a VFX-driven shoot more similarly to a traditional on-location shoot that I’m used to. I felt more comfortable as a director like this.
Klassen: Given our unique circumstances, alternative options for recreating the video game world were limited, making the decision to shoot on an LED wall both creatively and financially prudent. The LED wall offered an ideal solution, aligning perfectly with the project’s creative direction while also adhering to the budgetary constraints.
Klassen: My background in digital spaces, such as 3D animation and video game design, provided the necessary familiarity with the technology required to make a smooth transition into shooting on the wall.
It’s important to say that working on the LED wall requires a specific language and understanding unique to the medium. To effectively communicate with the skilled Unreal Engine technicians and optimize collaboration, it really is essential to grasp the basics of 3D modeling. This knowledge is invaluable in facilitating productive discussions during the production process.
Did you hit any roadblocks or limitations with VP and how did you work around or overcome it?
Zeller: Timing and a short prep window were the source of a lot of our issues. New technology always has bugs, no matter how well it's built. We made some quick pivots and adjusted the shooting process. That, combined with a really knowledgeable team, helped us still capture what we needed and keep the client happy.
Klassen: One prominent challenge we faced was the considerable restriction on camera movement imposed by the wall. Excessive panning or tilting could potentially reveal the edges of the LED wall, disrupting the illusion of the virtual environment. This restriction limited our ability to execute sweeping or rotating shots practically.
To address this issue, we collaborated closely with our skilled Unreal technicians and were able to integrate the camera movements directly into the digital environments. We then complemented those movements with real-world lighting changes, such as sweeping lights around or dimming them up and down. This approach enabled us to achieve the desired effect, allowing our talent to rotate indefinitely within the virtual space.
The Future & ARFX
Has this changed how you’ll think about or film future shoots?
Zeller: Understanding how the tools work feels like half the battle. Once I understand what’s possible, my brain is free to run and create. Working with virtual production is a prime example of this. Time to push the limits!
Klassen: The LED wall technology has absolutely transformed my perspective on future shoots. It stands as a great addition to my creative arsenal, though I recognize that it may not address every situation.
Nevertheless, it has significantly expanded my understanding of what can be achieved within a given budget.
Incorporating the LED wall allowed me to create seemingly boundless scenarios in a remarkably short span. In a single day, I shot atop Mount Everest and, right after lunch, transitioned to crossing the Sahara desert. The level of versatility and immersion it offers is truly remarkable and surprisingly enjoyable.
Moving forward, I am eager to continue to integrate this tool into my projects where appropriate. The creative possibilities it unlocks are inspiring and continue to pique my interest as the technology develops.
Rene – what did the ARFX plugin do that sped up or helped the production process?
Amador: The ARFX Pro Plugin for UE5.X is a virtual production toolset that unifies the workflow of tech scouting, previs, rehearsal, on-set production, and post-production for in-camera effects, AR, and live chromakey shoots. This meant the directorial team on Crossfire was able to dial in their looks ahead of time and then toggle between approved angles, lighting, color, and animations on set.
Because the ARFX Plugin was built with filmmakers in mind, the user-friendly game-style menu interface gave the team the flexibility to tweak these presets as needed. This is dramatically different from the traditional Unreal Editor workflow, where you must program every single in-engine feature as if you were a game developer.
Coupled with the streamlined setup process, this meant that we were saving valuable time on set, transitioning from one scene to the next within seconds thanks to the one-click calibration — which was a key consideration for a VFX-heavy shoot like this one, evoking the in-game experience.
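The preset workflow Amador describes can be pictured with a minimal sketch. This is not ARFX code (the real interface is a menu-driven plugin inside Unreal Engine); it is a hypothetical Python illustration of the underlying idea: looks are dialed in and approved during prep, then switching between them on set is a cheap lookup rather than a rebuild.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ScenePreset:
    """A pre-approved combination of look settings, dialed in before shoot day.
    Field names here are illustrative, not actual ARFX parameters."""
    name: str
    camera_angle: str        # approved framing, e.g. "low wide"
    sun_rotation_deg: float  # virtual sun position in the environment
    color_temp_k: int        # lighting color temperature in kelvin
    animations: tuple        # which environment animations play

class PresetBoard:
    """Hypothetical stand-in for an on-set preset switcher: store approved
    presets during prep, then toggle between them instantly during the shoot."""
    def __init__(self):
        self._presets = {}
        self.active = None

    def approve(self, preset: ScenePreset):
        """Register a look signed off in pre-production."""
        self._presets[preset.name] = preset

    def toggle(self, name: str) -> ScenePreset:
        # Switching looks is a dictionary lookup, not a rebuild:
        # the scene-to-scene change takes seconds, not a re-light.
        self.active = self._presets[name]
        return self.active

# During prep: dial in and approve looks.
board = PresetBoard()
board.approve(ScenePreset("desert_dawn", "low wide", 12.0, 3200, ("dust",)))
board.approve(ScenePreset("desert_noon", "hero close", 90.0, 5600, ("dust", "fog")))

# On set: jump between approved looks instantly.
print(board.toggle("desert_noon").color_temp_k)  # 5600
```

The contrast with the "traditional Unreal Editor workflow" Amador mentions is the point of the sketch: instead of programming each in-engine change by hand during the shoot, everything contentious is decided and stored beforehand, and the on-set operation is reduced to choosing from a short approved list.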
Interview was slightly condensed and edited for clarity.