Red vs. Blue is officially over. On Tuesday, Warner Bros. Discovery released Red vs. Blue: Restoration, the final installment in the long-running saga that was once at the forefront of a whole new form of entertainment: web videos created from in-game footage. Machinima signaled a new world where that footage—of Halo, in Red vs. Blue’s case—could power viral clips. That was 2003. Now it seems as if Restoration might be machinima’s swan song.
“Machinima directors use game engines, which allow them to record a scene from any conceivable angle, like a Hollywood director uses a cinematographer,” WIRED wrote in a 2002 piece heralding the potential of this new filmmaking technique. When it launched a year later, Red vs. Blue exemplified those possibilities. The series was created by linking several Xboxes together and recording footage of a Halo multiplayer match, then adding voiceover. The absurdist, existential tone of the dialog was a hilarious counterpoint to (and commentary on) the run-and-gun gameplay of the first-person shooter used to create it. The show’s creators founded a production company, Rooster Teeth, and made more than a dozen seasons’ worth of episodes.
Red vs. Blue would go on to develop a huge fan base and become a geek touchstone in the two decades that followed—which is why Restoration’s release feels like an ignominious sendoff. In March, Rooster Teeth general manager Jordan Levin announced that Warner Bros. Discovery, now Rooster Teeth’s parent company, was shutting down the studio, and it soon became clear that the IP was being split up and sold off for parts. Today, the final installment of Red vs. Blue is being unceremoniously dumped onto streaming platforms with minimal fanfare or promotion.
It’s a sad moment for fans of Red vs. Blue and Rooster Teeth, but it’s a great moment to reflect on the impact the web series had. Machinima isn’t talked about much these days, but across the media landscape, you’ll find people using games to create everything from streams to clips to GIFs to art films, and doing it in ways that were unimaginable 21 years ago. “Machinima is not a word we use anymore, and it’s not really something we think of as like a medium or a genre anymore,” says Adam Bumas, a writer for the internet culture newsletter Garbage Day. “But it’s still going strong. In fact, it’s everywhere.”
What hath machinima wrought? For starters, look at the phenomenon of Fortnite concerts. Over the past few years, major recording artists like the Kid LAROI, Ariana Grande, and Travis Scott have performed sets for millions of people logged in to the game world. (Lil Nas X did a similar virtual event inside of Roblox.)
“The reason those concerts happened is because Epic realized that people were just hanging out in Fortnite and not even playing,” notes Bumas. “It’s like an evolution of a social space.” And since Fortnite’s gameplay centers on building and creating things as much as on shooting, it was only natural that Epic would also lean into developing tools that help people express themselves and entertain each other within the game world.
The game publisher has also developed tools that let filmmakers use the underlying game engine that Fortnite runs on in their production process. For instance, Industrial Light & Magic has employed Epic’s Unreal Engine in its StageCraft virtual on-set production process since the first season of The Mandalorian. For the most recent season, the company used Unreal to help actors and filmmakers visualize how a CG droid character would interact with flesh-and-blood actors.
“When you’re confronted with a sea of green and representations of characters on ping-pong balls or tennis balls, it becomes a pretty daunting experience for the actors and the director,” Epic Games’ chief technology officer, Kim Libreri, tells WIRED. “I think what we’ve been able to do here is give control back to the filmmakers.”
In a different galaxy far, far away, artist Tim Richardson recently collaborated with fashion designer Iris van Herpen on the CG short Neon Rapture, which was also made with Unreal. The tech allowed van Herpen to push her eye-popping concepts and designs further than she ever could have in the real world, and Richardson says that the game engine was his “sound stage” for the production. Where the Red vs. Blue creators simply captured footage of themselves playing Halo, Richardson worked with a toolkit designed specifically for rendering content rather than delivering a play experience. It allowed the filmmaking team and the fashion designer to prototype every aspect of the shoot, from designs to lighting to costumes to sets, and to mix motion-capture data with a digital environment on the fly to figure out their shots.
“It was the closest thing to shooting live-action I’ve experienced in VFX-based filmmaking,” Richardson says. “I was able to share ideas and collaborate with Iris on a time-scale impossible in linear VFX. I see game engines as an essential aspect of my future work.”