(2024-04-22) How Perfectly Can Reality Be Simulated

How Perfectly Can Reality Be Simulated? Video-game engines were designed to mimic the mechanics of the real world. They’re now used in movies, architecture, military simulations, and efforts to build the metaverse.

Quixel creates and sells digital assets—the objects, textures, and landscapes that compose the scenery and sensuous elements of video games, movies, and TV shows. It has the immodest mission to “scan the world.” In the past few years, Caron and his co-workers have travelled widely, creating something like a digital archive of natural and built environments as they exist in the early twenty-first century.

Quixel is a subsidiary of the behemoth Epic Games

Video games have long bent toward realism

Most game developers rely on third-party engines like Unreal and its competitors, including Unity. Increasingly, they are also used to build other types of imaginary worlds, becoming a kind of invisible infrastructure. Recent movies like “Barbie,” “The Batman,” “Top Gun: Maverick,” and “The Fabelmans” all used Unreal Engine to create virtual sets.

“It’s really coming of age now,” Tim Sweeney, the founder and C.E.O. of Epic Games, told me. “These little ‘game engines,’ as we called them at the time, are becoming simulation engines for reality.”

In 2011, Quixel began capturing 3-D images of real-world objects and landscapes—what the company calls “megascans.”

most of Quixel’s assets are created on treks that require permits and months of planning, by technical artists rucking wearable hard drives, cameras, cables, and other scanning equipment

Sweeney, Epic’s C.E.O., has the backstory of tech-founder lore—college dropout, headquarters in his parents’ basement, posture-ruining work ethic—and the stage presence of a spelling-bee contestant who’s dissociating. He is fifty-three years old, and deeply private.

“It’s probably going to be in our lifetime that computers are going to be able to make images in real time that are completely indistinguishable from reality,” Sweeney told me.

Sweeney’s lodestar was a company called id Software. In 1993, id released Doom

id also took the unusual step of releasing what it called Doom’s “engine”—the foundational code that made the game work

Online, Doom “mods” proliferated, and game studios built new games atop Doom’s architecture.

they were proofs of concept for a new method and philosophy of game-making

Sweeney thought that he could do better. He soon began building his own first-person shooter, which he named Unreal.

Today, Unreal Engine’s user interface looks a little like a piece of photo- or video-editing software; it offers templates such as “third-person shooter” and “sidescroller.”

Epic’s demos are so system-intensive that they would slow to a stutter on the average laptop.

Last year, the company laid off sixteen per cent of its workforce; Sweeney cited a pattern of spending more money than the company was bringing in

Today, some major game studios, such as Activision Blizzard, which makes Call of Duty, still use their own proprietary engines. But most rely on Unity, Unreal, and others. A number of big-budget titles—including Halo, Tomb Raider, and Final Fantasy—have recently traded their own engines for Unreal.

certain things remain hard to simulate. There are multiple types of water renderers—an ocean demands a kind of simulation different from that of a river or a swimming pool—but buoyancy is challenging, as are waves and currents.
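A rough illustration of why buoyancy is deceptively tricky: the physics itself is just Archimedes' principle, but an engine has to estimate how much of an object is submerged under a moving, wavy surface every frame. This is a minimal still-water sketch in Python, not code from Unreal or any actual water renderer; all names here are made up for illustration.

```python
# Illustrative only: Archimedes' principle for an axis-aligned box in
# still water. Real engines must do this against an animated wave
# surface, per object, per frame -- which is where it gets hard.

WATER_DENSITY = 1000.0  # kg/m^3, fresh water
GRAVITY = 9.81          # m/s^2

def buoyant_force(base_area, height, bottom_z, water_level_z):
    """Upward force (newtons) on a box whose bottom sits at bottom_z."""
    # Clamp the submerged depth between "fully above water" and "fully under".
    submerged_depth = max(0.0, min(water_level_z - bottom_z, height))
    displaced_volume = base_area * submerged_depth       # m^3 of water pushed aside
    return WATER_DENSITY * GRAVITY * displaced_volume    # F = rho * g * V

# A 1 m^2 crate, 0.5 m tall, floating with its bottom 0.2 m below the surface:
force = buoyant_force(1.0, 0.5, -0.2, 0.0)
print(round(force, 1))  # 1962.0 N, from displacing 0.2 m^3 of water
```

The clamping line is the whole trick in this toy version; with real waves and currents, "how deep is the bottom of this object right now" has no such one-line answer.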

it’s incredibly difficult to realistically simulate humans. “The solution to fluid dynamics and to fire and to all these other phenomena we see in the real world is just brute-force math,”
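The "brute-force math" point can be made concrete with the kind of grid update that textbook fluid and smoke solvers iterate many times per frame: each cell simply exchanges a little of its value with its neighbors. This is a generic explicit-diffusion sketch in Python, not Epic's code or any particular engine's solver.

```python
# Illustrative of "brute-force math": one explicit diffusion step over a
# 2-D grid, the basic move in grid-based smoke/fluid solvers. Engines run
# steps like this (plus advection and pressure solves) at scale, every frame.

def diffuse_step(grid, rate):
    """Blend each interior cell toward the average of its four neighbors."""
    h, w = len(grid), len(grid[0])
    out = [row[:] for row in grid]  # copy; boundary cells stay fixed
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbors = grid[y-1][x] + grid[y+1][x] + grid[y][x-1] + grid[y][x+1]
            out[y][x] = grid[y][x] + rate * (neighbors - 4 * grid[y][x])
    return out

field = [[0.0] * 5 for _ in range(5)]
field[2][2] = 1.0  # a puff of smoke in the center
field = diffuse_step(field, 0.1)
print(field[2][2], field[2][1])  # 0.6 0.1 -- the puff spreads to its neighbors
```

There is no judgment call anywhere in the loop, which is the contrast the article is drawing: fire and water yield to iteration, while a human face has to satisfy a viewer's instincts.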

But humans have an intuitive sense of how others should look, sound, and move, which is based on our evolution and cognition

Fortnite, which is made with Unreal, is a cultural phenomenon, with about a hundred million monthly players. Its most popular mode is Battle Royale, in which players blast one another with weapons. But there are also more social modes. There are now live concerts in Fortnite, attended by millions of people.

In February, Disney invested $1.5 billion in Epic, for a nine-per-cent stake in the company; Bob Iger, the C.E.O. of Disney, has said that he plans to create a Disney universe in Fortnite

Mastilović suggested that MetaHumans could one day be used to create autonomous characters. “So it will not be a set of prerecorded animations—it will be a simulation of somebody’s personality,” he said.

a simulation of alternate reality

When James Cameron began making “Avatar,” in the mid-two-thousands, he announced that he would replace traditional green screens with a new technology that he called a “virtual camera.”

The crew called the technique “virtual production.”

The term “virtual production” is now used for a number of filmmaking techniques. The most prominent application is as an alternative to a green screen

This year, N.Y.U. will begin offering a master’s in virtual production, at a new facility funded by George Lucas and named in honor of Martin Scorsese

Game engines can also be used for previsualization, including virtual scouting, which relies on 3-D mockups of sets. The virtual models are often created before the sets are built. “Everything that was happening in Barbie Land we technically had a real-time version of, for scouting,”

Nonetheless, virtual production presents difficulties. It’s hard to establish distance between actors, since a volume can be only so large.

Decisions about lighting, scenery, and visual effects have to be made in advance, rather than in postproduction.

In 2020, Zaha Hadid Architects used Unreal to model a proposed luxury development in Próspera, a controversial private city—and a special economic zone, marketed as a haven for cryptocurrency traders—on an island in Honduras. (Locals oppose it, fearing displacement, surveillance, and infrastructural dependence on a libertarian political project.) (charter city)

Last year, Epic worked with Safdie Architects to create an elaborate model of a completed Habitat 67, Moshe Safdie’s unfinished utopian development in Montreal, which never got the authorizations necessary to realize Safdie’s vision. The brutalist architecture looks gorgeous in the virtual light. (brutalist?) ((2008-01-12) Weder For Everyone A Garden)

Within a decade, Sweeney told me, most smartphones will likely be able to produce high-detail scans. “Everybody in humanity could start contributing to a database of everything in the world,” he told me. “We could have a 3-D map of the entire world, with a relatively high degree of fidelity.”

Sweeney sees the metaverse more as a space for entertainment and socializing, in which games and experiences can be linked on one enormous platform. A person could theoretically go with her friends to the movies, interact with MetaHuman avatars of the film’s actors, drop in on an Eminem concert, then commit an act of ecoterrorism in Próspera, all without changing her mutant skin.

By this point in our conversation, Sweeney had stopped vibrating, and seemed more relaxed. He described the metaverse as an “enhancer”—not a replacement for in-person social experiences but better than hanging out alone

The military has experimented with using games as training tools since the seventies, and has been integral to the development of computer graphics and tactical simulators

In many cases, simulators are less focussed on photorealism and more concerned with physical, mechanical, and even sonic realism.
