Dennis Lim's recent piece in The New York Review of Books on Alexandre Koberidze's Dry Leaf—a three-hour feature shot entirely on a Sony Ericsson W595, a slider phone from 2008 with a maximum resolution of 240 pixels and a choppy fifteen frames per second—has me thinking about how badly games have lost the plot on graphical fidelity. I just started playing last year's Indiana Jones, so I'm being absolutely bludgeoned with Life.

And yet, Koberidze's wager, as Lim describes it, is that blurred, smeared, low-information images won't just be tolerable to eyes raised on 4K and 8K, but actively replenishing. And he's right. A smudge becomes a person, then vanishes, then reappears. Clouds gain serrated edges. The pixel itself—the unwanted video artifact—becomes a compositional element. The lag of digital compression becomes the movie's heartbeat. None of this happens despite the camera's limitations. It happens because of them.

Games have spent the last three decades sprinting in the opposite direction. Every console generation is sold on polygon counts, ray tracing, hair physics, and subsurface scattering on skin. The implicit promise is that emotional resonance scales with fidelity—that if we could just get the eyes wet enough, the moment would land. But anyone who has actually played games knows this isn't true. I can name pixelated games that are emotionally inert and photorealistic ones that are just as inert, and deeply moving examples of both. Fidelity and feeling are independent variables.

What Koberidze's film clarifies is that low fidelity isn't a deficiency to apologize for or a retro affectation to wear ironically. It's a generative condition. Opacity makes the viewer work. When the image refuses to resolve, the mind fills the gap, and that filling-in is where meaning lives. Lim quotes Koberidze: "The more you remove, the more remains for the spectator." That's not a poverty argument. That's a design principle.