Scientists have designed a way to save our brains from fake AI videos
Source: www.fastcompany.com
Visual truth is going down in flames, thanks to new generative AI models that produce synthetic media indistinguishable from reality. But a team of university researchers has devised a hardware fix that just might save us. Engineers at ETH Zurich have built a working prototype of a camera that physically stamps a cryptographic seal of authenticity onto every photo or video right at the image sensor, the electronic chip that captures each photon from the actual world. “Trust in digital content is eroding. We wanted to create a technology that gives people a way to verify whether something is genuine,” co-developer Felix Franke explained in a press release.

This new hardware architecture fundamentally changes how we authenticate media. Right now, the tech industry relies on a standard called C2PA—Coalition for Content Provenance and Authenticity—which is already available on some devices, such as high-end cameras from Leica, Nikon, Fuji, and Sony’s Alpha
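To make the idea concrete, here is a minimal, hypothetical sketch of sensor-level sealing. The article does not describe ETH Zurich's actual design, and real provenance systems such as C2PA use asymmetric signatures backed by certificate chains; this stdlib-only demo substitutes an HMAC keyed with a secret assumed to live inside the sensor, just to show the principle: the seal binds a hash of the raw pixels, so any later edit breaks verification.

```python
# Hypothetical sketch of sensor-level media sealing (not ETH Zurich's design).
# Real systems like C2PA use asymmetric signatures; HMAC stands in here so the
# example needs only the Python standard library.
import hashlib
import hmac
import json

SENSOR_KEY = b"demo-secret-baked-into-sensor"  # illustrative placeholder


def seal_capture(pixel_bytes: bytes, metadata: dict) -> dict:
    """Hash the raw capture and seal the hash plus metadata at 'the sensor'."""
    digest = hashlib.sha256(pixel_bytes).hexdigest()
    payload = json.dumps({"image_sha256": digest, **metadata}, sort_keys=True)
    tag = hmac.new(SENSOR_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "seal": tag}


def verify_capture(pixel_bytes: bytes, record: dict) -> bool:
    """Check the seal, then check the pixels still match the sealed hash."""
    expected = hmac.new(SENSOR_KEY, record["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, record["seal"]):
        return False
    digest = hashlib.sha256(pixel_bytes).hexdigest()
    return json.loads(record["payload"])["image_sha256"] == digest


frame = b"\x10\x20\x30" * 1000              # stand-in for raw sensor output
record = seal_capture(frame, {"device": "demo-cam"})
print(verify_capture(frame, record))            # True: untouched capture
print(verify_capture(frame + b"\x00", record))  # False: one altered byte
```

The key design point the article highlights is *where* this happens: because the seal is applied inside the capture chip itself, there is no window between photon and file in which synthetic pixels could be substituted and still pass verification.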