Pushing back on misinformation with provenance

- Arthur Dash

BitDepth#1427

Mark Lyndersay

SANTIAGO LYON, a former vice-president at the Associated Press, noted that Adobe's Content Authenticity Initiative (CAI) began in 2019 as an open-source initiative to address distrust of news, particularly of images and video.

It isn't Adobe's first effort at open-sourcing its technologies. The portable document format (PDF) and the digital negative format (DNG) both sought to bring open access to document and raw digital-image formats respectively.

While PDFs are widely used and there is some adoption of DNG, most recently as a preferred format for smartphone raw files, the planned CinemaDNG format crashed ignominiously and Adobe doesn’t talk about it any more.

"We identified four main areas, starting with detection, which involves uploading suspect files to programs that look for telltale signs of manipulation," said Lyon, who is now head of Advocacy and Education for CAI, at a September 19 webinar.

"We deliberately decided not to get involved in the detection game, in part because it's not scalable. It's also not particularly accurate and ends up as an arms race, with bad actors staying one step ahead of the latest detection software. “We chose to focus on three other areas: policy, which involves briefing policymakers and lawmakers around the world to make sure that they're up to speed with the latest technologies.

"(We also offer) advice and education that's focused on media literacy and consumer literacy, but what we're really focused on is the notion of provenance. (CAI wants to make the) origins of digital content, whether images, video, audio recordings or other file formats, transparent, showing the viewer where and how the files were created, what changes might have been made to them along their journey from creation through editing to publication."

In the face of generative AI imagery and AI-powered alteration tools that make falsification easy for novice users, journalism faces real challenges in ensuring that readers and viewers can trust what they see as a representation of the truth.

Despite concerns about image manipulation and video deepfakes, which falsify motion clips, Ariel Bogle, investigations reporter at Guardian Australia, warns of the greater pervasiveness of cheap fakes.

"This is content that is real video that's been filmed at some point in time that now has been removed from its context. Maybe it's been slightly edited, slowed down, pauses on someone's face when they have a strange expression to suggest that they are evil or lying to the public.

"Cheap fakes end up recontextualising imagery, using basic effects which most people can access and use, speeding up, slowing down, adding colour filters to give a dingy or sinister aspect to videos."

Photojournalist Ron Haviv of the VII Foundation explores how this happens in a documentary about his 30-year-old photo of violence in Bosnia (biographyofaphoto.com), an image that has been decontextualised, misrepresented and used as propaganda. Haviv is using CAI to tag the digital version of the brutal photo.

"Fake News is an insult. In this case, it's an insult to those who die. My documentation of a war crime in Bosnia of well-armed Serbian paramilitaries executing unarmed Muslim civilians is a perfect example of an image that needs to be believed then and also really to be believed now.

"If we, as a community of visual journalists, lose the trust of our audience, they just say like, ‘Oh, that's been Photoshopped,’ the tragedy becomes the stories of these people are not believed.

“It's important to also understand that the images that we're seeing, they're not just made up of pixels, they represent real people.

"When you break that chain, when you break that trust, then we're in great danger."

Getting traction for CAI implementation on the scale required, taking in hardware manufacturers, software programmers, creators and consumers, is a staggering task.

There are also certain classes of users who want to eliminate provenance for legitimate reasons. Journalists in a war zone would prefer not to share files that embed their GPS co-ordinates. A human-rights activist would not necessarily want to be identified with a damning document.
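
By way of a hypothetical illustration of that tension, the Python sketch below uses the Pillow imaging library to re-save only an image's pixels, leaving behind its embedded metadata, GPS co-ordinates included. A provenance-aware workflow would instead redact sensitive fields selectively so the rest of the record remains verifiable; the file names here are placeholders.

```python
# Minimal sketch: copy only the pixel data into a fresh image so that embedded
# metadata (EXIF, including any GPS co-ordinates) is not carried into the copy.
# Note this discards all metadata, not just location; file names are placeholders.
from PIL import Image


def strip_metadata(src_path: str, dst_path: str) -> None:
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst_path)


strip_metadata("field_photo.jpg", "field_photo_clean.jpg")
```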

The end result of implementation would be a CAI icon that the viewer can click to see what has been done to an image since its creation, a feature that's leagues beyond the simple rights-identification exercise that Creative Commons licensing has been trying to bring to the online imaging community, with only marginal success, since 2001.

Lyon notes that the primary markets for the technology are news producers, e-commerce, medical and satellite imagery and the insurance and law enforcement industries.

Mark Lyndersay is the editor of technewstt.com. An expanded version of this column can be found there.
