
Adobe Content Authenticity Initiative (CAI)

Product
Developers: Adobe Systems
Date of the premiere of the system: June 2022
Branches: Information Technology

2022: Release of open-source software to distinguish fake photos from real ones

In June 2022, Adobe released open-source software designed to distinguish fake photos from real ones. Through this project, the Content Authenticity Initiative (CAI), the company aims to counter the spread of visual misinformation online by attaching metadata about the origin of images. This helps not only to distinguish real photos from fakes but also to confirm content authorship.

The Content Authenticity Initiative was first announced in 2019; since then, the company has released technical documentation on the technology, integrated the system into its own software, and actively promoted it among partner companies. Adobe has now announced a three-part open-source toolkit: a JavaScript SDK for displaying content credentials in browsers, a command-line utility, and a Rust SDK for building desktop and mobile applications. These tools let developers create content with an exact link to the author and display data about content creators.


The new standard, called C2PA, records details of a file's history, including how it was created and how it was edited. This metadata is expected to become viewable soon on social networks, in photo editors, and on news sites.

The C2PA standard is the result of collaboration between the CAI and partners such as Microsoft, Sony, Intel, Twitter, Nikon, and well-known international publications. Using a set of software tools, any media platform can embed code into its site that lets anyone view an image's provenance data.[1]

The main goal of the CAI is to combat disinformation on the Internet. Content creators whose works are stolen and offered for sale can also benefit from the new system; with the growth of the NFT market, this has become a serious problem. Companies operating neural networks and similar systems are also reportedly interested in the CAI: integrating metadata into computer-generated images would distinguish them from original human works. Adobe claims that, unlike EXIF, the new metadata is much harder to delete, and even after deletion it can be recovered fairly easily using Adobe services.
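The tamper resistance described above comes from cryptographically binding the provenance claims to the image content, so that stripping or editing either one invalidates the record. The following is a minimal conceptual sketch of that idea, not the actual C2PA implementation: real C2PA manifests use X.509 certificate signatures and a structured binary format, whereas this sketch substitutes an HMAC with a hypothetical key purely for illustration.

```python
import hashlib
import hmac
import json

# Hypothetical signing key for the sketch; real C2PA uses X.509 certificates.
SIGNING_KEY = b"demo-signing-key"

def make_manifest(image_bytes: bytes, creator: str, edits: list) -> dict:
    """Build a provenance manifest whose claims are bound to the image content."""
    claims = {
        "creator": creator,
        "edits": edits,
        # Binding the content hash into the signed claims is what makes the
        # metadata hard to strip or swap without detection.
        "content_hash": hashlib.sha256(image_bytes).hexdigest(),
    }
    payload = json.dumps(claims, sort_keys=True).encode()
    signature = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": signature}

def verify_manifest(image_bytes: bytes, manifest: dict) -> bool:
    """Check that the claims are unaltered and still match the image bytes."""
    payload = json.dumps(manifest["claims"], sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, manifest["signature"]):
        return False  # claims were edited after signing
    return manifest["claims"]["content_hash"] == hashlib.sha256(image_bytes).hexdigest()

image = b"\x89PNG...raw image bytes..."
manifest = make_manifest(image, "Jane Photographer", ["crop", "color-balance"])

print(verify_manifest(image, manifest))         # True: image and claims intact
print(verify_manifest(image + b"x", manifest))  # False: pixels changed
```

By contrast, plain EXIF fields carry no such binding: they can be rewritten or stripped without leaving any detectable trace in the image itself.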

The company believes that people with bad intentions will always find a way to deceive others, but ordinary users will finally get more information about the origin of images.[2]

Notes