
The United States has introduced a new bill aimed at tackling AI deepfakes and protecting original content from being used for AI training. The bill, dubbed the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), has strong support across the US political spectrum. Another Senate bill, the "Take It Down Act", was introduced last month and called for the removal of non-consensual intimate imagery created with AI deepfakes.
In January, AI-generated deepfake nude images of Taylor Swift went viral on X (formerly Twitter), Facebook, and Instagram, sparking a nationwide debate on the harms of AI technology.
Last month, a report from Forbes accused Perplexity AI, an AI-enabled search engine, of stealing its content. New York-based technology magazine Wired followed suit with its own investigation, which found that Perplexity was summarising its articles despite the Robots Exclusion Protocol being in place, trespassing into areas of the website designated as off-limits to search bots.
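The Robots Exclusion Protocol works through a plain-text robots.txt file that tells crawlers which parts of a site they may visit; compliance is voluntary, which is why publishers say it can be ignored. The sketch below, which uses Python's standard robotparser module, shows how a well-behaved crawler would check the file before fetching a page. The site URL and the "ExampleAIBot" user agent are placeholders for illustration, not the actual parties named in the reports.

```python
from urllib import robotparser

# Fetch and parse the site's robots.txt (example.com is a placeholder).
parser = robotparser.RobotFileParser("https://example.com/robots.txt")
parser.read()

# A compliant crawler checks this before downloading a page.
allowed = parser.can_fetch("ExampleAIBot", "https://example.com/articles/some-story")
print("Crawling permitted" if allowed else "Page is off-limits to this bot")
```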
Apart from addressing deepfakes, the COPIED Act will also address the concerns of content creators, journalists, artists, and musicians that AI has been profiting off their work without acknowledgement or fair compensation.
The COPIED Act will create a mechanism for attaching a digital record called "content provenance information" to all content, whether a news article, artistic work, image, or video. Akin to a logbook, this record is intended to support authentication of original works and detection of AI-generated content.
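The bill does not prescribe a specific technical format, but a minimal sketch can illustrate the idea of machine-readable provenance metadata: a fingerprint of the content plus details of who created it and how. The field names and the make_provenance_record function below are hypothetical, chosen only to show how such a "logbook" entry could be generated and later verified.

```python
import hashlib
import json
from datetime import datetime, timezone

def make_provenance_record(content: bytes, creator: str, tool: str) -> dict:
    # Illustrative provenance "logbook" entry; the COPIED Act does not
    # mandate this exact schema.
    return {
        "content_sha256": hashlib.sha256(content).hexdigest(),  # fingerprint of the work
        "creator": creator,                                     # who originated it
        "created_at": datetime.now(timezone.utc).isoformat(),   # when it was produced
        "generator": tool,                                      # e.g. camera, editor, or AI model
    }

article = "Example news article text.".encode("utf-8")
record = make_provenance_record(article, creator="Jane Reporter", tool="human-authored")
print(json.dumps(record, indent=2))

# Verification: recompute the hash and compare it with the logged fingerprint.
assert hashlib.sha256(article).hexdigest() == record["content_sha256"]
```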