YouTube has officially launched its likeness-detection technology for creators in the YouTube Partner Program, following a successful pilot phase. The new tool enables eligible creators to detect AI-generated videos that imitate their face or voice without consent and request their removal. The technology is aimed at protecting creators from the misuse of their likeness in content that could falsely endorse products, spread misinformation, or harm reputations.
According to a YouTube spokesperson, this is the first wave of the global rollout, and creators received notification emails on Tuesday morning.
On YouTube’s Creator Insider channel, the company outlined how the tool works. To begin, creators can access the new “Likeness” tab, agree to data processing terms, and scan a QR code using their smartphone. The code leads to an identity verification page where users must provide a photo ID and a short selfie video.
Once verified, creators gain access to a dashboard displaying all detected videos that feature their likeness. From there, they can submit a removal request under YouTube’s privacy policies, file a copyright complaint, or archive the video for record-keeping purposes.
In April, YouTube also voiced support for the proposed NO FAKES Act, legislation designed to curb the creation and distribution of deceptive AI-generated replicas that imitate individuals for malicious or misleading purposes.