YouTube Expands Its AI Fake Detection to Top Creators

On Wednesday, YouTube announced the expansion of its pilot program aimed at identifying and managing AI-generated content that mimics the likeness, including the face, of creators, artists, and other prominent figures.
The company also expressed its support for the NO FAKES Act, legislation targeting AI-generated replicas that simulate someone’s image or voice to mislead others or produce harmful content.
YouTube collaborated on the bill with its sponsors, Senators Chris Coons (D-DE) and Marsha Blackburn (R-TN), as well as industry groups like the Recording Industry Association of America (RIAA) and the Motion Picture Association (MPA). Coons and Blackburn are set to reintroduce the legislation at a press conference on Wednesday.
YouTube Acknowledges AI’s Potential and Risks in Creative Expression
In a blog post, YouTube reiterated its support for the legislation, acknowledging that while AI has the potential to “revolutionize creative expression,” it also poses significant risks.
“We recognize the risks associated with AI-generated content, including its potential misuse and the creation of harmful material. Platforms must take responsibility and address these challenges proactively,” YouTube stated in the post.
“The NO FAKES Act offers a smart approach by balancing protection with innovation, giving individuals the power to notify platforms about AI-generated likenesses they believe should be removed. This notification process is crucial as it helps platforms differentiate between authorized content and harmful fakes — without it, platforms can’t make informed decisions,” the company added.
YouTube launched its likeness detection system in collaboration with the Creative Artists Agency (CAA) in December 2024.
Likeness Detection Technology Enhances YouTube’s Content ID System
The new technology builds upon YouTube’s existing Content ID system, which detects copyrighted material in user-uploaded videos. Like Content ID, the likeness detection system aims to automatically identify violating content — in this case, AI-generated faces or voices, YouTube explained earlier this year.
For the first time, YouTube is sharing the names of the program’s initial pilot testers, which include top creators like MrBeast, Mark Rober, Doctor Mike, the Flow Podcast, Marques Brownlee, and Estude Matemática.
During the testing phase, YouTube will work with these creators to refine and scale the technology. The program is set to expand to more creators over the coming year, although YouTube did not specify when the likeness detection system will be more widely available.
In addition to the pilot program, YouTube has also updated its privacy processes, allowing individuals to request the removal of altered or synthetic content that simulates their likeness. It has also introduced likeness management tools that help users detect and control how AI depicts them on the platform.
Read the original article on TechCrunch.