This isn’t the first time researchers have suspected that ElevenLabs tools were used for political propaganda. Last September, NewsGuard, an organization that tracks online misinformation, claimed that TikTok accounts sharing conspiracy theories with AI-generated voices, including a clone of Barack Obama’s voice, were using ElevenLabs’ technology. “More than 99 percent of users on our platform create interesting, innovative and useful content,” ElevenLabs said in an emailed statement to The New York Times at the time, “but we recognize that instances of abuse exist, and we have continually developed and released safeguards to curb them.”
If Pindrop and Berkeley’s analyses are correct, the deepfake Biden robocall was created using technology from one of the most prominent and well-funded AI voice startups in the tech industry. As Farid notes, ElevenLabs is already regarded as one of the highest-quality synthetic voice options on the market.
According to the company’s CEO in a recent Bloomberg article, ElevenLabs is valued by investors at more than $1.1 billion. In addition to Andreessen Horowitz, its backers include prominent figures such as Nat Friedman, former CEO of GitHub, and Mustafa Suleyman, co-founder of the AI lab DeepMind, now part of Alphabet. Investors also include firms such as Sequoia Capital and SV Angel.
With its generous funding, ElevenLabs is arguably better positioned than other AI startups to put resources into building effective safeguards against bad actors – a task made all the more urgent by the upcoming presidential election in the United States. “Having the right safeguards in place is important because otherwise anyone can create a likeness of a person,” says Balasubramaniyan. “As we approach an election cycle, things are just getting crazy.”
On a Discord server for ElevenLabs enthusiasts, people are discussing how they want to clone Biden’s voice, and sharing links to videos and social media posts highlighting deepfaked content featuring Biden or AI-generated imitations of the voices of Donald Trump and Barack Obama.
Although ElevenLabs is the market leader in AI voice cloning, in just a few years the technology has become widely available for companies and individuals to experiment with. That has created new business opportunities, such as making audiobooks cheaper to produce, but it also increases the prospect of malicious use. “We have a real problem,” said Sam Gregory, program director at the nonprofit Witness, which helps people use technology to advance human rights. “When you have these very widely available tools, it’s quite difficult to police.”
While the Pindrop and Berkeley analyses suggest it may be possible to unmask the source of AI-generated robocalls, the incident also underlines how ill-prepared authorities, the tech industry, and the public are as the 2024 election season approaches. For people without specialist expertise, it’s difficult to verify the origin of audio clips or determine whether they were generated by AI. And more advanced analyses may not be completed fast enough to offset the damage caused by AI-generated propaganda.
“Journalists, election officials and others don’t have access to reliable tools to do this quickly when potentially election-altering audio is leaked or shared,” Gregory said. “If this had been something that was relevant on Election Day, it would have been too late.”
Updated 1/27/2024 3:15 PM EST: This article has been updated to clarify the attribution of ElevenLabs’ statement.
Updated 1/26/2024 7:20 PM EST: This article has been updated with comments from ElevenLabs.