Listen to this ‘Biden’ call sent to voters. No wonder the FCC is cracking down on AI robocalls.

Some voters in New Hampshire received a call from someone who sounded a lot like President Joe Biden. The call encouraged those New Hampshire residents to stay home during the primary election last week and “save your votes” for the general election in November.

Of course, that makes no sense. Voters can vote in both elections. Why would Biden tell them such a thing? Well, that’s because he didn’t. These were AI voice-generated robocalls created to sound like Biden. You can listen to one here, courtesy of The Telegraph:

This is just one real-world example of how AI can already be weaponized by bad actors. And it’s likely a big reason the FCC now wants to take action against AI-generated calls.

FCC’s proposal to outlaw AI robocalls

FCC Chairwoman Jessica Rosenworcel released a statement on Wednesday announcing a proposal that the FCC recognize calls generated by artificial intelligence as “artificial” voices under the Telephone Consumer Protection Act (TCPA). By doing this, the FCC would make AI-generated robocalls illegal.

The FCC often uses the TCPA to limit junk calls that telemarketers place to consumers. Under this law, the use of artificial or prerecorded voice messages, as well as automatic telephone dialing systems, is prohibited.

“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” Rosenworcel said in a statement. The statement continues:

“No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls. That’s why the FCC is taking steps to recognize this emerging technology as illegal under existing law, giving our partners at State Attorneys General offices across the country new tools they can use to crack down on these scams and protect consumers.”

The timing of Rosenworcel’s statement suggests that the Biden robocalls have heightened concern over how AI-generated voices can be used in telemarketing scams as well as potential election fraud.

As of now, the only real steps to prevent worst-case scenarios involving AI-generated voices have been taken by the AI companies themselves. As Bloomberg reported, the AI company ElevenLabs suspended the user who created the Biden robocalls from its platform last week.

“We are dedicated to preventing the misuse of audio AI tools and take any incidents of misuse extremely seriously,” ElevenLabs said in a statement.

However, as we’ve seen with the recent nonconsensual AI-generated pornographic images of Taylor Swift, others in the space may not share ElevenLabs’ stance on the misuse of AI products.
