Categories: GENERAL

Listen to this ‘Biden’ call sent to voters. No wonder the FCC is cracking down on AI robocalls.

Some voters in New Hampshire received a call from someone who sounded a lot like President Joe Biden. The call encouraged those New Hampshire residents to stay home during the primary election last week and “save your votes” for the general election in November.

Of course, that makes no sense. Voters can vote in both elections. Why would Biden tell them such a thing? Well, that’s because he didn’t. These were AI voice-generated robocalls created to sound like Biden. You can listen to one here, courtesy of The Telegraph:

This is just one real-world example of how AI can already be weaponized by bad actors. And it’s likely a big reason the FCC now wants to take action against AI-generated calls.

FCC’s proposal to outlaw AI robocalls

FCC Chairwoman Jessica Rosenworcel released a statement on Wednesday announcing a proposal that the FCC recognize calls generated by artificial intelligence as “artificial” voices under the Telephone Consumer Protection Act (TCPA). By doing this, the FCC would make AI-generated robocalls illegal.

The FCC often uses the TCPA to limit the junk calls consumers receive from telemarketers. The law prohibits the use of artificial or prerecorded voice messages, as well as automatic telephone dialing systems.

“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” Rosenworcel said in a statement. The statement continues:

“No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls. That’s why the FCC is taking steps to recognize this emerging technology as illegal under existing law, giving our partners at State Attorneys General offices across the country new tools they can use to crack down on these scams and protect consumers.”

The timing of Rosenworcel’s statement suggests that the Biden robocalls have sparked concern over how AI-generated voices can be used in telemarketing scams as well as potential election fraud.

As of now, the only real steps to prevent worst-case scenarios caused by AI-generated voices have been taken by the AI companies themselves. As Bloomberg reported, the AI company ElevenLabs suspended the user who created the Biden robocalls from its platform last week.

“We are dedicated to preventing the misuse of audio AI tools and take any incidents of misuse extremely seriously,” ElevenLabs said in a statement.

However, as we’ve seen with the recent nonconsensual AI-generated pornographic images of Taylor Swift, there are those in the space who may not feel the same as ElevenLabs when it comes to usage of AI products.
