
AI becoming a handy tool for US fraudsters

By BELINDA ROBINSON in New York | China Daily Global | Updated: 2023-07-28 07:02

Technology being employed to clone people's voice for ransom, govt warns

People in the United States are being warned to stay vigilant against a growing number of scams in which artificial intelligence mimics a person's voice on a phone call to a relative or friend, who is then asked to send ransom money.

The Federal Trade Commission, or FTC, issued the consumer warning alert this year after an increase in the number of people reporting they had been asked to send money after receiving a frantic phone call from a person who they believed was their loved one but was in fact a cloned voice using AI.

Jennifer DeStefano from Scottsdale, Arizona, experienced the crime firsthand. She told a US Senate judiciary hearing last month that she got a call from an unlisted number in April, and when she picked up, she could hear her daughter, Briana, crying.

"Mom! I messed up," her daughter said sobbing on the phone call.

DeStefano asked her daughter, "OK, what happened?"

She then heard a man's voice on the phone telling her daughter to "lay down and put your head back".

He then told the worried mother: "Listen here, I have your daughter. You tell anyone, you call the cops, I am going to pump her stomach so full of drugs."

DeStefano was at her other daughter Aubrey's dance rehearsal when she picked up the phone. She put the phone on mute and asked nearby parents to call 911.

The scammer first asked her to send $1 million, but when she said she did not have access to that much money, he asked for $50,000 in cash and arranged a meet-up spot.

The terrified mother said the man on the phone told her that "if I didn't have all the money, then we were both going to be dead".

However, she contacted her husband and daughter and found out Briana was safe, and it was a hoax.

Cybercrimes on rise

Last year, frauds and scams rose 30 percent compared with the previous year, the FTC said. Cybercrimes are also increasing with losses of $10.2 billion last year, the FBI said.

Scammers use AI to mimic a person's voice by obtaining "a short audio clip of your family member's voice from content posted online and a voice-cloning program", the consumer protection watchdog said. When they call, they will sound just like the person's loved one.

In another scam, a Canadian couple was duped out of C$21,000 ($15,940) after listening to an AI-generated voice that they believed was their son's, The Washington Post reported in March.

According to a recent poll by McAfee, an antivirus software company in San Jose, California, at least 77 percent of AI scam victims have sent money to fraudsters.

Of those who reported losing money, 36 percent said they had lost between $500 and $3,000, while 7 percent got taken for anywhere between $5,000 and $15,000, McAfee said.

About 45 percent of the 7,000 people polled from nine countries (Australia, Brazil, France, Germany, India, Japan, Mexico, the United Kingdom and the US) said they would reply and send money to a friend or loved one who had asked for financial help via a voicemail or note.

Forty-eight percent said they would respond quickly if they heard that a friend was in a car accident or had trouble with their vehicle.

Although phone scams are nothing new worldwide, in this AI version, fraudsters are getting the money sent to them in a variety of ways, including wire transfers, gift cards and cryptocurrency.

Consumers are being encouraged to contact the person that they think is calling to check if they are OK before ever sending cash.

FTC Chair Lina Khan warned House lawmakers in April that fraud and scams were being "turbocharged" by AI and were of "serious concern".

Avi Greengart, president and lead analyst at Techsponential, a technology analysis and market research company in the US, told China Daily: "I think that it is hard for us to estimate exactly how pervasive (AI) is likely to be because this is still relatively new technology. Laws should regulate AI."

The software to clone voices is becoming cheaper and more widely available, experts say.

AI speech software ElevenLabs allows users to convert text into voice-overs meant for social media and videos, but many users have already shown how it can be misused to mimic the voices of celebrities, such as actress Emma Watson, podcast host Joe Rogan and columnist and author Ben Shapiro.

Other videos mimicking the voices of US President Joe Biden and former president Donald Trump have also appeared on platforms such as Instagram.
