Scammers can use AI deep fakes to impersonate loved ones, get money

Friday, November 10, 2023
Think you could tell the difference between your loved one and a bot?

AI deep fakes are getting better by the day, and all the information we share online is helping scammers. They can use the technology to impersonate loved ones.
"Deep fakes are critical. Even with limited tech knowledge, people can use AI to replicate someone else for a variety of reasons," Los Angeles Assistant FBI Director Don Alway said.

Scammers can use fake voices that sound just like friends and family in trouble. They use the deep fakes to ask for money. And the more information they have about you, the easier it is.

"Many of us have our voices on the internet and it is easy to replicate, so the only way to verify is to hang up and call that family member back and verify their status and get law enforcement involved immediately," Alway said.

Another safety step: create a password with loved ones that you can share if a scammer strikes.

But, most importantly, limit the information you are sharing online, especially if you are heading out of town.
"It is like putting a sign in front of your house saying, 'We are not home.' We would not do that in the real world, yet we are comfortable doing it in the social media space," Alway said.

Another fear for law enforcement: AI can write its own computer code, which can be used to send fake emails. It is a big worry that has experts warning the public to be careful about what they click on.

Copyright © 2024 KABC Television, LLC. All rights reserved.