Ok, it seems that criminals have found a way to scam people using the latest technology we have right now, which is AI. This is not a new modus, but this time they have taken advantage of AI plus human emotions to pull this one off:
You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble — he wrecked the car and landed in jail. But you can help by sending money. You take a deep breath and think. You've heard about grandparent scams. But darn, it sounds just like him. How could it be a scam? Voice cloning, that's how.
https://consumer.ftc.gov/consumer-alerts/2023/03/scammers-use-ai-enhance-their-family-emergency-schemes

So scammers, through AI, have found a way to clone and mimic our loved ones' voices. Then we receive these calls claiming they are in some kind of emergency and need money, or even bitcoin at some point. And so you panic and send them the money right away without hesitation. Only after that initial shock do you realize that you have already been scammed.
AI can replicate voices in high-tech phone call scams, FTC warns
So everyone, just be careful out there, as AI is evolving and so are the criminals, with ever more elaborate modus operandi to get money or crypto from us.