Topic: [Warning]: AI voice replicating scams

legendary
Activity: 1890
Merit: 1537
May 03, 2023, 07:01:14 PM
#22
So scammers, through AI, have found a way to clone and mimic our loved ones' voices. Then we receive these calls claiming some form of emergency and a need for money, or even bitcoin at some point. You panic and send the money right away without hesitation, and only after the initial shock do you realize you have already been scammed.
It is really scary. Artificial intelligence is a double-edged technology: it can be used for good and for the benefit of humanity, and it can be used by criminal groups for evil acts and scams, cloning voices and mimicking faces, handwriting, and conversations in ways that a large group of people unfamiliar with such modern technology will believe, so they can be deceived quite simply. Before using an AI voice, though, the scammer collects fairly extensive information and data about the targeted victim: the people close to the victim, sensitive information, phone numbers, names, accounts, and so on, which ensures his scam will not be exposed and will be believed without the slightest doubt. So if a person stays anonymous on the Internet and maintains their privacy, there is no doubt this scam can be detected in minutes.
legendary
Activity: 3080
Merit: 1353
Question: Where do people think the criminals are getting the initial voice sound/wavelengths from in order to be able to replicate it with AI?

My hunch: Big Tech/Social media.

However this is no more than a hunch.


The scamming call centers can now start selling those audio recordings along with personal data.

As far as I know, phone numbers are the only data that scammers can get out of call centers; I'm not sure they recorded voices. But in any case, this is the first time an AI voice-replicating scam has been reported or put in front of the news. Usually it's just someone's face being used, with another person's voice inserted.

But this is the other way around: the victims recognize and know the real person, and it's the voice that gets copied. So when you hear it, your first impulse is to help right away.
hero member
Activity: 3024
Merit: 614
Leading Crypto Sports Betting & Casino Platform
It's being developed to replace you, me, and the rest of the humans who demand wages, pensions, benefits, and time off. Robots and AI don't need anything. One engineer who can operate it and give it commands suffices.
I have this feeling we are heading toward a Matrix or Terminator scenario where we all depend on machines and AI, because engineers are such geniuses that they want businesses and industries to make and create things in the easiest possible way.
Going back to these AI voice-replicating scams: they could scam a lot of people, because hearing is one of the senses that is most easily deceived.
People should be aware of this. They can start by advising their loved ones how to verify a call or what happened; we have modern phones now, so they can request a video call to verify.
legendary
Activity: 2730
Merit: 7065
The AI technology has gone too far, and evil-minded people are using it to scam people. I wonder if one day these scammers may use deepfake videos to scam people.
YouTube already has plenty of these. It's going to become a problem if they figure out how to fake a person in a live video call. As long as you can call the person and have them show their real face, it's still OK.

Also, when you get a threatening call, these people won't give you a chance to hang up and confirm things. They create such a panic that you feel that if you do not react quickly and agree to their demands, you might end up losing the life of your near and dear ones. So these people will demand an instant transfer of money.
This isn't such a situation. This particular scam is not about hostages and demanding money to save the life of a family member. The scammers pretend they are the family member who is in urgent need of money. And that can luckily be verified in a number of ways, like I explained above.

Question: Where do people think the criminals are getting the initial voice sound/wavelengths from in order to be able to replicate it with AI?

My hunch: Big Tech/Social media.
Social media, of course. People post everything about their lives on Facebook, Instagram, and the rest. I don't think that will change in the future, which makes it even easier to get someone's voice and looks for free.

AI is created and being developed to help many industries to ease the job of their workers and to help industries to reach their goal of maximizing their potential...
It's being developed to replace you, me, and the rest of the humans who demand wages, pensions, benefits, and time off. Robots and AI don't need anything. One engineer who can operate it and give it commands suffices.
full member
Activity: 2324
Merit: 175
AI is created and developed to help many industries ease the jobs of their workers and reach their goal of maximizing their potential, but we know scammers are innovative; for them, every tool or application can be used to scam people, and so we have these voice-replicating scams.
Governments and authorities should come out with warnings and advisories on the existence of this scheme.
hero member
Activity: 2632
Merit: 833
Question: Where do people think the criminals are getting the initial voice sound/wavelengths from in order to be able to replicate it with AI?

My hunch: Big Tech/Social media.

However this is no more than a hunch.

Most likely. In today's world everyone is caught up in the social media hype, and whether we like it or not, most of the time we leak this kind of voice data ourselves. Before, we could only see this in the movies, like Mission: Impossible, where Tom Cruise could mimic not just the face but the voice itself. So it's a very dangerous world that we live in, because we don't know if we are going to be the next victim of this type of scam. With that, just like the rest of us say in the crypto market: trust no one.
member
Activity: 1103
Merit: 76
Question: Where do people think the criminals are getting the initial voice sound/wavelengths from in order to be able to replicate it with AI?

My hunch: Big Tech/Social media.

However this is no more than a hunch.


The scamming call centers can now start selling those audio recordings along with personal data.
legendary
Activity: 3416
Merit: 1225
Question: Where do people think the criminals are getting the initial voice sound/wavelengths from in order to be able to replicate it with AI?

My hunch: Big Tech/Social media.

However this is no more than a hunch.



I'm sure it's social media like Facebook, YouTube, and TikTok. You can upload videos on these platforms, and if a scammer knows you or lives around you, they can spy on you and make you a target.

Scammers are very innovative; they use every existing tool to scam people. Now that we have AI, they will find a way to use it too, because it's easy money, and the majority are not well informed or do not know how to react in such scenarios.
legendary
Activity: 1666
Merit: 1037
Question: Where do people think the criminals are getting the initial voice sound/wavelengths from in order to be able to replicate it with AI?

My hunch: Big Tech/Social media.

However this is no more than a hunch.

legendary
Activity: 2436
Merit: 1104
I've seen several videos on YouTube of people using voice-mimicking AI to make a singer's voice sing a song. It was fun to hear, but after this thread it is actually quite concerning. If this became more rampant and people became cautious, it could affect actual loved ones who are in dire need of help and call their family from an unknown number.

member
Activity: 1103
Merit: 76
This is the first time I've heard about this type of voice scam. I wonder how much audio the AI would need to create a fake voice recording?
You only need minutes of clear conversation.
The software is affordably priced, which makes it lucrative for scumbags: they only need to pay $50-150, and if they can scam people out of thousands of bucks, then it is a good investment for them.
hero member
Activity: 2870
Merit: 594
This is the first time I've heard about this type of voice scam. I wonder how much audio the AI would need to create a fake voice recording?
Still, unless you are an older citizen, this shouldn't be something you should worry about. If someone calls you pretending to be a family member, hang up and call them to verify what's going on. If you are still in doubt, ask them questions only those close to you would know. It can be anything you two have witnessed, lived through, spoken about, etc.
A scammer won't know those answers.

The AI technology has gone too far, and evil-minded people are using it to scam people. I wonder if one day these scammers may use deepfake videos to scam people.
We've already seen deepfake videos, with Elon Musk's face and voice. And I read that AI was used to create a fake CEO face, or something to that effect, to scam investors out of their hard-earned money.

Also, when you get a threatening call, these people won't give you a chance to hang up and confirm things. They create such a panic that you feel that if you do not react quickly and agree to their demands, you might end up losing the life of your near and dear ones. So these people will demand an instant transfer of money.

Anyway, the only way to be safe from them is to keep your nerve; those who know about this scam will surely act in a more sensible way.
I agree, we might be overwhelmed in the beginning if we receive such calls, but we should confirm everything first and talk to the people around us to calm things down and see if it is real or not. Otherwise, if we react on instinct, we might give in to these scammers and send our money in haste, or even fall for the crypto demand as well.
hero member
Activity: 2926
Merit: 567
I just knew something like this could happen. AI is being used to scam people because AI is smart and can imitate, so scammers can use it as a platform. The scammers have to do a background check first. This scheme is very old; I have read and heard it hundreds of times on TV news, but they usually targeted house helpers.
I would verify whether the call is coming from my relative's phone, or check the hospitals. If you're not good at tracing, or it's your first time encountering this, you will likely fall for it.
legendary
Activity: 3136
Merit: 1172
Leading Crypto Sports Betting & Casino Platform
May 02, 2023, 03:02:58 PM
#9
This is the first time I've heard about this type of voice scam. I wonder how much audio the AI would need to create a fake voice recording?
Still, unless you are an older citizen, this shouldn't be something you should worry about. If someone calls you pretending to be a family member, hang up and call them to verify what's going on. If you are still in doubt, ask them questions only those close to you would know. It can be anything you two have witnessed, lived through, spoken about, etc.
A scammer won't know those answers.

The AI technology has gone too far, and evil-minded people are using it to scam people. I wonder if one day these scammers may use deepfake videos to scam people.

Also, when you get a threatening call, these people won't give you a chance to hang up and confirm things. They create such a panic that you feel that if you do not react quickly and agree to their demands, you might end up losing the life of your near and dear ones. So these people will demand an instant transfer of money.

Anyway, the only way to be safe from them is to keep your nerve; those who know about this scam will surely act in a more sensible way.
legendary
Activity: 2730
Merit: 7065
May 02, 2023, 01:55:02 PM
#8
This is the first time I've heard about this type of voice scam. I wonder how much audio the AI would need to create a fake voice recording?
Still, unless you are an older citizen, this shouldn't be something you should worry about. If someone calls you pretending to be a family member, hang up and call them to verify what's going on. If you are still in doubt, ask them questions only those close to you would know. It can be anything you two have witnessed, lived through, spoken about, etc.
A scammer won't know those answers.
hero member
Activity: 1876
Merit: 721
Top Crypto Casino
May 02, 2023, 12:14:00 PM
#7
This is a disturbing development. It's always important to verify the identity of the person on the other end of the line before sending any money or personal information.
Normal people are not aware of such traps and end up sharing their personal information with strangers. As a result, scammers use such developments to steal people's important information. Those who have some knowledge about AI voices can more easily recognize when someone is trying to scam them with one. So it can be expected that scammers will continue trying to scam people using such AI voices.
jr. member
Activity: 53
Merit: 2
May 02, 2023, 12:03:19 PM
#6
There are a few scams, e.g. "OTC trades" conducted via Discord, where impersonation of a trusted party is already a topic even without AI. The easy solution is to reach back out through a separate channel any time anyone asks for money or tells you to send money somewhere, to check for impersonation. If the scammers use AI to impersonate the other person, this can also be dealt with to a degree by asking a question from the past that only the real counterpart would answer.
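The "question only the real counterpart would answer" idea is essentially a shared-secret challenge-response. Below is a minimal Python sketch of that general pattern, purely for illustration; the post doesn't prescribe any scheme, and the names (`make_challenge`, `SHARED_SECRET`, etc.) are my own invention:

```python
import hashlib
import hmac
import secrets

# Agreed in person, never shared online.
SHARED_SECRET = b"family passphrase agreed offline"

def make_challenge() -> bytes:
    """Random nonce the callee gives to the caller."""
    return secrets.token_bytes(16)

def respond(secret: bytes, challenge: bytes) -> str:
    """Caller proves identity by HMAC'ing the challenge with the secret."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(secret: bytes, challenge: bytes, response: str) -> bool:
    """Constant-time comparison so nothing leaks through timing."""
    return hmac.compare_digest(respond(secret, challenge), response)

challenge = make_challenge()
response = respond(SHARED_SECRET, challenge)
assert verify(SHARED_SECRET, challenge, response)        # real caller passes
assert not verify(b"wrong secret", challenge, response)  # impostor fails
```

In real life a spoken family password works the same way: a cloned voice can only reproduce audio it has heard, not a secret it was never told.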
jr. member
Activity: 47
Merit: 1
May 02, 2023, 06:05:43 AM
#5
This is a disturbing development. It's always important to verify the identity of the person on the other end of the line before sending any money or personal information.
Ucy
sr. member
Activity: 2674
Merit: 403
Compare rates on different exchanges & swap.
May 02, 2023, 03:17:49 AM
#4
It can also mimic faces, writing patterns, handwriting, etc., and use them in live conversations. This is part of the reason people are advised to mask their voice, face, and writing pattern during live conversations in trustless environments. Whenever they decide to communicate with loved ones under their real identity over the Internet or in a trustless environment, the data should be produced and encrypted offline before it's sent to the loved ones, who must be the only ones with the decryption keys... The decryption and reading of the data/information should be offline too.
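As a rough illustration of the "encrypt offline, loved ones hold the only keys" idea, here is a toy one-time-pad sketch in Python. This is my own simplification, not anything the post specifies; a real setup would use an audited cryptography library, and the hard part is exchanging and protecting the keys offline:

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    """XOR each byte with a pad of at least equal length (one-time pad)."""
    assert len(pad) >= len(plaintext), "pad must cover the whole message"
    return bytes(p ^ k for p, k in zip(plaintext, pad))

# Decryption is the same XOR operation.
otp_decrypt = otp_encrypt

# Pad generated and exchanged offline, used exactly once, then destroyed.
message = b"It's really me - call me back on the usual number."
pad = secrets.token_bytes(len(message))

ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message  # round-trips with the right pad
```

The design point matches the post: anyone intercepting the ciphertext online learns nothing, because the only copy of the pad never touched the network.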
sr. member
Activity: 602
Merit: 442
I buy all valid country Gift cards swiftly.
May 02, 2023, 12:00:58 AM
#3
I recently came across a cloned video call and, meh, I was really impressed. I think this AI is beginning to favor scammers and criminals more than its main purpose.
This is a wake-up call for everyone to always stay very alert and not fall for every cheap scam; I'm sure these scammers are truly not relenting or backing off soon, as they are always up to date with new scams.
I always encourage people to use Truecaller features and applications on their phones, as this will also help detect some of these scam calls.
legendary
Activity: 1372
Merit: 2017
May 01, 2023, 11:42:03 PM
#2
Yes, we will have to be careful with this. Scams in most cases rely on taking advantage of a mistake and/or large numbers: they are tried on many people, and statistically some fall for them in the end. If we add this technological advance, I think it can give scammers a boost through the appearance of reality, until the time comes when the general population is aware of this and stays alert.

And the bad thing is that this is only one of the ways AI can help scammers; I guess there will be more. But I also wonder whether AI could help us defend ourselves against this kind of scam, by detecting them or in some other way.
hero member
Activity: 1414
Merit: 542
May 01, 2023, 10:54:28 PM
#1
Ok, it seems that criminals have found their way to scamming people using the latest technology we have right now, which is AI. This is not a new modus though, but this time they have taken advantage of AI plus human emotions to pull this one off:

Quote
You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble — he wrecked the car and landed in jail. But you can help by sending money. You take a deep breath and think. You've heard about grandparent scams. But darn, it sounds just like him. How could it be a scam? Voice cloning, that's how.

https://consumer.ftc.gov/consumer-alerts/2023/03/scammers-use-ai-enhance-their-family-emergency-schemes

So scammers, through AI, have found a way to clone and mimic our loved ones' voices. Then we receive these calls claiming some form of emergency and a need for money, or even bitcoin at some point. You panic and send the money right away without hesitation, and only after the initial shock do you realize you have already been scammed.


AI can replicate voices in high-tech phone call scams, FTC warns

So everyone, just be careful out there; AI is evolving, and so are the criminals, with elaborate modus operandi to get money or crypto from us.