Thu. Nov 21st, 2024
Occasional Digest - a story for you

WHEN mum-of-four Jennifer DeStefano picked up the phone last January her blood ran cold as her terrified teen daughter Briana sobbed and screamed for her help.

“Mum, these bad men have me, help me! Help me!” 15-year-old Briana cried out, her panic-stricken mum’s mind going into overdrive as she desperately tried to think how she could save her daughter. 

Jennifer DeStefano was aghast when she seemingly received a call from her daughter Briana (both pictured) screaming for help. Credit: Supplied

However, it would later transpire that it was all completely fake – a cruel and chilling con. 

It wasn’t Jennifer’s daughter at all, but an AI voice clone perfectly mimicking her cries and voice, seemingly as part of an elaborate ploy to scam Jennifer out of tens of thousands of dollars.

It’s a situation that sounds unimaginable, but Jennifer’s experience isn’t just a one-off.

Horrifyingly, recent research by online protection company McAfee reveals almost a quarter of Brits have already experienced an AI voice scam or know someone who has, and most of these victims (78 per cent) lost money as a result.


Researchers also found a voice can be cloned from just three seconds of audio – particularly eye-opening when you consider how much of our – and our families’ – lives are posted on social media.

McAfee Senior Security Researcher Oliver Devane explains: “Scammers are using AI voice cloning technology to dupe parents into handing over money, by replicating their children’s voices via call, voicemail or WhatsApp voice note. 

“In most of the cases we’re hearing of, the scammer says they have been in a bad accident, such as a car crash, and need money urgently.

“Having tested some of the free and paid voice cloning tools online, we found in one instance, that just three seconds of audio was needed to produce a good match. 

“For cybercriminals, this is a more sophisticated example of spearphishing – which is a term used for any type of targeted attack. 

“The cybercriminal is betting on a parent becoming worried about their child, letting emotions take over, and sending money to help.”  

HOW TO STAY SAFE FROM AI SCAMMERS

Here, F-Secure Threat Intelligence Lead Laura Kankaala shares her top tips to spot a scam, and what you need to be aware of to avoid falling victim to a cruel hoax…

  • Any type of content can now be fake: AI tools used by scammers rely on data being available on the web. It takes as little as several seconds of audio to create a convincing AI generated voice note. Similarly, images can easily be taken from social media and altered.  For example, a scammer may use an AI image generator to change the background to a dangerous scenario or add in today’s front page of a newspaper in minutes.
  • Ask a security question or agree a safe word: AI is intelligent but it can’t replicate personal relationships. If suspicious, ask something personal and unique. For example, ‘what’s the name of your first teddy bear?’ or ‘how many bones have you broken and how?’. Avoid asking things like ‘what’s the first line of your address?’ or ‘what is the name of your first pet?’, as scammers can access this type of information, which is often entered online for purchases and account recovery.
  • Agree on a password with family members: It should be noted however that in the case of a real emergency, your child may not remember or say the password out loud, so this might not be the most reliable option.
  • Phone numbers can be spoofed: Even if the call you receive looks like it’s from your child’s phone number, phone numbers can be cloned. This is known as spoofing, and is a common tactic used by scammers as it encourages the recipient to trust the caller. Using a phone number alone as verification is not sufficient.
  • Hang up and tell them you’ll call them back: If it’s a call you’ve received and you suspect it’s not genuinely them, say you’ll hang up and contact them. A phone number can be faked for incoming calls, but not outgoing. So use your phone’s contact app and call your child that way, to confirm it was them.
  • Never give any bank details over the phone: Never share bank details over the phone or via messages/email even in legitimate circumstances, as it increases the risk of someone misusing or accidentally leaking your sensitive information. It’s normal for your child to ask for money, but if any suspicions arise, give them a call and make sure they are okay.
  • Educate and inform: Education is the best form of prevention – by being aware of such scams you can be alert to the warning signs and act accordingly. Share this knowledge with family members – particularly older relatives who may be less technologically savvy. You should also educate your children, even if they’re older and comfortable with technology.

‘She was crying and sobbing’

Jennifer, 44, lives in Arizona, US, with her husband of 22 years and their four children, including Briana, 16. 

She still has no idea how her daughter’s voice was cloned – or who was responsible – and had never even heard of AI scams before she was targeted by the 2023 attempt to trick her into handing over thousands to save her daughter Briana from kidnappers.

At the time Briana was on a ski trip, and Jennifer was on the way to pick her other daughter up from a dance rehearsal when she received the ominous call. 

Speaking to Fabulous online, Jennifer says: “I initially assumed Briana had injured herself and maybe she was calling to let me know. 

“She was crying and sobbing, and saying ‘mum, I messed up’, and then I heard a man telling her to lie down. 

“I worried maybe she was hurt and was being tobogganed off the mountain or something. 

“But she kept crying, and then suddenly she said ‘these bad men have me, help me, help me’. 

“She was screaming and pleading, and then a man came on the phone and that was the last I heard of her voice.”

Jennifer was relieved to discover her daughter was safe, but the scam raised alarming questions around how easily they mimicked her voice and cries. Credit: Supplied

Panic mode

Jennifer had been plunged into every parent’s worst nightmare.

She says: “That’s when I really went into panic mode, and I felt sick.

“My mind was racing, and I quickly started to think about what I could do to help my daughter. 

“I still don’t know how they did it but it sounded exactly like Briana. It wasn’t just her voice, it was the way she sobbed and cried out, and I didn’t for a second doubt it was her. 

“A mother knows her own child’s cry – it’s like a fingerprint, unique, and that’s what threw me off the most.

“He said if I told anyone, he was going to pump her stomach full of drugs and I’d never see her again, and asked for $1m, which they then lowered to $50,000. 

“I asked for a routing number and wiring instructions, but the man refused and demanded cash. 

“He said they would pick me up in a white van with a bag over my head, and he said if I didn’t have all the money, then we were both going to be dead.” 

‘I collapsed to the floor in tears of relief’

While still on the phone, Jennifer ran into the dance studio, and another mother grasped the seriousness of the situation and called the police, who said they’d received a spate of reports of similar calls that turned out to be AI hoaxes. 

While this gave Jennifer hope her daughter was safe, she still feared the worst, saying: “Tragically, we’d had a family friend abducted a few years before, and he’d been killed, so it was terrifying as I thought maybe this was linked to that. 

“I demanded to speak to her again, but the man refused and kept insisting he had her.”

It was only when Jennifer was thankfully able to speak to her actual daughter – who was still on the ski trip with her dad – just minutes later that she finally allowed herself to believe she was safe. 

She says: “Up until that point I’d never even heard of AI scams, and I genuinely believed it was Briana who’d been screaming for help down the phone, so while I wanted to believe she was safe, I couldn’t believe it until I talked to her and she verified it was definitely her. She’d no idea what was going on. 

“At that point, I hung up and collapsed to the floor in tears of relief.” 

Thankfully Briana was safe, and still on her ski trip – but the scam was alarmingly believable. Credit: Supplied

‘I couldn’t believe the lack of humanity’

Sadly, the culprits were never found, and there was no law under which to pursue them – so the police could only treat it as a ‘prank call’. 

Jennifer says: “It was the worst financial scam I’d heard of in my entire life – I couldn’t believe the lack of humanity, using my daughter’s voice to stage such a horrendous scenario for money was just the lowest of the low.

“Here’s your child in harm’s way, and you’re pleading for their life. It’s every parent’s worst nightmare, and it still makes me emotional just thinking about it now. 

“It sheds light on a whole new world and reality that I had no idea even existed – and that is terrifying”

Mum-of-four Jennifer DeStefano

“It wasn’t just the money – the fact they were asking where I was and planning to physically pick me up scared me to my core, as my greatest fear would be that this technology could be used to lure in children. 

“Now, we have a code word, and we’re aware to ask questions that only we would know the answer to, but even then it’s not foolproof. 

“One other mum told me she received a fake call using her son’s voice and they even knew his unique nickname somehow.

“Others have asked what the codeword is and the scammers have just hung up.

“I’ve had so many people reach out to say they’d experienced a similar thing, whether it’s a call about a kidnapping or an accident or they’re in trouble and in prison – there are loads of different scenarios, and the deep fake videos that are coming out now too are so scary. 

“We need more legislation around AI, and AI being used to aid crime. There needs to be penalties and consequences to misusing it.

“It’s a relief my daughter is safe and that situation wasn’t real, however it also sheds light on a whole new world and reality that I had no idea even existed – and that is terrifying.” 

‘Pause and think’

It’s a scenario that’s difficult to imagine yourself in, but it’s important to remember scammers will try to put you in situations that make it almost impossible to think straight.

Especially as it’s a problem that is only set to increase, with the amount of fraud committed in the UK more than doubling to £2.3bn in 2023, according to accounting firm BDO’s latest FraudTrack report.

McAfee’s Oliver Devane adds: “If you do get a call or voicemail from someone asking for help and money, just pause and think. 

“Does that really sound like your child? Even if it’s a number you recognise as theirs, it may be fake. 

“It might be wise to use a previously agreed codeword or ask a question only they would know.

“Also remember that cybercriminals are betting on emotions running high. 

“They will play on your connection to your child and create a sense of urgency to prompt you into action. Try to remain level-headed and pause before you take any next steps.”

Recent research by online protection company McAfee reveals almost a quarter of Brits have already experienced an AI voice scam. Credit: Getty
