
A man on TikTok has shared a warning with viewers after he received a phone call from someone who sounded exactly like his sister. He claims she was in hysterics before passing the phone to a man who told him his sister had been abducted by a cartel.
TikToker “Captaindarebear” shared his PSA for “people who will probably go through this in the future.” In his video, he explains that he received a call from an unknown number, something he would usually ignore; however, because he had been expecting a call, he answered it. To his surprise, it was his sister’s voice on the other end, and he describes her as sounding hysterical.
Naturally, the man thought his sister had been involved in some sort of car accident, so he tried his best to calm her down. After he unsuccessfully tried to figure out what was going on, his sister passed the phone to a man who claimed he was a member of a cartel: “He tells me, ‘Your sister was in the wrong place at the wrong time. She’s not injured. However, I’m a member of this cartel.’”
At this point, Captaindarebear realized this was a scam and promptly hung up before calling his father to check that his sister was with him. Thankfully, she was, which confirmed his suspicions.
This is a very dangerous scam
While it’s a relief that the sister was never in any danger, it’s terrifying to think how advanced these scams are getting. Captaindarebear claims that the scammers used A.I. to almost perfectly recreate his sister’s voice, despite the fact that she isn’t on social media. This raises the question: how did they even have access to recordings of her voice in the first place? If it can happen to her, it can happen to anyone.
This scam could easily trick a lot of people. Thankfully, Captaindarebear had heard about it before, but others might not be so lucky; even the most discerning people could be fooled if they aren’t aware of how A.I. is being used by scammers. Hearing a loved one in distress could easily catch someone off guard, stopping them from thinking with a cool head and leading them to fall for it.
“How is this even possible?” asked one viewer. “Where are they able to get her exact voice?” Others called for A.I. laws and regulations to be brought in faster. Some joked that they would use fake voices whenever answering unknown calls in the future, to keep scammers from recording their real voices.
This is pretty disturbing news, but are we really surprised that A.I. is being used this way? Scammers aren’t exactly the most moral people. Hopefully, more people will educate themselves on the topic before they fall victim to this insidious scam.