
AI Voice Scam: Woman loses Rs 1.4 lakh to imposters; know how to keep yourself safe

An AI voice fraud in India cost a woman Rs 1.4 lakh. The case is a frightening example of how technology is aiding crime.

AI Voice Scam: A 59-year-old woman lost Rs 1.4 lakh to a sophisticated AI-generated voice scam. The fraudster cloned her nephew’s voice to spin a frightening story that left the victim shaken. The incident highlights how easily people can fall prey to technology-enabled scams. Read on to learn more.

AI Voice Scam

A woman in Hyderabad was duped out of Rs 1.4 lakh in a sophisticated new scam that is making waves around the globe. An AI-generated voice mimicked her nephew, who lives in Canada, and persuaded her to send the money with a story full of worry and urgency. The case reflects an increasingly common tactic: using AI to impersonate relatives living abroad, particularly in countries like Canada and Israel, and exploiting the strong emotional ties within families for money.

The victim fell for the scam because the AI convincingly reproduced her nephew’s speech patterns and the familiar Punjabi dialect he used with his aunt. The caller spun a false story that frightened the woman, and she made several transfers to the account he specified. Unfortunately, the victim, Prabhjyot, had already sent the money before she realised the call was fake.

How to stay safe?

AI voice scams are sophisticated, explains Prasad Patibandla, Director of Operations at the Centre for Research on Cyber Intelligence and Digital Forensics. He stressed that before transferring any funds, it is imperative to verify that the emergency is genuine.

Authorities acknowledge that AI-related frauds may not receive widespread media attention, and they urge the public to stay vigilant and confirm the legitimacy of distress calls, particularly those that appear to come from relatives living overseas.
