Tamil Nadu Police Issues Advisory On New Scam Using AI-assisted Voice Cloning
SUMMARY

The Tamil Nadu police said cyber criminals are using AI to replicate the voices of family members and acquaintances over phone calls to trick victims into sending money

The advisory asked people to stay informed about common scams such as voice cloning fraud and look out for warning signs

The development comes at a time when there are concerns about the misuse of AI and a number of deepfake videos of prominent personalities are circulating on the internet


The cyber crime unit of the Tamil Nadu police has issued an advisory warning the general public against scamsters using artificial intelligence (AI) to clone voices and make calls to extort money from victims.

Citing a press note issued by the Tamil Nadu police, MediaNama reported that cyber criminals are using AI to replicate the voices of family members and acquaintances over phone calls to trick victims into sending money under the pretext of an emergency.

The advisory said that scamsters obtain a voice sample of someone the victim knows and trusts, such as a family member or friend, from their social media accounts or by calling under the ‘wrong number’ ruse.

The voice sample is then used to impersonate the person’s voice using AI software. 

This technology allows the fraudster to accurately imitate the voice of a victim’s loved one. The scamster then calls the victim, imitating the voice of a trusted contact and faking an emergency. The resulting sense of distress tricks victims into transferring money immediately.

The cyber criminals employing AI-assisted techniques to scam people often ask victims to use payment methods such as UPI to send money.

In its advisory, the Tamil Nadu police advised people to stay informed about common scams such as voice cloning fraud and look out for warning signs.

It also cautioned people against unusual requests for money, particularly those involving stressful situations and emotional manipulation.

Furthermore, the advisory calls on the public to verify the identity of people seeking urgent financial assistance using secure communication channels, such as encrypted messaging apps or video calls. 

The development comes at a time when there are concerns about the misuse of AI and deepfakes.

Recently, Prime Minister Narendra Modi, during an interaction with Microsoft cofounder Bill Gates, also flagged the possibility of misuse of AI without proper training.

PM Modi also called for using watermarks for AI-generated content to address the concerns over AI’s misuse.

Meanwhile, the number of deepfakes is also on the rise in the country. Recently, stock exchanges BSE and NSE, in separate statements, warned the general public about fake videos of their respective CEOs.

Actor Ranveer Singh also filed an FIR against an AI-generated deepfake video doing the rounds on social media, in which he was purportedly heard voicing his political views.

Note: We at Inc42 take our ethics very seriously. More information about it can be found here.
