Artificial intelligence voice cloning identified as significant concern, prompting authorities at the Cybercrime Investigation Centre of Excellence, Crime Branch J&K, to issue an advisory


NASIR ALI

Srinagar: An AI voice cloning scam is one in which criminals use artificial intelligence to create a convincing replica of someone’s voice, something that can be done with as little as three seconds of audio. The criminal then uses the cloned voice to make phone calls, send voice messages, or create other communications that appear to come from the victim. Such scams work in a number of ways. In one common scenario, the criminal creates a fake voicemail or message that appears to be from the victim, typically containing a request for money or other sensitive information. If the target falls for the scam, they may hand the criminal exactly what is needed to steal their identity or commit other crimes.

A recent story reported by CNN highlights an incident where a mother received a call from an unknown number. When she answered the phone, it was her daughter. The daughter had allegedly been kidnapped and was phoning her mother to pass on a ransom demand.

In fact, the girl was safe and sound; the scammers had made a deepfake of her voice. This is not an isolated incident, and variations of the scam include a supposed car accident, in which the “victim” calls family members asking for money to help them out after a crash.

A recent report by McAfee suggested that a majority (69 percent) of Indians are unable to distinguish between a genuine human voice and an AI-generated one. Because each person’s voice is distinctive, it can be considered a biometric fingerprint that establishes credibility.

However, the prevalent practice of sharing voice data online, which 86 percent of Indian adults do at least once a week on social media and through voice notes, has made voice cloning a potent weapon for cybercriminals.
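
To make the idea of a voice as a “biometric fingerprint” concrete, the short Python sketch below uses the open-source Resemblyzer library to reduce voice clips to fixed-length speaker embeddings and compare them. This is purely illustrative: neither the McAfee report nor the advisory names any specific tool, and the file names are placeholders.

# Minimal sketch: a voice as a "biometric fingerprint" via speaker embeddings.
# Assumes the open-source Resemblyzer package (pip install resemblyzer);
# the audio file names are placeholders.
import numpy as np
from pathlib import Path
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()  # pretrained speaker-verification model

# Load and normalise two short recordings (a few seconds each is enough).
clip_a = preprocess_wav(Path("speaker_clip_a.wav"))
clip_b = preprocess_wav(Path("speaker_clip_b.wav"))

# Each clip becomes a 256-dimensional, unit-length embedding: this vector is
# the "fingerprint" derived from tone, accent and speech patterns.
embed_a = encoder.embed_utterance(clip_a)
embed_b = encoder.embed_utterance(clip_b)

# Cosine similarity is a plain dot product because the embeddings are unit length.
# Clips from the same speaker typically score close to 1.0; different speakers score lower.
similarity = float(np.dot(embed_a, embed_b))
print(f"Voice similarity: {similarity:.2f}")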


Talking with ‘Asia Speaks’, Dy SP, Cybercrime Investigation Centre of Excellence, Crime Branch J&K, Musadiq Basu said, “Cybercrimes are not static; they continue to change. Cybercriminals are known for their adaptability and the constant evolution of their tactics to exploit vulnerabilities and gain financial benefit. In the past, scammers employed various tactics such as steganography, video calling and KYC-update calls to exploit individuals. However, due to concerted efforts by the concerned authorities, these crimes have seen a decline in reported cases. Mass awareness campaigns have played a pivotal role in equipping the public with the knowledge to counter these scams effectively.”

Basu added, “As technology continues to advance, new opportunities for cybercrime emerge. Cybercriminals stay updated with the latest developments in software, hardware and communication technologies to find new ways to exploit them. They often target human psychology through social engineering tactics, crafting convincing phishing emails, messages and calls to manipulate individuals into revealing sensitive information or taking harmful actions. Cybercriminals frequently take advantage of emerging trends.”

He said, “Ultimately, the primary motivation for cybercriminals is financial gain. They adapt their tactics to maximize profits while minimizing the risk of getting caught. Cybersecurity professionals and law enforcement work to develop defenses against cyber threats; when one side develops a new defense mechanism, the other side works to find a way around it. Any cybercrime complaint can be lodged on the NCRP portal (https://i4c.mha.gov.in/) or by dialling 1930.”

In relation to the AI cloning scam, Mr. Basu mentioned that no specific awareness notification has been issued for this particular scam. Instead, awareness efforts generally follow the guidelines provided by the Indian Cybercrime Coordination Centre. These guidelines have been endorsed and disseminated to the general public to ensure that people are well informed about potential threats, including AI cloning scams. The aim of these efforts is to equip the masses with the knowledge and understanding needed to recognize and safeguard against such scams.

Meanwhile, the advisory reads, “Artificial Intelligence-based voice cloning is the new cybercrime in the news. Apple iOS 17 and various apps and sites like HeyGen, Murf and Resemble AI etc. are being used for voice cloning.” Criminals are using computer-generated voices to pose as friends and relatives, and in the process money can be lost.

Sharing the modus operandi, the advisory further reads, “The AI/ML (artificial intelligence/machine learning) model learns the patterns and characteristics from the person’s voice recordings (speech patterns, accents, voice inflection and even breathing). Learning includes pronunciation of words, tone of the voice and emotions.”
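
As a purely illustrative sketch of the workflow the advisory describes, the Python example below uses the open-source Coqui TTS library and its XTTS v2 model (an assumption; the advisory does not name any specific software, and the file names are placeholders). Given only a few seconds of reference audio, such a model reproduces a speaker’s tone and inflection for arbitrary new text, which is why voice notes shared publicly can be enough raw material for a convincing clone.

# Minimal sketch of zero-shot voice cloning, assuming the open-source Coqui TTS
# package (pip install TTS) and its pretrained XTTS v2 model; file names are placeholders.
from TTS.api import TTS

# Load a pretrained multilingual voice-cloning model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short reference clip is enough for the model to pick up the speech patterns,
# accent and inflection described in the advisory and speak new text with them.
tts.tts_to_file(
    text="Hello, this is a demonstration of a cloned voice.",
    speaker_wav="reference_clip.wav",  # a few seconds of the target speaker
    language="en",
    file_path="cloned_output.wav",
)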
