Have you come across a video of Virat Kohli or Ratan Tata promoting an investment scheme promising incredible returns? If so, you have just seen a deepfake video.
Scammers are setting up over 1,000 phishing domains daily and creating deepfake videos of well-known Indian personalities like Mukesh Ambani and Virat Kohli to push dubious gaming apps, according to a recent report by cyber security firm CloudSEK. The report revealed that scammers manipulate footage of news anchors to produce fake videos, falsely portraying these figures as endorsing fraudulent apps.
"Scammers are not only using deepfake videos to lure people to download the dubious app but have also created a fake Play Store to appear genuine," CloudSEK said in a report published on October 4. The company's research indicated that over 1,000 phishing domains are registered each day to deceive users in more than seven countries.
The report explains how figures like Mukesh Ambani, Virat Kohli, Neeraj Chopra, and international icons like Cristiano Ronaldo, James Donaldson (Mr Beast), and Pakistani actress Hania Aamir have been used in these videos to promote the fake apps. CloudSEK's team, using their deepfake detection technology, uncovered campaigns designed to deceive users in India, Pakistan, Nigeria, and Saudi Arabia.
"The videos lure users by promising substantial financial rewards through minimal investments, claiming users can multiply their money simply by playing the game," CloudSEK said.
For instance, there's a video circulating online of a game called Aviator, an investment game where users can supposedly earn money by investing a small deposit. The video depicts an airplane flying with a multiplier of 17, 50, or 150, and claims that users automatically earn money by investing 1,000 Kenyan shillings. It encourages viewers to download the Aviator app and take advantage of a bonus, making the process seem straightforward and lucrative.
Modus operandi
The scammers begin by registering numerous fake domains at least a week before publishing ads on social media platforms such as Facebook and Instagram. The domains, such as Luckyavin[.]fun, typically use the [.]top, [.]fun, and [.]world top-level domains, with Belize in Central America identified as the country of origin, and are mostly hosted with an ISP called IQWeb FZ-LLC.
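To illustrate how such infrastructure might be tracked, here is a minimal, hypothetical Python sketch that flags domains using the top-level domains named in the report; the domain feed, function name, and output are illustrative assumptions, not part of CloudSEK's tooling.

```python
# Illustrative sketch only: flags domains that use the top-level domains
# CloudSEK associates with this campaign ([.]top, [.]fun, [.]world).
# The domain feed and function name are hypothetical examples.

SUSPICIOUS_TLDS = {"top", "fun", "world"}

def flag_suspicious(domains):
    """Return the domains whose top-level domain matches the campaign's favourites."""
    flagged = []
    for domain in domains:
        # Treat everything after the last dot as the TLD ("luckyavin.fun" -> "fun")
        tld = domain.rsplit(".", 1)[-1].lower()
        if tld in SUSPICIOUS_TLDS:
            flagged.append(domain)
    return flagged

if __name__ == "__main__":
    new_domains = ["luckyavin.fun", "example.com", "win-big-now.top"]  # hypothetical feed
    print(flag_suspicious(new_domains))  # ['luckyavin.fun', 'win-big-now.top']
```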
These phishing domains mimic Google Play Store or Apple App Store layouts, featuring similar designs and logos to mislead users. "To further deceive users, the sites incorporate hard-coded comments and data to appear more authentic," CloudSEK noted.
In some cases, links redirect to the fake domain only if they detect that the request originates from Facebook, using specific query parameters such as ‘fbp’ to identify such requests.
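A minimal sketch, in Python with Flask, of how such a conditional redirect could work: traffic carrying a Facebook-style tracking parameter is forwarded to the phishing page, while everyone else sees a harmless response. This illustrates the mechanism described in the report, not the scammers' actual code; the route name, parameter handling, and destination URL are assumptions.

```python
# Illustration of the redirect gating described above; NOT the scammers' code.
# The route name and destination URL are placeholders.
from flask import Flask, request, redirect

app = Flask(__name__)

@app.route("/promo")
def promo():
    # Clicks coming through Facebook ads carry tracking parameters such as 'fbp';
    # their presence is treated as a signal that the visitor arrived via the ad.
    if "fbp" in request.args:
        return redirect("https://fake-play-store.example/aviator")  # phishing page
    # Direct visitors, crawlers, and researchers get an innocuous page instead.
    return "Nothing to see here.", 200

if __name__ == "__main__":
    app.run()
```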
Deepfake videos as part of the scam
Deepfake videos are a central part of these campaigns, often featuring news anchors like Shweta Singh from Aaj Tak, Arnab Goswami from Republic TV, and Sudhir Chaudhary from Zee News. These fake news segments promote the gaming apps by claiming they have helped people from different backgrounds improve their lives.
The deepfake footage then transitions to well-known personalities like Virat Kohli, who is frequently targeted. The videos show these celebrities allegedly endorsing the app or discussing their own investments in it, to encourage viewers to do the same.
Expansion of target regions
Initially aimed at the European Union (EU) population in early September 2024, these scams have since expanded to target regions such as India, Nigeria, Pakistan, Bangladesh, Saudi Arabia, and Southeast Asia. Interestingly, the report noted no instances of deepfake use in the EU campaigns, suggesting that these tactics may be more common, or considered more effective, in certain markets.
Fake Play Store installations and payment scams
When users attempt to install the app from the fake Play Store, they are prompted to install a proxy_chrome application that then launches a fraudulent ‘1win’ login. Once inside the app, users must top up a minimum of Rs 300 to play, a small sum that keeps the barrier to entry low.
Payment methods vary by country and include options such as UPI, AstroPay, Visa, Mastercard, and cryptocurrencies like Bitcoin (BTC), Ethereum (ETH), and USDT. This range of options adds to the challenge of tracing transactions and holding scammers accountable.
The method bears similarities to pig butchering scams, where players are given small initial profits to encourage them to invest more heavily later on, leading to greater financial losses. Multiple users have reported that customer support numbers listed on these sites are unresponsive.
In what other cases is deepfake technology used?
While deepfake technology initially gained traction in the movie industry for creating special effects and animations, it has now evolved into a tool with various applications, some of which raise serious ethical and legal concerns.
"Many of these scams utilise audio deepfakes, creating what are known as "voice skins" or "clones" that allow scammers to pose convincingly as prominent figures. If you receive a call from someone claiming to be a business partner or client asking for money, it's crucial to verify their identity, as it could be a scam using this technology, according to Norton, an anti-virus software developer.
In April, India's National Stock Exchange (NSE) warned investors about deepfake videos featuring its CEO, Ashishkumar Chauhan, giving stock recommendations. The NSE issued this alert after detecting that Chauhan's face and voice were being manipulated in videos offering fraudulent investment and stock advice. "Such videos seem to have been created using sophisticated technologies to imitate the voice and facial expressions of Ashishkumar Chauhan," the NSE stated.
In another incident in February, a Hong Kong-based company fell victim to a deepfake scam that led to a financial loss of over $25.6 million (more than Rs 200 crore). An imposter used a deepfake to portray himself as the company's Chief Financial Officer (CFO) during a conference call, ordering a money transfer.
How to spot deepfake videos
According to Norton, several telltale signs can help detect deepfake videos (a rough programmatic check for one of them is sketched after the list), including:
Unnatural eye movement and lack of blinking
Unnatural facial expressions
Facial morphing, where one image is poorly stitched onto another
Unnatural body shapes or proportions
Synthetic or unrealistic hair
Abnormal skin colours
Awkward head and body positions
Inconsistent head movements
Strange lighting or colour discrepancies
Poor lip-syncing
Robotic-sounding voices
Digital background noise
Blurry or misaligned visuals
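Most of these signs are judged by eye, but one of them, the lack of blinking, can be roughly approximated in code. The sketch below is a crude, hypothetical heuristic using OpenCV's stock Haar cascades to measure how often a visible face shows no detectable eyes; the parameters are arbitrary assumptions, and this is not a production deepfake detector.

```python
# Crude heuristic sketch: approximates the "lack of blinking" sign by counting
# frames in which a face is visible but no eyes are detected. Cascade
# parameters are assumptions; this is not a real deepfake detector.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def blink_proxy(video_path):
    """Return the share of face frames in which no eyes were detected."""
    cap = cv2.VideoCapture(video_path)
    face_frames = closed_eye_frames = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        for (x, y, w, h) in faces[:1]:  # consider only the first detected face
            face_frames += 1
            roi = gray[y:y + h, x:x + w]
            if len(eye_cascade.detectMultiScale(roi, 1.1, 10)) == 0:
                closed_eye_frames += 1
    cap.release()
    return closed_eye_frames / face_frames if face_frames else 0.0

# A genuine clip normally shows occasional "eyes closed" frames from blinking;
# a ratio near zero over a long video is one weak hint worth a closer look.
```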
Efforts to combat deepfakes
Researchers are making strides in developing technology to detect deepfakes. For instance, a group of students known as "Team Detectd" from Raisoni College of Engineering in Nagpur has developed AI models based on neural networks capable of identifying manipulated audio, images, and videos with over 90% accuracy. As of their latest report, the team had accurately identified more than 7,000 deepfake videos with a 96% success rate.