Eemani Naveen, a resident of Guntur in Andhra Pradesh, was arrested by the IFSO unit of the Delhi Police
The original video was of another woman who was seen entering a lift in a black outfit
The Delhi Police’s Special Cell had taken up the case on November 10 after the video was uploaded on social media
Over two months after a deepfake video allegedly featuring actor Rashmika Mandanna went viral on social media, the Delhi Police on Saturday arrested the mastermind behind the case.
Eemani Naveen (23), a resident of Guntur in Andhra Pradesh, was arrested by the IFSO unit of the Delhi Police, Indian Express reported.
Naveen was pursuing a B.Tech degree and had completed a digital certification course in digital marketing from Google Garage in 2019. He ran three fan pages, one of them dedicated to Rashmika Mandanna. As the page did not have enough followers, he created a deepfake video of the actor and posted it on the Instagram fan page on October 13, after which its follower count grew from 90K to more than a lakh, the report said.
The original video was of another woman who was seen entering a lift in a black outfit. The accused allegedly used AI to morph Mandanna’s face onto the face of the woman in the video.
A deepfake is an image, video, or audio recording that has been edited using an algorithm to replace the person in the original with someone else in a way that makes it look authentic.
The Delhi Police’s Special Cell had taken up the case on November 10 after the video was uploaded on social media.
The Delhi Police then directed Meta to provide the URL of the account from which the ‘deepfake’ video of Mandanna originated. It also sought information on users who allegedly shared the fake video on social media platforms.
Soon after the video went viral, Mandanna wrote in a post on X, “I feel really hurt to share this and have to talk about the deepfake video of me being spread online. Something like this is honestly, extremely scary not only for me, but also for each one of us who today is vulnerable to so much harm because of how technology is being misused.”
All in all, the recent advancements in generative AI have only heightened these concerns, as internet users face the grim reality of misinformation and fake content online.
At first, a video featuring what looked like actor Rashmika Mandanna made the rounds on the internet. Hot on the heels of this video going viral, another deepfake video showing the face of Katrina Kaif began doing the rounds. If this was not enough, a morphed image of cricket legend Sachin Tendulkar endorsing the gaming app Skyward Aviator Quest and falsely claiming that his daughter Sara was reaping financial benefits from it made its way online.
Last month, Union Minister Rajeev Chandrasekhar said that the government would “keep an eye” on the remedial measures taken by the platforms in response to the advisories related to deepfakes.
The minister also warned that inaction on their part may prompt amendments that would make the IT Rules more “prescriptive”.
Earlier, the Ministry of Electronics and Information Technology (MeitY) rolled out an advisory directing all social media platforms to comply with existing IT rules and ensure that deepfakes and the misinformation enabled by them are curbed.
The advisory directed intermediaries to ‘clearly and precisely’ inform their users about the kinds of content that are prohibited, especially those specified under Rule 3(1)(b) of the IT Rules.
Meanwhile, YouTube India director Ishan John Chatterjee had said that deepfakes were not in the Google-owned video-sharing platform’s interest at all, as viewers, creators, and advertisers want to steer clear of platforms that allow fake news or misinformation.