Saturday, March 22nd, 2025

Kolkata case shows how the dirty business of videos of crimes against women is spreading rapidly

New Delhi: The brutal rape and murder of a trainee doctor in Kolkata has sparked nationwide protests. Now another disturbing trend has emerged: a sudden surge in online searches for non-consensual sex videos, while social media apps are flooded with channels claiming to offer glimpses of ‘rape’ for a fee. Some people are even making money off this tragedy.

This racket is flourishing on the internet

Google Trends data shows that searches for keywords like ‘rape video Kolkata’ spiked on August 15 and again on August 17. Keywords such as ‘Dr xxxx rape video’ and ‘xxxx porn’, using the victim’s name, kept trending throughout the week. According to a report by The Publica, the victim’s name is being searched alongside actual pornographic videos on Xvideos India. Messaging apps like Telegram are flooded with channels selling such videos, some of which claim 7,000-9,000 subscribers. The links lead people to cloud storage apps like TeraBox and to pornographic websites. Sellers of ‘force videos’ are also active on Instagram. Although the Supreme Court has ordered the removal of all information, pictures and videos related to the Kolkata doctor, this illegal business is flourishing in secret.

Such keywords have trended before

Making money is not always the motive; sometimes such videos are used to control the victim. Searches for ‘Prajwal Revanna sex tape’ trended between April 28 and May 4, when the scandal involving the Karnataka MP hit the headlines. He is accused of sexually abusing several women and filming them in order to blackmail them. In January this year, UP police arrested six men who had raped a ninth-grade student for over two years and blackmailed her with videos and photographs of the assaults. In November 2023, a homestay employee in Agra suffered a similar ordeal when five men gang-raped her and filmed it to keep her silent and compliant.

‘Videos of small children are also available’

Women’s rights activist Sunitha Krishnan says she receives rape videos every day, sometimes of children as young as six months or a year old. Krishnan, co-founder of the anti-trafficking NGO Prajwala, was so disturbed by these videos that she decided to set a trap by posing as a customer. “On November 1 at 1:51 pm, Seller 3 advertised that they had new goods for Rs 200. I haggled and got the price reduced to Rs 50 and paid using the previous QR code… Pornographic material in 6 folders was sent to me on a private chat. The folders contained CSAM (child sexual abuse material)… as well as adult pornographic material. There were also videos of gang rape, in which the victims and the perpetrators appeared to be Indians,” she said.

Such videos are easily available

This is an excerpt from the complaint Krishnan filed with the Telangana police in November last year, detailing how easily she procured adult pornographic material, including videos of rape and gang rape. She interacted with sellers using names like ‘Dada Vyapaar’ and ‘Masoom Ladka’ and bought CSAM and rape videos. “In just two days, I was able to buy over 9,000 videos for just Rs 532,” she says. She submitted the names and bank account details of five people who sold the objectionable material; they are yet to be caught. If no action is taken even after eight months, she asks, how will any deterrence be created?

Social media has made it easier

Cyber psychologist Nirali Bhatia, founder of CyberBAAP, says media and social media have normalised voyeurism. “Over-reliance on technology and excessive exposure to violent content have desensitised and dehumanised us. Everything has become material, and we no longer feel emotionally attached to anything,” she says. This also explains why people start filming a drowning person instead of saving them.

Criminals are getting help from AI

The challenge for authorities is both the sheer volume of such content and the difficulty of tracking encrypted channels. AI has made it even easier to create pornographic content. Last week, the city of San Francisco filed suit against 18 websites and apps that create fake nudes of women and girls: they take an ordinary, clothed photo of a person and generate a fake nude from it. Dissemination is easy too: start an anonymous account, upload the video, then disappear. On messaging apps like Telegram, encryption in private channels ensures there is no way to trace who is spreading a video. Channels on these apps can be shut down, but criminals simply open another and continue selling the content.

Such videos are often used to target people

Barkha Chakravarty of the NGO Breakthrough Trust, which works with youth in UP, Delhi, Jharkhand and Haryana to combat gender-based violence, says sharing such videos can lead to re-victimisation. Often, she says, misinformation and disinformation are used to defame or harass a woman, especially if she is outspoken; this ranges from fake news to attacks on a woman’s identity. Dr Nayreen Daruwala, programme director for prevention of violence against women and children at the NGO Sneha, says this deepens the victims’ pain. Women who have survived sexual violence take a very long time to recover; recording the violence in this way hinders their recovery and prolongs their suffering.

Krishnan says she petitioned the Supreme Court in 2015 on the issue of the sharing of CSAM and rape videos. Her demand for a national-level agency to investigate those who create and sell such content has not been met, she says; be it 2015 or 2024, the status quo remains. Breakthrough’s Chakravarty says the only prevention is better awareness: the NGO trains young people to spot propaganda and misinformation and to question what they see.
