
The Bombay High Court has sought information from the Ministry of Information and Broadcasting on reports of an artificial intelligence (AI) bot that turns women's photos into nudes. A cyber research agency tracking deepfakes said in a report that unidentified cyber criminals have rolled out a feature that allows people to create fake nude images of women from regular photographs.
The deepfakes have already targeted at least 1,00,000 women and pose a challenge to efforts to curb online sexual abuse. Experts have warned that the technology can also affect legal proceedings, as it can be used to create false evidence to implicate just about anyone in a crime, as mentioned in a report in Hindustan Times.
The Bombay High Court, meanwhile, sought information about the reports. While hearing various PILs on alleged electronic media trial in the Sushant Singh Rajput death case, the court directed Additional Solicitor General (ASG) Anil Singh to gather information from the ministry. "If you can gather from the ministry what print media has reported... We want you to check malice in the report. Kindly check with the ministry," said the bench comprising Chief Justice Dipankar Datta and Justice Girish Kulkarni.
ASG Singh said he had gone through the report and spoken to the officers concerned. He cited Sections 69A and 79(3)(b) of the Information Technology Act, under which action could be taken.
At present, the service has roughly 1,04,000 users, most of whom appear to be from Russia. Seven Telegram groups are linked to it.
The AI bot allows a person to upload a photograph of a woman, which it returns with the clothing digitally removed. The tool is available for free, but the resulting images are watermarked. Users can pay around Rs 110 to remove the watermark.
Copyright © 2025 Living Media India Limited. For reprint rights: Syndications Today