LIJDLR

DEEPFAKE AI AND CRIMINAL LAW: A NEW AGE THREAT TO WOMEN’S SAFETY

Srishti Sehgal, B.A. LL.B (Hons.), K.R. Mangalam University (India)

Technological innovation in Artificial Intelligence (AI) has given rise to "deepfakes": hyper-realistic synthetic images, videos, and audio generated through deep learning algorithms that can convincingly depict individuals in fabricated scenarios. While the technology has creative potential, its misuse has evolved into a disturbing digital threat, particularly against women. Non-consensual sexual deepfakes, cyberstalking, identity theft, defamation, and extortion have become modern forms of gender-based violence, undermining women's dignity, privacy, and mental health. This research critically examines the intersection of deepfake technology and criminal law, assessing whether existing provisions under the Indian Penal Code (IPC) and the Information Technology Act, 2000 are sufficient to address AI-driven sexual exploitation and image-based abuse. It adopts a doctrinal, comparative, and socio-legal methodology, integrating psychological studies and international legal developments, including the U.S. Take It Down Act (2025), the U.K. Online Safety Act (2023), and the EU AI Act. Through analysis of case law, policy gaps, and emerging judicial responses, such as the Bombay High Court's 2025 deepfake-takedown order, this paper argues that India's existing legal mechanisms remain fragmented and inadequate. It advocates for dedicated deepfake legislation, mandatory takedown timelines, platform accountability, and institutional support systems for victims. The study concludes that safeguarding women in the age of artificial intelligence requires a proactive, rights-based legal framework that harmonizes technological innovation with human dignity.

Type: Research Paper
Publication: LawFoyer International Journal of Doctrinal Legal Research (LIJDLR), Volume 3, Issue 4, Pages 868–905.
License and Copyright: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. © Authors, 2025. All rights reserved.