Empowerment, style and inspiration come together in every issue of our magazine.
February 5, 2026

Artificial intelligence has become accessible and cheap, requires no programming skills, and remains largely unregulated. These are the perfect conditions for misogyny to take on a new form: deepfake pornography.
Pause for a moment and imagine this. You type your name into a search engine and an explicit video appears. A woman is having sex with multiple partners — and that alone doesn’t shock you. What does is that the woman has your face. How do you feel? What goes through your mind?
Thanks to advances in artificial intelligence, deepfake pornography has become a real threat faced by women everywhere. Millions of such videos were created last year alone. In Australia, more than 7,000 women were forced to confront the questions posed above — except for them, the situation was not hypothetical, but painfully real.
The term “deepfake” first appeared in 2017 on Reddit, where manipulated pornographic videos of celebrities began circulating. Since then, Hollywood actors, politicians — even the Pope — have been targeted. But Noelle Martin, a lawyer and human rights advocate, was among the first non-famous women to publicly speak about her experience with deepfake pornography.
She was just 18 when, out of boredom, she googled her own name. Instead of an old MySpace profile, she found hundreds of fake pornographic images and videos — all bearing her face.
“I felt physically sick,” Martin told Marie Claire Australia, describing her first reaction.
Although she knew the videos were fake, she experienced a visceral sense of disgust and a desire to hide and cleanse herself — emotions strikingly similar to those reported by survivors of sexual assault. Then came the questions: Has anyone else seen this? Someone I know? Will this affect my family? My job? The situation worsened when she went to the police, ready to pursue justice, only to realize there was nowhere to begin. The law simply did not recognize this form of abuse. Today, at 30, Martin is one of the leading activists fighting against deepfake pornography.
The situation is now far worse than when Noelle Martin first spoke out. AI has handed abusers tools capable of producing highly realistic videos. Anyone who has ever uploaded a photo online can become a target — which means every woman.
A recent NBC News investigation found that websites hosting deepfake content are easily accessible via Google search, while creators use platforms like Discord to sell customized videos. Not long ago, producing a five-minute deepfake video cost around $65. Today, it is free for anyone with access to AI tools.
What once existed on the fringes of incel culture has entered the mainstream. Estimates show that in 2023 alone, 500,000 deepfake files were created. Last year? Over eight million. Around 90 percent of that content is pornographic — and its victims are almost exclusively women.
Some videos are created for extortion, others for harassment, others simply “for fun.” But the consequences for women remain the same.
Shame. Humiliation. Dehumanization. These are the emotions most women describe upon discovering fake pornographic content featuring their face. Many report feeling dirty, violated, attacked. Social rejection often follows. Partners struggle to believe the content is fake. Families express doubt. The public response is even harsher.
In the search for proof — Is it really her? — friends and family often end up watching and spreading the video themselves, rather than stopping it. Psychologists note that the stigma and fear experienced by victims of deepfake pornography closely resemble those felt by survivors of rape.
Paris Hilton, whose intimate video spread online without her consent long before such technology was common or laws existed to protect her, has publicly called for a ban on deepfake pornography. Speaking on Capitol Hill, she delivered a message that echoed across global media: This is not a scandal. This is abuse.
Despite years of warnings, the scale of women’s vulnerability becomes fully visible only when someone you know becomes a victim. Legal protections remain scarce. In most countries, there is no clear way to prove the content is fake, to prosecute its creator, or to force platforms to remove it. Once it happens, deepfake abuse becomes a living nightmare.
Only a handful of countries have taken meaningful action:
United Kingdom: Creating, sharing or threatening to share intimate images — including deepfakes — without consent is a criminal offense. Online Safety laws enable victims to demand action from UK-registered platforms.
Australia: Amendments adopted in 2024 criminalize the distribution of non-consensual sexual material, including AI-generated content.
South Korea: Penalties for producing and distributing deepfake pornography have been increased, and even possession or viewing may carry legal consequences.
European Union: Directive 2024/1385 addresses violence against women, explicitly including AI-generated non-consensual imagery and victim protection.
Canada: Several provinces have enacted laws sanctioning the distribution of intimate images without consent.
Is legislation enough? No, but it is a necessary starting point. Without legal frameworks, victims are left completely unprotected. Laws shift blame from the victim to the perpetrator and send a clear public message: this is unacceptable. Still, legislation alone cannot undo the damage. Deepfake content can be created in minutes and spread globally almost instantly. Once harm is done, it cannot be fully erased.
That is why many experts argue the next step must involve regulating or banning tools that enable deepfake creation, holding platforms accountable for hosting and profiting from such content, and requiring clear labeling or watermarking of AI-generated material.
According to victims and analysts alike, the most powerful weapon against deepfake pornography is a shift in public narrative. Today, the first question is often: How do you know it isn’t her? Women are forced to defend themselves — to partners, families, workplaces and the public.
Deepfake pornography is about power and control.