Two of the biggest deepfake pornography websites have now started blocking people trying to access them from the United Kingdom. The move comes days after the UK government announced plans for a new law that will make creating nonconsensual deepfakes a criminal offense.
Nonconsensual deepfake pornography websites and apps that “strip” clothes off of photos have been growing at an alarming rate—causing untold harm to the thousands of women they are used to target.
Clare McGlynn, a professor of law at Durham University, says the move is a “hugely significant moment” in the fight against deepfake abuse. “This ends the easy access and the normalization of deepfake sexual abuse material,” McGlynn tells WIRED.
Since deepfake technology first emerged in December 2017, it has consistently been used to create nonconsensual sexual images of women—swapping their faces into pornographic videos or allowing new “nude” images to be generated. As the technology has improved and become easier to access, hundreds of websites and apps have been created. Most recently, schoolchildren have been caught creating nudes of classmates.
https://www.wired.com/story/the-biggest-deepfake-porn-website-is-now-blocked-in-the-uk/
The finding that 415,000 deepfake images were posted online last year was made by Genevieve Oh, a researcher who analyzed the top ten websites that host such content.
Popular search engines like Google and Bing are making it easy to surface nonconsensual deepfake pornography by placing it at the top of search results, NBC News reported Thursday.
These controversial deepfakes superimpose faces of real women, often celebrities, onto the bodies of adult entertainers to make them appear to be engaging in real sex. Thanks in part to advances in generative AI, there is now a burgeoning black market for deepfake porn that could be discovered through a Google search, NBC News previously reported.
NBC News uncovered the problem by turning off safe search, then combining the names of 36 female celebrities with obvious search terms like "deepfakes," "deepfake porn," and "fake nudes." Bing generated links to deepfake videos in top results 35 times, while Google did so 34 times. Bing also surfaced "fake nude photos of former teen Disney Channel female actors" using images where actors appear to be underaged.
BOSSIER PARISH, La. (WVUE) - A 32-year-old Louisiana man arrested on child pornography allegations is now the first in the state to face charges under a new law aimed at safeguarding individuals from the misuse of deepfake technology.
Deepfake technology uses artificial intelligence to create highly realistic photos and videos. Deepfakes are becoming more realistic, easier to access, and have added to an era of disinformation.
According to the Bossier Parish Sheriff’s Office, Rafael Valentine Jordan was arrested on Lylac Lane, less than half a mile from Bellaire Elementary School in Bossier, on Nov. 17 and booked into jail on one count of juvenile pornography.
During the investigation, authorities say they discovered 436 images of child pornography created using deepfake technology. On Dec. 1, a second count of juvenile pornography was added, along with two counts of unlawful deepfake creation.
https://www.wafb.com/2023/12/27/bossier-man-jailed-child-porn-also-states-first-face-new-deepfake-law/
dec23
'Violating and dehumanising': How AI deepfakes are being used to target women
https://news.yahoo.com/violating-dehumanising-ai-deepfakes-being-110449132.html?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAAA9VfaNnCXE8iDgTXcghLxia5OGFGrnxxj2V_YmvTnvRm9TlOd2lQCA9yQ5-fjKPMuHrnAJl6VnxCl6r7rxVx_qNEtBbc44aXLabi5sTw2-AljlAJG9uV4HuvtuzpyHAvSPGi5b_EM3-cPVNe5tMITbsj0wRT3B5mZbnykjlFhLY
dec23
Over 24 million people visit websites that let them use AI to undress women in photos: Study
Graphika, a social network analysis company, revealed that a whopping 24 million people visited these undressing websites in September alone, highlighting a troubling surge in non-consensual pornography driven by advancements in artificial intelligence. Here are the details.
Opinion: The rise of deepfake pornography is devastating for women
Deepfake Porn Is Out of Control
New research shows the number of deepfake videos is skyrocketing—and the world's biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.
A new analysis of nonconsensual deepfake porn videos, conducted by an independent researcher and shared with WIRED, shows how pervasive the videos have become. At least 244,625 videos have been uploaded to the top 35 websites set up either exclusively or partially to host deepfake porn videos in the past seven years, according to the researcher, who requested anonymity to avoid being targeted online.
https://www.wired.com/story/deepfake-porn-is-out-of-control/
jul23
In an essay for The Atlantic last month, Nina Jankowicz wrote about what it was like to discover that she'd been deepfaked into pornographic material.
Noelle Martin was just 18 when she discovered that pornographic pictures of her were being circulated online. She had no recollection of taking, let alone sharing, intimate images. Yet that was her face in those images; the body, however, wasn’t hers.
She became a victim of what would later be known as deepfakes. Pornographic pictures had been manipulated to look like her by using images she had shared on her personal social media accounts.
A 22-year-old Long Island man has been sentenced to six months in jail and must register as a sex offender for taking photos from the social media accounts of nearly a dozen women when they were in middle and high school, altering them to make them sexually explicit and then posting them on a porn website for years, prosecutors say.
Patrick Carey, who was posting the fake images up to within hours of his 2021 arrest, also shared the women's personal identifying information, including full names, phone numbers and addresses -- and encouraged other users on the porn site to harass and threaten them with violence, according to court documents.
Carey pleaded guilty in December to multiple felonies in the deepfake scheme, including promoting a sexual performance by a child, aggravated harassment as a hate crime and stalking.
At Tuesday's sentencing, the Seaford man was ordered to stay away from each of the 11 victims -- a judge issued orders of protection lasting the statutory eight years maximum each. He will also be subject to 10 years of probation on top of the jail time and sex offender requirements.
A gaming YouTuber has said she had to pay hundreds of pounds to remove pornography that used deepfake technology to make it look like she featured in it.
Sunpi has made a name for herself on YouTube by creating content about video games and gaming culture, with posts showing her travelling to play games, sharing her reactions to new titles and showing off her gaming setup.
Found through Google, bought with Visa and Mastercard: Inside the deepfake porn economy.
Making Deepfakes Gets Cheaper and Easier Thanks to A.I.
Hundreds of sexual deepfake ads using Emma Watson’s face ran on Facebook and Instagram in the last two days
https://www.nbcnews.com/tech/social-media/emma-watson-deep-fake-scarlett-johansson-face-swap-app-rcna73624
feb23
Amid the fallout, the Twitch streamer “Sweet Anita” realized deepfake depictions of her in pornographic videos exist online.
“It’s very, very surreal to watch yourself do something you’ve never done,” Twitch streamer “Sweet Anita” told CNN after realizing last week her face had been inserted into pornographic videos without her consent.
“It’s kind of like if you watched anything shocking happening to yourself. Like, if you watched a video of yourself being murdered, or a video of yourself jumping off a cliff,” she said.
https://edition.cnn.com/2023/02/16/tech/nonconsensual-deepfake-porn/index.html
jan23
Existing and proposed laws will fail to protect EU citizens from nonconsensual pornographic deepfakes—AI-generated images, audio, or videos that use an individual’s likeness to create pornographic material without the individual’s consent. Policymakers should amend current legislative proposals to better protect victims and, in the meantime, encourage soft law approaches.
Although deepfakes can have legitimate commercial uses (for instance, in film or gaming), 96 percent of deepfake videos found online are nonconsensual pornography. Perpetrators superimpose the likeness of an individual—most often an actor or musician, and almost always a woman—onto sexual material without permission. Sometimes perpetrators share these deepfakes for purely lewd purposes, while other times it is to harass, extort, offend, defame, or embarrass individuals. With the increasing availability of AI tools, it has become easier to create and distribute deepfake nonconsensual pornography.
There are no specific laws protecting victims of nonconsensual deepfake pornography, and new proposals will fall short.
https://datainnovation.org/2023/01/eu-proposals-will-fail-to-curb-nonconsensual-deepfake-porn/
Designed to abuse? Deepfakes and the non-consensual diffusion of intimate images
Synthese, Article number: 20130 (2023)
Abstract
The illicit diffusion of intimate photographs or videos intended for private use is a troubling phenomenon known as the diffusion of Non-Consensual Intimate Images (NCII). Recently, it has been feared that the spread of deepfake technology, which allows users to fabricate fake intimate images or videos that are indistinguishable from genuine ones, may dramatically extend the scope of NCII. In the present essay, we counter this pessimistic view, arguing for qualified optimism instead. We hypothesize that the growing diffusion of deepfakes will end up disrupting the status that makes our visual experience of photographic images and videos epistemically and affectively special; and that once divested of this status, NCII will lose much of their allure in the eye of the perpetrators, probably resulting in diminished diffusion. We conclude by offering some caveats and drawing some implications to better understand, and ultimately better counter, this phenomenon.
A Lehi man was arrested Tuesday and accused of making "deep fakes" of pornography that included the faces of children placed on adult bodies.
Jesse John Campbell, 44, was booked into the Utah County Jail by agents from the Internet Crimes Against Children task force for investigation of 10 counts of sexual exploitation of a minor, sexual abuse of a minor and lewdness involving a child.
The investigation began in early November when a family member allegedly discovered a video on Campbell's phone. The pornographic video included an adult woman's body with the head of a teen girl superimposed on it, according to a police booking affidavit.
"According to witnesses, Jesse is known to use 'deep fake' software to alter videos. In the video regarding the victim in this investigation, Jesse used the child victim's face and replaced it on another female's body who is engaged in sexually explicit conduct," the affidavit states.
ksl.com/article/50519297/lehi-man-arrested-in-deep-fakes-child-pornography-investigation-
CASE
One quiet winter afternoon, while her son was at nursery, 36-year-old Helen Mort, a poet and writer from South Yorkshire, was surprised when the doorbell rang. It was the middle of a lockdown; she wasn’t expecting visitors or parcels. When Helen opened the door, there stood a male acquaintance – looking worried. “I thought someone had died,” she explains. But what came next was news she could never have anticipated. He asked to come in.
“I was on a porn website earlier and I saw… pictures of you on there,” the man said solemnly, as they sat down. “And it looks as though they’ve been online for years. Your name is listed, too.”
Initially, she was confused; the words ‘revenge porn’ (when naked pictures or videos are shared without consent) sprang to mind. But Helen had never taken a naked photo before, let alone sent one to another person who’d be callous enough to leak it. So, surely, there was no possible way it could be her?
“That was the day I learned what a ‘deepfake’ is,” Helen tells me. One of her misappropriated images had been taken while she was pregnant. In another, somebody had even added her tattoo to the body her face had been grafted onto.
Despite the images being fake, that didn’t lessen the profound impact their existence had on Helen’s life. “Your initial response is of shame and fear. I didn't want to leave the house. I remember walking down the street, not able to meet anyone’s eyes, convinced everyone had seen it. You feel very, very exposed. The anger hadn't kicked in yet.”
Nobody was ever caught. Helen was left to wrestle with the aftereffects alone. “I retreated into myself for months. I’m still on a higher dose of antidepressants than I was before it all happened.” After reporting what had happened to the police, who were initially supportive, Helen’s case was dropped. The anonymous person who created the deepfake porn had never messaged her directly, removing any possible grounds for harassment or intention to cause distress.
https://www.cosmopolitan.com/uk/reports/a41534567/what-are-deepfakes/
Arms folded and eyes smiling, I look completely at ease commanding a boardroom, but there’s one major issue with the photo that’s flashing up on the computer screen before me — I am completely naked.
In less than 15 seconds I have been digitally undressed by an easily accessible AI tool and I am horrified at the computer-generated clone staring back at me.
what is the motivation?
(NETHERLANDS)
The police arrested a 38-year-old man from Amersfoort for making a deepfake porn video of TV presenter and journalist Welmoed Sijtsma. The police questioned the man and then released him from custody, but he remains a suspect, the Amsterdam Public Prosecution Service (OM) confirmed to AD.
According to the OM, the man used footage of the TV presenter to create a pornographic video using deepfake technology. He placed Sijtsma’s head on the body of a porn actress.
Sijtsma, 32, became aware of the fake porn video of her circulating online last year. She decided to make a four-part docuseries about it for broadcaster WNL. The last episode of Welmoed en de sexfakes will air on Thursday evening.
Mr Deepfakes can make you a porn star
His website will hijack your identity
https://unherd.com/2022/10/mr-deepfakes-can-make-you-a-porn-star/
Scrolling through her Twitter feed one evening, Kate Isaacs stumbled across a disturbing video among her notifications.
"This panic just washed over me," Kate says, speaking publicly for the first time about what happened. "Someone had taken my face, put it on to a porn video, and made it look like it was me."
Kate had been deepfaked. Someone had used artificial intelligence to digitally manipulate her face onto someone else's - in this case a porn actress.
The deepfake video on Twitter - with Kate, who campaigns against non-consensual porn, tagged - had been made using footage from TV interviews she had given while campaigning. It appeared to show her having sex.
Mr. Deepfakes
Fast forward to today, and a leading site specifically created to host deepfake celebrity porn sees over 13 million hits every month (more than double the population of Scotland). It displays performative rules claiming not to allow requests for ‘normal’ people to be deepfaked, but the chatrooms are still full of guidance on how to DIY the tech yourself and of people taking custom requests. Disturbingly, the most commonly deepfaked celebrities are ones who found fame at a young age, which raises another stomach-twisting question: when talking about deepfakes, are we also talking about the creation of child pornography?
It was through chatrooms like this that I discovered the £5 bot that created the scarily realistic nude of myself. You can send a photograph of anyone, ideally in a bikini or underwear, and it’ll ‘nudify’ it in minutes. The freebie version of the bot is not all that realistic. Nipples appear on arms, lines wobble. But the paid-for version is often uncomfortably accurate. The bot has been so well trained to strip down the female body that when I sent across a photo of my boyfriend (with his consent), it superimposed an unnervingly realistic vulva.
https://www.cosmopolitan.com/uk/reports/a41534567/what-are-deepfakes/
aug22
Deep Fakes Are Becoming More Harmful for Women
AI technology is now so sophisticated that you can't always believe your eyes.
https://www.psychologytoday.com/us/blog/womans-place/202208/deep-fakes-are-becoming-more-harmful-women
Pornographic deepfake: "My body is not an object that can be used without my permission"
https://www.marieclaire.fr/deepfake-pornographique-mon-corps-n-est-pas-un-objet-dont-on-peut-se-servir-sans-ma-permission,1425880.asp
AHMEDABAD: Sextortionists are now turning to AI (artificial intelligence) to up their blackmailing game. Their latest weapon is deepfake videos — digitally altered footage of real people showing stuff they never did. The worst victims are, of course, women.
Top sources say that online sextortionists from Mewat in Haryana, who earlier used screenshots of their male victims in compromising positions on WhatsApp chats to blackmail them, have gotten a tad creative.
They now use deepfake apps to create porn clips starring television actresses to target moneybags.
Using AI, the starlet’s face is superimposed on the face of the woman in the original clip to create a deepfake porn clip which is used to lure gullible men. “The end product will look like the celebrity was part of the porn clip,” said a police official.
Recently, a city lawyer who was contemplating suicide after losing Rs 3 lakh to sextortionists called a suicide prevention helpline. That is when the police learnt about the new modus operandi. The offenders had threatened to make his footage public.
The police are yet to register a case and are trying to track the criminals’ digital fingerprints. According to sources, the offenders first collect information about their ‘prey’ from their public social media profiles.
"This defendant allegedly manipulated the photos of more than a dozen women, taken when they were teenagers, and posted the ‘deepfake’ images online for strangers’ sexual gratification," Acting Nassau County District Attorney Joyce Smith said in a statement. "His depravity deepened when he allegedly shared the victims’ personal identifying information – including their home addresses – encouraging site visitors to harass and threaten the women with sexual violence." Prosecutors say there may potentially be dozens of other victims and encourage anyone who feels they may have been victimized by Carey to call the Nassau County district attorney's office at 516-571-2553. "These images are illicit and weren't consensually sent in. The people posting them are posting them without permission and that is part of the draw of this website," said Senior Investigative Counsel and Assistant District Attorney Melissa Scannell. "There are probably 50 women he did this to, so we think there are more people out there."
https://www.nbcnewyork.com/news/local/crime-and-courts/ny-man-indicted-in-depraved-deepfake-online-sex-scheme-targeting-at-least-14-women/3454461/
A researcher named Ajder, who worked on a deepfake report for Sensity, told the Japan Times: "The vast, vast majority of harm caused by deepfakes right now is a form of gendered digital violence."
They added that they worked on a study which indicated millions of women had been targeted by deepfake porn.
https://www.thesun.co.uk/tech/17101334/deepfake-revenge-porn-websites-apps-exposed-study/
An Australian woman whose life was "shattered" by deepfake porn says proposed law changes to address the attacks in New Zealand will help empower fellow victims. Labour MP Louisa Wall is fighting to ensure victims of the attacks have the same recourse under the Harmful Digital Communications Act as other survivors of online abuse. (...) Noelle Martin, 27, was only a teen when her world was "shattered" after she discovered her image had been used in deepfake porn without her consent. Wall said there's no criminal pathway under the Harmful Digital Communications Act for victims to hold those responsible to account. The MP said she found out about the issue after she had proposed an amendment to the law earlier this year which would explicitly make the posting of intimate images and recordings without consent illegal. https://www.nzherald.co.nz/nz/calls-from-mps-and-survivor-for-protections-for-deepfake-porn-victims/SXIQAOQSEVEO4X6S2AKZEE242A/
nov21
Gibi is a YouTuber with around 3.8 million subscribers on her ASMR-focused YouTube channel. She has experienced abuse by people using “deepfakes”, AI-generated images of someone’s likeness.
“My deepfakes have been around ever since I started my YouTube channel. I’ve seen how it has gotten very good so that makes me extremely nervous because I know how fast technology can advance.
When I first saw a deepfake, I was reading about how the computer learns and gets better at matching your face and putting it onto something pornographic. Watching the videos is very surreal — people believe it’s real. The thing that bothers me is I did not consent for my image to be used that way, they are able to do it with no consequences and it feels very violating. I contemplated deleting my channel because I felt very overwhelmed.
It’s something that I just keep working through and I do my best to protect my privacy. Do I ever feel safe? Not really!
I used to keep tabs on the deepfakes until it felt useless, if you let it consume you it’s going to waste your time and that’s not what I want. Sometimes people will email them to me, like “Gibi, somebody made porn of you!” I even saw that somebody was doing commissions, making money off my doctored photos and videos. They’re running this business, profiting off of my face doing something that I didn’t consent to, like my suffering is your livelihood. It made me really mad, but again, there was nothing I could do.
Once, I was approached by a company taking deepfakes off the internet but their prices were exorbitant. Why should I be using my hard earned money to be paying you to privately take down these videos? I think that lawmakers and governments are extremely overwhelmed by the internet so they just let it go. If somebody’s making a deepfake in a different country, my country doesn’t care because there’s nothing they can do.
For me, justice would be not letting them be anonymous anymore. It’s much too easy to make yourself anonymous online where law enforcement doesn’t care enough to put in the effort to find out who’s doing it. https://www.globalcitizen.org/en/content/online-gender-based-violence-survivor-equality-now/
nov21
Some of them are helping supposed nude images of Greta Thunberg go viral. In the photos (which, for obvious reasons, we will not show here), she appears in a supposed full-frontal nude. There is also a supposed video in which she is exposing herself. Whether out of curiosity or anger, the fact is that the photos spread like wildfire on WhatsApp. What not everyone pointed out, however, is that the images are not of the Swedish activist Greta Thunberg. Of course, few have gone viral with political connotations as in Greta Thunberg's case. On a Reddit forum page, users even identified the model whose body was used for the montage: she goes by Sweet Pie. https://www.boatos.org/mundo/fotos-greta-thunberg-nua-vazam-internet.html + https://www.swissinfo.ch/spa/efe-verifica-greta-thunberg_no-es-greta-thunberg-bailando-desnuda--es-un--deepfake-/47144616
jan22
Back in November, Agencia Efe received a deepfake showing the 19-year-old activist Greta Thunberg dancing naked. The video was circulating above all in WhatsApp groups in Brazil and ridiculed her with the message: «She is trying out new tactics against climate change».
An internet search was all it took for Efe's colleagues to find the original video of the actress whose face had been replaced with Thunberg's. In this case, the Karisma expert adds, the pornographic deepfake seeks to «question whether her environmental activism is credible given that there is a video of her naked circulating» and to «push her message into the background, draining interest from the fight she is waging». https://efeminista.com/deepfakes-pornograficos-silenciar-mujeres/amp/
oct21
Martin tried contacting the police, private investigators and government agencies, but because she didn't know where the images originated there was no way to hold the creators accountable. Martin even attempted to contact the operators of the porn sites that hosted the pornographic photos of her, but those efforts sometimes led to more abuse. "Sometimes I'd get a response and they'd remove it, and then it'll pop up two weeks later," she said. "And then one time, one perpetrator said that they'd only remove the material if I sent them nude photos of myself within 24 hours." LAWS: According to Dodge, the legislative system has been slow to react to the threat women face from deepfakes. "In most states, non-consensual pornography is illegal," he explained. "However, if you create deepfake non-consensual pornography, those laws are not going to apply because it's not the victim's body being portrayed in the video. It's just their face. So the video won't meet the threshold to be prosecuted under that law." "Because I was speaking out about it, the perpetrators decided to create fake videos of me," Martin said. "You only stand to lose when you talk about something like this, because when you dare to speak about this kind of abuse, you expose yourself to more people seeing the very thing you don't want people to see." Despite attempts to silence her, Martin has remained an outspoken activist. She advocated for legal solutions that led Australia to make image-based abuse a criminal offense in 2017. But these years have taken a toll. https://www.cbsnews.com/news/deepfake-porn-woman-fights-online-abuse-cbsn-originals/
nov21
Enora Malagré, victim of deepfake: “The trauma is akin to sexual assault.” “You find yourself, in spite of yourself, at the heart of a pornographic film.” Enora Malagré, journalist and author, has been the victim of a deepfake (or hypertrucage). This process makes it possible to superimpose one video on another in order, for example, to change a person's face. The technique can be used to create malicious hoaxes and fake news, but also fake pornographic videos. Enora Malagré was unaware that she was the target of a deepfake. It was the journalists of the program “Complément d’études”, who produced an episode on cyberstalking, who informed her. “I was shocked, traumatized,” she comments. She describes the minutes that followed the discovery: “I started looking at the picture. In fact, after 30 to 40 seconds, it feels like it’s you anyway. You know that it is not your body, but you assimilate this body as being yours.” Quickly gripped by “these violent images”, she describes what she felt. “It is a trauma and a shock which, I believe, is in some ways akin to sexual assault,” she says. In France, Enora Malagré is the first to speak out about this disturbing phenomenon, which can affect a public figure as easily as a complete unknown. In the United States, Scarlett Johansson sounded the alarm in 2019. “The proliferation of deepfake videos is used to harass and humiliate women, whether or not they are in the public eye,” she said, before detailing the difficulties in initiating legal proceedings. https://news.in-24.com/lifestyle/news/170538.html + https://www.elle.fr/Societe/News/Enora-Malagre-victime-de-deepfake-Le-traumatisme-s-apparente-a-une-agression-sexuelle-3970580 + https://mrdeepfakes.com/celebrities/enora-malagre
nov21
countering it: LAWS
Are there effective legal approaches to trying to stem the spread?
There is quite a lot of action going on around the world right now to think about what we can do about deepfake image abuse in different countries. Also on the state and federal level in the US. There are many states introducing legislation to criminalize the use of nonconsensual fake pornography. The key thing there is it’s criminal. The state would prosecute the offender, the individual would not have to bring that case themselves and pay for that as a civil trial. In the UK, there’s a review going on into intimate image abuse laws, with deepfakes in the crosshairs. You’re seeing in South Korea a big push from fans of K-Pop girl groups, who are one of the biggest groups targeted. That’s one of the most surprising findings of my reporting back in 2019, which was that 25 percent of the victims were South Korean K-Pop singers. In South Korea, there’s a lot of social action trying to get this explicitly outlawed. There is still a good chance that if someone created something like this and was identified and reported to police, they could be charged with harassment or indecent communications, but we probably do need specific laws that acknowledge the specific harms that this new technology can cause. If you can identify who is creating this material on the internet, where anonymity is ubiquitous, chances are there may be recourse in the legal system. But identifying who they are is a big challenge. They may not be in the same jurisdiction, which makes it even harder. The law can do very little to stop the proliferation of the tools for people who really want to find them. https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124
nov21
COUNTERING IT
Do these sites ever truly disappear? Can their coding or technology really disappear once they’ve been shared?
It’s a great question and, unfortunately, it’s one where the answer isn’t so optimistic. This bot on Telegram that I discovered was a kind of a Frankenstein mutation of the same tool which was released in June of 2019. That tool went down because it got so much traffic after some reporting on it and people just cloned the software and it sprung up in many different forms. It’s easily accessible in many different forms. You can access it as the raw code, you can access it as a super user-friendly web tool, and you can access it as a website as well. The problem is you can’t regulate mathematics. People know how to replicate this now. In many cases, the software which is used to create these tools comes from perfectly legitimate uses of it and is being perverted by bad actors. Unfortunately, it is very difficult, near impossible, to ever really remove this stuff entirely. When one goes down, others spring up to take its place. There are things we could do to help, to drive it underground. Internet service providers could help, potentially. Responsive action from hosting services, for example. Making sure app stores are all aligned. Ultimately, if someone wants to find the techniques and tools, they’re gonna find them somewhere. We can make a difference by making it as hard to find as possible. Friction is a big thing. https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124
nov21
Is there any space for consensual deepfake porn? Are there creative consensual applications? In 2018, the porn company Naughty America announced that it would provide custom, consensual deepfakes for customers.
The question isn’t so much whether consensual deepfake pornography is possible. Of course, it’s possible. More pressing: Is it possible to create the tools for consensual deepfake pornography without them being inevitably misused in a way that causes more harm than good? That tool that I discovered framed itself as a way to put yourself into pornographic footage, but obviously it had no guardrails and I think it was disingenuous. I think they knew exactly what it was going to be used for. The Naughty America thing was a PR stunt, I think. Maybe there’s a way to have that service, but then do you need to have a know-your-customer style verification service? How do you confirm consent as having been granted? Who do you need the consent from—the performer, whose body you’re being swapped onto? If you want to scale that technology, is it possible to do that in a way where women or men are not going to be targeted and cause a lot of harm? I think it will be very hard, unless you’re doing a very bespoke service with contracts being signed and passports and video calls. There’s a lot of layers of security that will be needed to make sure that everything is OK.
There’s a really interesting question of whether making deepfake pornography without sharing it should be considered bad. Obviously, sharing this stuff, to many people, is the primary offense, right? But there’s a really interesting debate: Should there be a simple making offense? The idea is this: if you just make a piece of deepfake intimate imagery for your own consumption and you have no intention of sharing it, should that still be considered a criminal act? There’s a guy called Carl Öhman who is a philosopher of technology who coined a term for this: the pervert’s dilemma. One way of looking at it is saying, well, you’re trying to regulate and police fantasy. People fantasize about other people all the time. The other side of it says: By bringing into existence a piece of footage of that nature, even just in the act of making it, you’re violating someone’s dignity in a way. What’s more, you’re bringing into existence something that could do a great amount of harm. I definitely am inclined to fall into the latter camp, which is that, if not explicitly made criminal, it should certainly be highly discouraged and is ethically very dubious. https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124
nov21
RP concept: it may not involve pornography!!!
There’s a video of me attempting an armed robbery. The victim’s head-mounted GoPro camera recorded it. He was a cyclist, minding his business as he made his way down the street. Out of nowhere, the video shows me riding up on a motorcycle and pulling out a gun, forcing him to stop in his tracks; I demand he gives me his rucksack, but he claims he doesn’t understand what I’m saying. You may think: “I’m not famous, and no one is ever going to want to see me act in a porno.” But as my deepfake shows, that’s not the only damaging scenario in which you could be deepfaked. I have it in my power – as do you and everyone reading this – to download free deepfaking software right now and create something seriously damaging. What if I made a video of you kissing a stranger, and threatened to send it to your spouse or children? Wouldn’t you pay me some blackmail money for that not to happen? Especially if your spouse knows you’ve been unfaithful in the past? https://www.telegraph.co.uk/news/2021/10/30/deepfaked-committing-crime-should-worried/
Deepfakes first stepped onto the public stage via videos mapping celebrities’ faces onto porn performers’ bodies and have since impacted non-celebrities, sometimes as a doctored version of “revenge porn,” where, say, exes spread digitally created sex tapes that can be practically indecipherable from the real thing. https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124
CONCEPT:
JEZEBEL: You prefer to use the phrase “deepfake image abuse” instead of “deepfake porn.” Can you explain? HENRY AJDER: Deepfake pornography is the established phrase to refer to the use of face-swapping tools or algorithms that strip images of women to remove their clothes, and it comes from Reddit, where the term first emerged exclusively in this context of nonconsensual sexual image abuse. The term itself, deepfake pornography, seems to imply, as we think of with pornography, that there is a consensual element. It obscures that this is actually a crime, that this is a form of digital sexual harassment, and it doesn’t accurately reflect what is really going on, which is a form of image abuse.
==========================Class Oct21=======================
Oct21
Japanese police on Monday arrested a 43-year-old man for using artificial intelligence to effectively unblur pixelated porn videos, in the first criminal case in the country involving the exploitative use of the powerful technology. Masayuki Nakamoto, who runs his own website in the southern prefecture of Hyogo, lifted images of porn stars from Japanese adult videos and doctored them with the same method used to create realistic face swaps in deepfake videos. Nakamoto reportedly made about 11 million yen ($96,000) by selling over 10,000 manipulated videos, though he was arrested specifically for selling 10 fake photos at about 2,300 yen ($20) each. Nakamoto pleaded guilty to charges of copyright violation and displaying obscene images and said he did it for money, according to NHK. He was caught when police conducted a “cyber patrol,” the Japanese broadcaster reported. Photo-realistic images created using AI are increasingly common and have raised many legal and ethical questions concerning privacy, sexual exploitation, copyright, and artistic expression. “This is the first case in Japan where police have caught an AI user,” Daisuke Sueyoshi, a lawyer who’s tried cybercrime cases, told VICE World News. “At the moment, there’s no law criminalizing the use of AI to make such images.” https://www.vice.com/en/article/xgdq87/deepfakes-japan-arrest-japanese-porn On Monday (18th), the Kyoto Prefectural Police arrested a man who admitted to being responsible for creating deepfake pornography and selling more than 2,500 doctored video files. Masayuki Nakamoto violated Japan’s obscenity law, which prohibits the display of “indecent materials” showing genitals. Such videos are normally pixelated or blurred, and the accused reversed that pixelation. According to the Japanese newspaper Mainichi, this appears to be the first pornography-related arrest of its kind in the country.
Nakamoto is accused of violating the copyright of a Tokyo video production company, since he took copyrighted pixelated videos, sharpened them, and published them on his own website. He also used artificial intelligence (AI) to add images of genitals that were not present in the original footage. An image provided by the Kyoto Prefectural Police demonstrates the doctoring method performed with the TecoGAN artificial intelligence, the technology Nakamoto used in this case. https://mundoconectado.com.br/noticias/v/21110/deepfake-homem-e-preso-por-aplicar-genitais-falsos-em-filmes-porno + https://news.yahoo.com/japanese-man-first-suspect-ever-191405154.html The man used a tool called TecoGAN, which specializes in super-resolution (that is, improving the definition of images) and was trained on uncensored images. He reportedly sold more than 10,000 deepfakes on his website and earned 11 million yen (83,000 euros) in total. However, he is being prosecuted over only 10 of the images, sold at 2,300 yen (17 euros) each. The man pleaded guilty to infringing copyright and displaying obscene images, and said he did it for the money. https://www.futura-sciences.com/tech/actualites/intelligence-artificielle-pornographie-ia-permet-enlever-flou-images-censurees-94345/
THE ISSUE OF “deepfakes” has been widely discussed in Taiwan after YouTuber “Xiao Yu” was found to be responsible for a Telegram group used to sell, commission, and circulate “deepfake” porn videos of women, who were mostly public figures. Xiao Yu, whose real name is Zhu Yu-chen, was subsequently taken into custody. Xiao Yu is primarily known for humor videos, though some of his viral stunts have caused controversy in the past. The public response to the incident was large enough that no less than President Tsai Ing-wen would comment on the matter, drawing attention to the issue of deepfakes with a post on her Facebook. The Telegram group was previously detailed by an investigative report from Mirror Media that was originally released in May, which became widely circulated again after the arrest came to light. https://newbloommag.net/2021/10/19/deepfake-arrest/ A Taiwanese YouTuber suspected of creating and selling deepfake porn videos featuring more than 100 politicians and influencers was on Monday released on bail after being arrested the previous day. Chu Yu-chen (朱玉宸), 26, who uses the name Xiaoyu (小玉) on YouTube, was arrested on Sunday in New Taipei City, along with two suspected accomplices, a 24-year-old YouTuber surnamed Yeh (耶), known as Shaiw Shaiw (笑笑), and a 22-year-old man, Chuang (莊). A conviction for distributing obscene videos carries a maximum sentence of two years in prison, which can be converted to or supplemented by a NT$90,000 (US$3,222) fine, while a public insult conviction could result in a fine of up to NT$9,000. New Taipei City Deputy Chief Prosecutor Nieh Chung (聶眾) said that while the alleged crimes were serious, it was not necessary to detain the suspects. https://www.taipeitimes.com/News/front/archives/2021/10/20/2003766430
A petition posted on the Cheong Wa Dae website in January called for tough punishment for people creating and uploading deepfake porn videos of Korean celebrities and drew around 390,000 signatures.
But increasingly, people's ordinary friends and acquaintances have become victims. Police in North Jeolla Province arrested a man in his 20s who created deepfake porn videos of a woman he met on the Internet and uploaded them on porn websites. http://english.chosun.com/site/data/html_dir/2021/09/30/2021093000739.html
What Deepfake Technology Means for Women. https://www.wnycstudios.org/podcasts/takeaway/segments/what-deepfake-technology-means-women
Sep21
A horrifying new AI app swaps women into porn videos with a click. Deepfake researchers have long feared the day this would arrive. https://www.technologyreview.com/2021/09/13/1035449/ai-deepfake-app-face-swaps-women-into-porn/ + https://onezero.medium.com/deepfake-porn-when-tech-ruins-womens-lives-3ae99b2c4bed Ajder has seen a lot of disturbing things, but a few months ago, he came across something he’d never seen before. It was a site that allowed users to simply upload a photo of someone’s face and produce a high-fidelity pornographic video, seemingly a digital reproduction of that person. “That is really, really concerning,” he says. Ajder alerted Karen Hao, a journalist who covers artificial intelligence. Last month, she wrote about the site in the MIT Technology Review, bringing attention to the specter of free, easily created deepfake porn. “[T]he tag line boldly proclaims the purpose: turn anyone into a porn star by using deepfake technology to swap the person’s face into an adult video,” wrote Hao. “All it requires is the picture and the push of a button.” The next day, following an influx of media attention, the site was taken down without explanation. But the site’s existence shows how easy and accessible the technology has become. I found this website a few months ago and have monitored it and I reported it when I found the functionality was evolving and becoming increasingly accessible. It was the first of its kind that provided a library of footage already. All you had to do was upload a picture of someone’s face, choose the video from their pre-selected videos, press the button and it would generate the output. https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124
Sep21
AHMEDABAD: A 46-year-old businessman with a unit in Odhav recently got entrapped in a sextortion racket when he accepted a … request sent by a beautiful woman on social media. Within two …, they exchanged phone numbers and a video call was made by the woman. http://timesofindia.indiatimes.com/articleshow/86020397.cms?utm_source=contentofinterest&utm_medium=text&utm_campaign=cppst
Aug21
Launched in 2020, the site boasts that it developed its own “state of the art” deep-learning image translation algorithms to “nudify” female bodies, and that the technology is so powerful there is not a woman in the world, regardless of race or nationality, who is safe from being “nudified.” But it doesn’t work on men. When fed a photo of a cisgender man’s clothed body, the site gave him breasts and a vulva. https://www.huffpost.com/entry/deepfake-tool-nudify-women_n_6112d765e4b005ed49053822 Leading credit-card companies have come under fire over their alleged links to a popular website that allows users to digitally “undress” women and girls. Mastercard and Visa were listed last week as approved payment methods on a “nudifying” website widely used to create fake naked images of people without their consent. It promises to “make men’s dreams come true”. https://www.thetimes.co.uk/article/credit-card-giants-tied-deepfake-nudify-site-zjhzmrz3v
RP
The issue of deepfakes – convincingly real looking fake videos and pictures – is once again in the spotlight after the Ambernath police arrested a 28-year-old Malad resident for allegedly generating objectionable content of a 25-year-old woman who refused to marry him. Police said that the victim, who is married and pregnant, stays with her family in Ambernath. She approached the police on May 19 this year to complain about several objectionable videos and pictures of her that were circulated on social media through multiple accounts. The police said that the victim was in a panic as she had never posed for any of the pictures or videos, and the repeated emergence of such content on Facebook and Instagram was causing problems in her married life. The police registered a case against unknown persons and started investigating the matter. Details of all the pictures and videos were collected and, with the help of the Thane Police cyber cell, the Ambernath police sought Internet Protocol addresses and other data from Facebook and Instagram. The data, which was received after several days, included a cell phone number, and the police then obtained Call Detail Record and registration details of the number. https://www.hindustantimes.com/cities/others/28yrold-mumbai-man-held-for-circulating-deepfakes-on-social-media-101625514850866.html
A leading legal expert is warning of an "epidemic" of sexual abuse where images of people's faces are merged with pornography and made available online. Deepfake pornography is where computer technology is used to map the faces of celebrities and private citizens on to explicit sexual material. Prof Clare McGlynn said it made it much easier for perpetrators to abuse and harass women. https://www.bbc.com/news/uk-scotland-57254636
Deepfake porn is ruining women’s lives. Now the law may finally ban it
Speaking to BBC Radio 5 Live's Mobeen Azhar, Helen said she wanted to see the creation and distribution of these images made an offence.
There is a popular internet meme called “Rule 34”. It goes, “If it exists, there’s a porn of it.” There is no exception, it is said, to this “rule”. Not Pokémon, not Tetris blocks, not even unicorns. In 2016, Ogi Ogas, a computational neuroscientist at Harvard, published a study on whether the “rule” held up. It did—for the most part. The more obscure pornography could be difficult to access but “it’s out there if you want to find it”, he told The Washington Post. And if it isn’t, there’s the lesser-known “Rule 35”: “If there’s no porn of it, porn will be made of it.” A similar threat is playing out in India today. There are nearly a dozen websites hosting deepfake pornography featuring Indian women. Most of them are from the entertainment industry, including some of the best-known actors from films. India has banned over 800 adult websites since 2015 for allegedly hosting paedophile content but these deepfake videos are only a few clicks away. So far, there has been little discussion and no strategy on how to deal with these. https://lifestyle.livemint.com/news/big-story/deepfakes-when-seeing-is-not-believing-111609504596030.html
https://www.engineering.com/DesignSoftware/DesignSoftwareArticles/ArticleID/20902/When-Tech-Goes-Bad-Revenge-Porn-Explodes-with-Deepfakes.aspx
Much of the non-consensual pornography that has already been produced is targeted at celebrities, inserting them into professionally produced pornographic videos. Targets have included British and American actresses such as Emma Watson, Daisy Ridley and Kristen Bell. https://www.express.co.uk/news/world/1346071/Technology-news-deep-fakes-non-consensual-pornography-synthetic-media
Hayashida and Otsuki stand accused of creating and putting up deepfake videos online between December 2019 and July 2020, defaming the celebrities whose faces were used for the videos, and violating the copyright of the production companies that created the original adult videos. Police have built a case against Hayashida on suspicion of defamation of two celebrities and copyright violations against four production firms, while Otsuki is accused of the same charges in connection with two celebrities and three production companies. Both men have reportedly admitted to the allegations. According to the MPD's Safety Division, Hayashida made some 800,000 yen (approximately $7,600) by releasing the deepfake videos on a website he runs. He is quoted as telling investigators, "I wanted to make money," while Otsuki reportedly told police, "I published the videos to gain recognition from third parties." https://mainichi.jp/english/articles/20201002/p2a/00m/0na/027000c + Futamata has made ad revenue totaling more than 500,000 yen ($4,820) from the website, while Kubo has made more than 1 million yen, police said. http://www.asahi.com/ajw/articles/13944568
Jay-Z reciting Hamlet or Ocasio-Cortez criticizing socialism: voice simulation is expanding and poses challenges linked to intellectual property, as well as more creative content https://www.eldiario.es/tecnologia/audio-deepfakes-polemica-derechos-autor-reinvencion-entretenimiento_1_6083312.html
Hundreds of explicit deepfake videos featuring female celebrities, actresses and musicians are being uploaded to the world’s biggest pornography websites every month, new analysis shows. The non-consensual videos rack up millions of views and porn companies are still failing to remove them from their websites.
Up to 1,000 deepfake videos have been uploaded to porn sites every month as they became increasingly popular during 2020, figures from deepfake detection company Sensity show. The videos continue to break away from dedicated deepfake pornography communities and into the mainstream. Deepfake videos hosted on three of the biggest porn websites, XVideos, Xnxx, and xHamster, have been viewed millions of times. The videos are surrounded by adverts, helping to make money for the sites. XVideos and Xnxx, which are both owned by the same Czech holding company, are the number one and number three biggest porn websites in the world and rank in the top ten biggest sites across the entire web. They each have as many visitors as, or more than, Wikipedia, Amazon and Reddit. https://www.wired.co.uk/article/deepfake-porn-websites-videos-law
Last week, on a forum dedicated to cum tributes, people posted images of a 14-year-old YouTuber named Makenna and described in explicit detail the sexual acts they wanted to do to her. (Motherboard is not identifying the website to protect the privacy of the people whose images are still on it.) Makenna makes ASMR videos on her YouTube channel, which has more than 1.6 million subscribers. https://www.vice.com/en_us/article/4aywd3/tribute-porn-site-targeted-youtuber
A quarter of all deepfake pornography features K-pop stars. The ‘Nth room’ case shocked the public earlier this year when details of the inhumane crime were revealed through media reports and investigation results. Although the idea of digital sex crimes wasn’t new, the horrifying atrocities of what had been going on inside the anonymous chat rooms brought the issue to the surface like never before. Among the crimes are some lesser-known forms of digital sex crimes that had gone unnoticed by the public over the years but have come to light with the case. Through this two-part series, the Korea JoongAng Daily looks into the two types of crime that had not been properly addressed before: ‘friend fouling’ and deepfakes. https://koreajoongangdaily.joins.com/2020/05/17/features/deepfake-artificial-intelligence-pornography/20200517190700189.html
https://www.ibtimes.sg/deepfake-videos-new-threat-fight-misinformation-fake-news-regarding-coronavirus-42571
https://www.bizcommunity.com/Article/196/16/197251.html
https://www.thedrum.com/opinion/2020/01/10/what-do-deepfakes-mean-brands
May20
Risks of economic losses are more dangerous than political manipulation https://www.nojitter.com/security/real-business-impacts-deepfakes