Friday, March 20, 2020

Consequences (OTHER; revenge porn) WOMEN

apr24

Two of the biggest deepfake pornography websites have now started blocking people trying to access them from the United Kingdom. The move comes days after the UK government announced plans for a new law that will make creating nonconsensual deepfakes a criminal offense.

Nonconsensual deepfake pornography websites and apps that “strip” clothes off of photos have been growing at an alarming rate—causing untold harm to the thousands of women they are used to target.

Clare McGlynn, a professor of law at Durham University, says the move is a “hugely significant moment” in the fight against deepfake abuse. “This ends the easy access and the normalization of deepfake sexual abuse material,” McGlynn tells WIRED.

Since deepfake technology first emerged in December 2017, it has consistently been used to create nonconsensual sexual images of women—swapping their faces into pornographic videos or allowing new “nude” images to be generated. As the technology has improved and become easier to access, hundreds of websites and apps have been created. Most recently, schoolchildren have been caught creating nudes of classmates.

https://www.wired.com/story/the-biggest-deepfake-porn-website-is-now-blocked-in-the-uk/



jan24

The finding that 415,000 deepfake images were posted online last year was made by Genevieve Oh, a researcher who analyzed the top ten websites which host such content.

Oh also found 143,000 deepfake videos were uploaded in 2023 – more than during the previous six years combined. The videos, published across 40 different websites which host fake videos, were viewed more than 4.2 billion times.

https://www.dailymail.co.uk/news/article-13007753/deepfake-porn-laws-internet.html


jan24

Popular search engines like Google and Bing are making it easy to surface nonconsensual deepfake pornography by placing it at the top of search results, NBC News reported Thursday.

These controversial deepfakes superimpose faces of real women, often celebrities, onto the bodies of adult entertainers to make them appear to be engaging in real sex. Thanks in part to advances in generative AI, there is now a burgeoning black market for deepfake porn that could be discovered through a Google search, NBC News previously reported.

NBC News uncovered the problem by turning off safe search, then combining the names of 36 female celebrities with obvious search terms like "deepfakes," "deepfake porn," and "fake nudes." Bing generated links to deepfake videos in top results 35 times, while Google did so 34 times. Bing also surfaced "fake nude photos of former teen Disney Channel female actors" using images where actors appear to be underage.

A Google spokesperson told NBC that the tech giant understands "how distressing this content can be for people affected by it" and is "actively working to bring more protections to Search."
https://arstechnica.com/tech-policy/2024/01/report-deepfake-porn-consistently-found-atop-google-bing-search-results/



dec23

BOSSIER PARISH, La. (WVUE) - A 32-year-old Louisiana man arrested on child pornography allegations is now the first in the state to face charges under a new law aimed at safeguarding individuals from the misuse of deepfake technology.

Deepfake technology uses artificial intelligence to create highly realistic photos and videos. Deepfakes are becoming more realistic, easier to access, and have added to an era of disinformation.

According to the Bossier Parish Sheriff’s Office, Rafael Valentine Jordan was arrested on Lylac Lane, less than half of a mile from Bellaire Elementary School in Bossier, on Nov. 17 and booked into jail on one count of juvenile pornography.

During the investigation, authorities say they discovered 436 images of child pornography created using deepfake technology. On Dec. 1, a second count of juvenile pornography was added, along with two counts of unlawful deepfake creation.

https://www.wafb.com/2023/12/27/bossier-man-jailed-child-porn-also-states-first-face-new-deepfake-law/


dec23

'Violating and dehumanising': How AI deepfakes are being used to target women

https://news.yahoo.com/violating-dehumanising-ai-deepfakes-being-110449132.html


dec23

Over 24 million people visit websites that let them use AI to undress women in photos: Study

Graphika, a social network analysis company, revealed that a whopping 24 million people visited these undressing websites in September alone, highlighting a troubling surge in non-consensual pornography driven by advancements in artificial intelligence.

https://www.indiatoday.in/technology/news/story/over-24-million-people-visit-websites-that-let-them-use-ai-to-undress-women-in-photos-study-2473662-2023-12-08


oct23

Opinion: The rise of deepfake pornography is devastating for women

https://edition.cnn.com/2023/10/29/opinions/deepfake-pornography-thriving-business-compton-hamlyn/index.html


oct23

Deepfake Porn Is Out of Control
New research shows the number of deepfake videos is skyrocketing—and the world's biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.
A new analysis of nonconsensual deepfake porn videos, conducted by an independent researcher and shared with WIRED, shows how pervasive the videos have become. At least 244,625 videos have been uploaded to the top 35 websites set up either exclusively or partially to host deepfake porn videos in the past seven years, according to the researcher, who requested anonymity to avoid being targeted online.

https://www.wired.com/story/deepfake-porn-is-out-of-control/



jul23

In an essay for The Atlantic last month, Nina Jankowicz wrote about what it was like to discover that she'd been deepfaked into pornographic material.

"Recently, a Google Alert informed me that I am the subject of deepfake pornography. I wasn't shocked," wrote Jankowicz, the former executive director for the United States' since-disbanded Disinformation Governance Board. "The only emotion I felt as I informed my lawyers about the latest violation of my privacy was a profound disappointment in the technology — and in the lawmakers and regulators who have offered no justice to people who appear in porn clips without their consent."
https://futurism.com/google-nonconsensual-deepfake-porn


apr23

Noelle Martin was just 18 when she discovered that pornographic pictures of her were being circulated online. She never recalled taking, let alone sharing, intimate images. Yet that was her face in those images; the body, however, wasn’t hers.

She became a victim of what would later be known as deepfakes. Pornographic pictures had been manipulated to look like her by using images she had shared on her personal social media accounts.

“This is a lifelong sentence,” Martin told Euronews Next. “It can destroy people's lives, livelihoods, employability, interpersonal relationships, romantic relationships. And there is very, very little that can be done once someone is targeted”.
https://www.euronews.com/next/2023/04/22/a-lifelong-sentence-the-women-trapped-in-a-deepfake-porn-hell

apr23

A 22-year-old Long Island man has been sentenced to six months in jail and must register as a sex offender for taking photos from the social media accounts of nearly a dozen women when they were in high school and middle school, altering them to make them sexually explicit, and then posting them on a porn website for years, prosecutors say.

Patrick Carey, who was posting the fake images up to within hours of his 2021 arrest, also shared the women's personal identifying information, including full names, phone numbers and addresses -- and encouraged other users on the porn site to harass and threaten them with violence, according to court documents.

Carey pleaded guilty in December to multiple felonies in the deepfake scheme, including promoting a sexual performance by a child, aggravated harassment as a hate crime and stalking.

At Tuesday's sentencing, the Seaford man was ordered to stay away from each of the 11 victims -- a judge issued orders of protection lasting the statutory maximum of eight years each. He will also be subject to 10 years of probation on top of the jail time and sex offender requirements.

Carey didn't address the media after the hearing, but said in court "I can’t chalk up my awful behavior to being a young dumb kid." He also apologized to the victims and their families, though added he didn't expect forgiveness.
https://www.nbcnewyork.com/news/local/crime-and-courts/long-island-man-jailed-in-deepfake-sex-scheme-targeting-14-women-from-his-high-school/4251661/

apr23

A gaming YouTuber has said she had to pay hundreds of pounds to remove porn which used deepfake technology to make it look like she was involved.

Sunpi has made a name for herself on YouTube by creating content about video games and gaming culture, with posts showing her travelling to play games, sharing her reactions to new titles and showing off her gaming setup.

She's earned more than 117,000 subscribers through her content, but last month she was made aware of pornographic deepfake content which used her image.

https://www.unilad.com/technology/sunpi-youtuber-forced-pay-remove-pornographic-deepfakes-161808-20230410

mar23

Found through Google, bought with Visa and Mastercard: Inside the deepfake porn economy.


The nonconsensual deepfake economy has remained largely out of sight, but it's easily accessible, and some creators can accept major credit cards
https://www.nbcnews.com/tech/internet/deepfake-porn-ai-mr-deep-fake-economy-google-visa-mastercard-download-rcna75071


mar23

Making Deepfakes Gets Cheaper and Easier Thanks to A.I.

Meme-makers and misinformation peddlers are embracing artificial intelligence tools to create convincing fake videos on the cheap.
https://www.nytimes.com/2023/03/12/technology/deepfakes-cheapfakes-videos-ai.html

Artificial intelligence tools are being used by meme-makers and misinformation peddlers to create convincing fake videos on social media. These videos, often called deepfakes, were once created using elaborate software, but now many of the tools to create them are available to everyday consumers, even on smartphone apps, and often for little to no money. The new altered videos, mostly the work of meme-makers and marketers, have gone viral on social media sites like TikTok and Twitter. The content they produce, sometimes called cheapfakes by researchers, works by cloning celebrity voices, altering mouth movements to match alternative audio, and writing persuasive dialogue.
https://www.globalvillagespace.com/tech/a-i-makes-deepfake-creation-cheaper-and-easier/


mar23

Hundreds of sexual deepfake ads using Emma Watson’s face ran on Facebook and Instagram in the last two days

A deepfake app advertised itself on Meta platforms using faces of actors Watson and Scarlett Johansson.

https://www.nbcnews.com/tech/social-media/emma-watson-deep-fake-scarlett-johansson-face-swap-app-rcna73624

feb23

Amid the fallout, the Twitch streamer “Sweet Anita” realized deepfake depictions of her in pornographic videos exist online.

“It’s very, very surreal to watch yourself do something you’ve never done,” Twitch streamer “Sweet Anita” told CNN after realizing last week her face had been inserted into pornographic videos without her consent.

“It’s kind of like if you watched anything shocking happening to yourself. Like, if you watched a video of yourself being murdered, or a video of yourself jumping off a cliff,” she said.

https://edition.cnn.com/2023/02/16/tech/nonconsensual-deepfake-porn/index.html

jan23

Existing and proposed laws will fail to protect EU citizens from nonconsensual pornographic deepfakes—AI-generated images, audio, or videos that use an individual’s likeness to create pornographic material without the individual’s consent. Policymakers should amend current legislative proposals to better protect victims and, in the meantime, encourage soft law approaches. 

Although deepfakes can have legitimate commercial uses (for instance, in film or gaming), 96 percent of deepfake videos found online are nonconsensual pornography. Perpetrators superimpose the likeness of an individual—most often an actor or musician, and almost always a woman—onto sexual material without permission. Sometimes perpetrators share these deepfakes for purely lewd purposes, while other times it is to harass, extort, offend, defame, or embarrass individuals. With the increasing availability of AI tools, it has become easier to create and distribute deepfake nonconsensual pornography. 

There are no specific laws protecting victims of nonconsensual deepfake pornography, and new proposals will fall short.

https://datainnovation.org/2023/01/eu-proposals-will-fail-to-curb-nonconsensual-deepfake-porn/

jan23

Designed to abuse? Deepfakes and the non-consensual diffusion of intimate images

Abstract

The illicit diffusion of intimate photographs or videos intended for private use is a troubling phenomenon known as the diffusion of Non-Consensual Intimate Images (NCII). Recently, it has been feared that the spread of deepfake technology, which allows users to fabricate fake intimate images or videos that are indistinguishable from genuine ones, may dramatically extend the scope of NCII. In the present essay, we counter this pessimistic view, arguing for qualified optimism instead. We hypothesize that the growing diffusion of deepfakes will end up disrupting the status that makes our visual experience of photographic images and videos epistemically and affectively special; and that once divested of this status, NCII will lose much of their allure in the eye of the perpetrators, probably resulting in diminished diffusion. We conclude by offering some caveats and drawing some implications to better understand, and ultimately better counter, this phenomenon.

https://link.springer.com/article/10.1007/s11229-022-04012-2



nov22
child pornography

A Lehi man was arrested Tuesday and accused of making "deep fakes" of pornography that included the faces of children placed on adult bodies.

Jesse John Campbell, 44, was booked into the Utah County Jail by agents from the Internet Crimes Against Children task force for investigation of 10 counts of sexual exploitation of a minor, sexual abuse of a minor and lewdness involving a child.

The investigation began in early November when a family member allegedly discovered a video on Campbell's phone. The pornographic video included an adult woman's body with the head of a teen girl superimposed on it, according to a police booking affidavit.

"According to witnesses, Jesse is known to use 'deep fake' software to alter videos. In the video regarding the victim in this investigation, Jesse used the child victim's face and replaced it on another female's body who is engaged in sexually explicit conduct," the affidavit states.

ksl.com/article/50519297/lehi-man-arrested-in-deep-fakes-child-pornography-investigation-



sep22
CASE


One quiet winter afternoon, while her son was at nursery, 36-year-old Helen Mort, a poet and writer from South Yorkshire, was surprised when the doorbell rang. It was the middle of a lockdown; she wasn’t expecting visitors or parcels. When Helen opened the door, there stood a male acquaintance – looking worried. “I thought someone had died,” she explains. But what came next was news she could never have anticipated. He asked to come in.

“I was on a porn website earlier and I saw… pictures of you on there,” the man said solemnly, as they sat down. “And it looks as though they’ve been online for years. Your name is listed, too.”

Initially, she was confused; the words ‘revenge porn’ (when naked pictures or videos are shared without consent) sprang to mind. But Helen had never taken a naked photo before, let alone sent one to another person who’d be callous enough to leak it. So, surely, there was no possible way it could be her?

“That was the day I learned what a ‘deepfake’ is,” Helen tells me. One of her misappropriated images had been taken while she was pregnant. In another, somebody had even added her tattoo to the body her face had been grafted onto.

Despite the images being fake, that didn’t lessen the profound impact their existence had on Helen’s life. “Your initial response is of shame and fear. I didn't want to leave the house. I remember walking down the street, not able to meet anyone’s eyes, convinced everyone had seen it. You feel very, very exposed. The anger hadn't kicked in yet.”

Nobody was ever caught. Helen was left to wrestle with the aftereffects alone. “I retreated into myself for months. I’m still on a higher dose of antidepressants than I was before it all happened.” After she reported what had happened to the police, who were initially supportive, her case was dropped. The anonymous person who created the deepfake porn had never messaged her directly, removing any possible grounds for harassment or intention to cause distress.
https://www.cosmopolitan.com/uk/reports/a41534567/what-are-deepfakes/


nov22

Arms folded and eyes smiling, I look completely at ease commanding a boardroom, but there’s one major issue with the photo that’s flashing up on the computer screen before me — I am completely naked.

In less than 15 seconds I have been digitally undressed by an easily accessible AI tool and I am horrified at the computer-generated clone staring back at me.

Welcome to the seedy underworld of deepfake porn — technology that uses ever-improving artificial intelligence to generate entirely realistic nude images of women and insert their likenesses into hardcore pornographic videos.
https://www.sundayworld.com/crime/special-investigations/denise-smith-it-took-just-seconds-to-generate-a-sleazy-deepfake-porn-image-of-me-online/1705842918.html

nov22
what is the motivation?

(NETHERLANDS)
The police arrested a 38-year-old man from Amersfoort for making a deepfake porn video of TV presenter and journalist Welmoed Sijtsma. The police questioned the man and then released him from custody, but he remains a suspect, the Amsterdam Public Prosecution Service (OM) confirmed to AD.

According to the OM, the man used footage of the TV presenter to create a pornographic video using deepfake technology. He placed Sijtsma’s head on the body of a porn actress.

Sijtsma, 32, became aware of the fake porn video of her circulating online last year. She decided to make a four-part docuseries about it for broadcaster WNL. The last episode of Welmoed en de sexfakes will air on Thursday evening.
https://nltimes.nl/2022/11/10/amersfoort-man-arrested-making-deepfake-porn-tv-host-welmoed-sijtsma

oct22

Mr Deepfakes can make you a porn star

His website will hijack your identity

https://unherd.com/2022/10/mr-deepfakes-can-make-you-a-porn-star/

oct22

Scrolling through her Twitter feed one evening, Kate Isaacs stumbled across a disturbing video among her notifications.

"This panic just washed over me," Kate says, speaking publicly for the first time about what happened. "Someone had taken my face, put it on to a porn video, and made it look like it was me."

Kate had been deepfaked. Someone had used artificial intelligence to digitally manipulate her face onto someone else's - in this case a porn actress.

The deepfake video on Twitter - with Kate, who campaigns against non-consensual porn, tagged - had been made using footage from TV interviews she had given while campaigning. It appeared to show her having sex.

"My heart sank. I couldn't think clearly," she says. "I remember just feeling like this video was going to go everywhere - it was horrendous." 

Kate, 30, founded the #NotYourPorn campaign in 2019. A year later, her activism contributed to the adult entertainment website, Pornhub, having to take down all videos uploaded to the site by unverified users - the majority of its content.

https://www.bbc.com/news/uk-62821117


sep22

Mr. Deepfakes

Fast forward to today, and a leading site specifically created to house deepfake celebrity porn sees over 13 million hits every month (that’s more than double the population of Scotland). It displays performative rules claiming not to allow requests for ‘normal’ people to be deepfaked, but the chatrooms are still full of guidance on how to DIY the tech yourself and of people taking custom requests. Disturbingly, the most commonly deepfaked celebrities are ones who all found fame at a young age, which raises another stomach-twisting question: when talking about deepfakes, are we also talking about the creation of child pornography?

It was through chatrooms like this that I discovered the £5 bot that created the scarily realistic nude of myself. You can send a photograph of anyone, ideally in a bikini or underwear, and it’ll ‘nudify’ it in minutes. The freebie version of the bot is not all that realistic: nipples appear on arms, lines wobble. But the paid-for version is often uncomfortably accurate. The bot has been so well trained to strip down the female body that when I sent across a photo of my boyfriend (with his consent), it superimposed an unnervingly realistic vulva.

https://www.cosmopolitan.com/uk/reports/a41534567/what-are-deepfakes/


aug22

Deep Fakes Are Becoming More Harmful for Women

AI technology is now so sophisticated that you can't always believe your eyes.

https://www.psychologytoday.com/us/blog/womans-place/202208/deep-fakes-are-becoming-more-harmful-women

may22
The company behind the Tom Cruise deepfake may have ties to pornography...

Metaphysic, the company responsible for the viral, highly convincing deepfake video of actor Tom Cruise, recently began circulating a campaign promoting the "ethical" use of artificial intelligence. The great hypocrisy is that one of Metaphysic's founders, Chris Umé, is allegedly linked to a collective of developers that makes pornography using the faces of famous and anonymous people, without their consent, of course. The story was published by Motherboard, from the Canadian magazine Vice.
Besides Umé, other names that appear as regular DFL collaborators include the Russian Ivan Perov and members of Mr. Deepfakes, the largest nonconsensual pornography site on the internet, which even monetizes these videos.
https://www.uol.com.br/tilt/noticias/redacao/2022/05/22/empresa-por-tras-do-deepfake-de-tom-cruise-pode-ter-ligacao-com-pornografia.htm

apr22
According to a report posted on Yahoo.com, the hackers asked the Singaporean man to pay a ransom of $5,800. If he didn’t, they threatened, they would send everyone on his contact list a “deepfake” pornographic video of him.
And, if you are wondering just what a “deepfake” is, it is often when someone takes the image of someone’s face and uses artificial intelligence to superimpose it on someone else’s body. And, if you are thinking that those types of videos would be easy to spot as fakes, well, in 2022 the folks who do these types of things have gotten pretty darned good at it. So good, in fact, that it can often be hard to tell what is real and what isn’t.
https://www.pennlive.com/crime/2022/04/pay-up-or-end-up-in-a-deepfake-pornographic-video-report-reveals-new-hacker-ransom-scheme.html


apr22
Pornographic deepfake: "My body is not an object that can be used without my permission"
https://www.marieclaire.fr/deepfake-pornographique-mon-corps-n-est-pas-un-objet-dont-on-peut-se-servir-sans-ma-permission,1425880.asp 

jan22
neither celebrities nor RP???; Sextortionists


AHMEDABAD: Sextortionists are now turning to AI (artificial intelligence) to up their blackmailing game. Their latest weapon is deepfake videos — digitally altered footage of real people showing stuff they never did. The worst victims are, of course, women.
Top sources say that online sextortionists from Mewat in Haryana, who earlier used screenshots of their male victims in compromising positions on WhatsApp chats to blackmail them, have gotten a tad creative.
They now use deepfake apps to create porn clips starring television actresses to target moneybags.
Using AI, the starlet’s face is superimposed on the face of the woman in the original clip to create a deepfake porn clip which is used to lure gullible men. “The end product will look like the celebrity was part of the porn clip,” said a police official.
Recently, a city lawyer, who was contemplating suicide after losing Rs 3 lakh to sextortionists, had called up a suicide prevention helpline. That is when the police learnt about the new modus operandi. The offenders threatened to make his footage public.
The police are yet to register a case and are trying to track the criminals’ digital fingerprints. According to sources, the offenders first collect information about their ‘prey’ from their public social media profiles.
https://timesofindia.indiatimes.com/city/ahmedabad/now-deepfake-porn-clips-starring-tv-actresses/articleshow/88983601.cms


dec21
From slut-shaming to “revenge porn”, from harassment to the emerging concern of deepfake pornography, the internet can be a distinctly hostile place for women. While much problematic online activity mimics abuses perpetrated offline, others—like deepfakes, for example—have been made possible because of the technology. In this chapter I examine online abuse as an artefact that is richly revealing about our culture. I begin with a discussion of abuse as artefact. I then explore women’s sexual objectification as a central undercurrent of their abuse, and I use the objectification lens to examine some specific forms of abuse disproportionately directed at women—from slut-shaming to image-based sexual abuse—to investigate what online behaviour reveals about sexual politics.
https://link.springer.com/chapter/10.1007/978-3-030-83734-1_8

dec21
(motivations: misogyny; trolling?)
According to the indictment, roughly 11 women contacted Nassau County detectives between January of this year and September to report they had found images of themselves on a pornographic website. Many of the women indicated that the images, taken when they were in high school and middle school, were re-posted on the website from their own social media platforms and altered to suggest they were engaging in sexual conduct.
Carey allegedly altered the images using "deepfake" technology, convincingly superimposing the victims' faces on other, separate images of women engaging in sexual conduct. The images posted to the website were accompanied by personal identifying information, including full names, addresses and telephone numbers.

"This defendant allegedly manipulated the photos of more than a dozen women, taken when they were teenagers, and posted the ‘deepfake’ images online for strangers’ sexual gratification," Acting Nassau County District Attorney Joyce Smith said in a statement. "His depravity deepened when he allegedly shared the victims’ personal identifying information – including their home addresses – encouraging site visitors to harass and threaten the women with sexual violence.Prosecutors say there may potentially be other dozens of other victims and encourage anyone who feels they may have been victimized by Carey to call the Nassau County district attorney's office at 516-571-2553. "These images are illicit and weren't consensually sent in. The people posting them are posting them without permission and that is part of the draw of this website," said Senior Investigative Counsel and Assistant District Attorney Melissa Scannell. "There are probably 50 women he did this to, so we think there are more people out there."

https://www.nbcnewyork.com/news/local/crime-and-courts/ny-man-indicted-in-depraved-deepfake-online-sex-scheme-targeting-at-least-14-women/3454461/


dec21
THE PROBLEM that justifies this work

A researcher named Ajder, who worked on a deepfake report for Sensity, told the Japan Times: "The vast, vast majority of harm caused by deepfakes right now is a form of gendered digital violence."

They added that they worked on a study which indicated millions of women had been targeted by deepfake porn.

 https://www.thesun.co.uk/tech/17101334/deepfake-revenge-porn-websites-apps-exposed-study/ 


dec21

An Australian woman whose life was "shattered" by deepfake porn says proposed law changes to address the attacks in New Zealand will help empower fellow victims. Labour MP Louisa Wall is fighting to ensure victims of the attacks have the same recourse under the Harmful Digital Communications Act as other survivors of online abuse. (...) Noelle Martin, 27, was only a teen when her world was "shattered" after she discovered her image had been used in deepfake porn without her consent. Wall said there's no criminal pathway under the Harmful Digital Communications Act for victims to hold those responsible to account. The MP said she found out about the issue after she had proposed an amendment to the law earlier this year which would explicitly make the posting of intimate images and recordings without consent illegal.
https://www.nzherald.co.nz/nz/calls-from-mps-and-survivor-for-protections-for-deepfake-porn-victims/SXIQAOQSEVEO4X6S2AKZEE242A/


nov21

Gibi is a YouTuber with around 3.8 million subscribers on her ASMR-focused YouTube channel. She has experienced abuse by people using “deepfakes”: AI-generated images of someone’s likeness.

“My deepfakes have been around ever since I started my YouTube channel. I’ve seen how it has gotten very good so that makes me extremely nervous because I know how fast technology can advance. 

When I first saw a deepfake, I was reading about how the computer learns and gets better at matching your face and putting it onto something pornographic. Watching the videos is very surreal — people believe it’s real. The thing that bothers me is I did not consent for my image to be used that way, they are able to do it with no consequences and it feels very violating. I contemplated deleting my channel because I felt very overwhelmed. 

It’s something that I just keep working through and I do my best to protect my privacy. Do I ever feel safe? Not really!

I used to keep tabs on the deepfakes until it felt useless, if you let it consume you it’s going to waste your time and that’s not what I want. Sometimes people will email them to me, like “Gibi, somebody made porn of you!” I even saw that somebody was doing commissions, making money off my doctored photos and videos. They’re running this business, profiting off of my face doing something that I didn’t consent to, like my suffering is your livelihood. It made me really mad, but again, there was nothing I could do. 

Once, I was approached by a company taking deepfakes off the internet but their prices were exorbitant. Why should I be using my hard earned money to be paying you to privately take down these videos? I think that lawmakers and governments are extremely overwhelmed by the internet so they just let it go. If somebody’s making a deepfake in a different country, my country doesn’t care because there’s nothing they can do. 

For me, justice would be not letting them be anonymous anymore. It’s much too easy to make yourself anonymous online where law enforcement doesn’t care enough to put in the effort to find out who’s doing it.
https://www.globalcitizen.org/en/content/online-gender-based-violence-survivor-equality-now/

nov21

Some of them are helping supposed nude images of Greta Thunberg go viral. In the photos (which, for obvious reasons, we will not show here), the young woman appears in a supposed frontal nude. There is also a supposed video in which she is exhibiting herself. Whether out of curiosity or anger, the fact is that the photos spread like wildfire on WhatsApp. What not everyone pointed out, however, is that the images are not of the Swedish activist Greta Thunberg. Of course, few such fakes have gone viral with political connotations as in Greta Thunberg's case. On a Reddit forum page, internet users even identified the model whose body was used for the montage: she goes by Sweet Pie.
https://www.boatos.org/mundo/fotos-greta-thunberg-nua-vazam-internet.html + https://www.swissinfo.ch/spa/efe-verifica-greta-thunberg_no-es-greta-thunberg-bailando-desnuda--es-un--deepfake-/47144616

jan22

Back in November, the Efe news agency received a deepfake showing the 19-year-old activist Greta Thunberg dancing naked. The video was circulating mostly in WhatsApp groups in Brazil, ridiculing her with the message: "She is trying new tactics against climate change."

A simple internet search was enough for Efe's staff to find the original video of the actress whose face had been replaced with Thunberg's. In this case, the Karisma expert adds, the pornographic deepfake seeks to "question whether her environmental activism is credible, given that there is a video circulating of her naked" and to "push her message into the background, taking interest away from the fight she is waging."
https://efeminista.com/deepfakes-pornograficos-silenciar-mujeres/amp/



oct21

Martin tried contacting the police, private investigators and government agencies, but because she didn't know where the images originated there was no way to hold the creators accountable. Martin even attempted to contact the operators of the porn sites that hosted the pornographic photos of her, but those efforts sometimes led to more abuse. "Sometimes I'd get a response and they'd remove it, and then it'll pop up two weeks later," she said. "And then one time, one perpetrator said that they'd only remove the material if I sent them nude photos of myself within 24 hours."

LAWS: According to Dodge, the legislative system has been slow to react to the threat women face from deepfakes. "In most states, non-consensual pornography is illegal," he explained. "However, if you create deepfake non-consensual pornography, those laws are not going to apply because it's not the victim's body being portrayed in the video. It's just their face. So the video won't meet the threshold to be prosecuted under that law."

"Because I was speaking out about it, the perpetrators decided to create fake videos of me," Martin said. "You only stand to lose when you talk about something like this, because when you dare to speak about this kind of abuse, you expose yourself to more people seeing the very thing you don't want people to see." Despite attempts to silence her, Martin has remained an outspoken activist. She advocated for legal solutions that led Australia to make image-based abuse a criminal offense in 2017. But these years have taken a toll.
https://www.cbsnews.com/news/deepfake-porn-woman-fights-online-abuse-cbsn-originals/

nov21
Enora Malagré, victim of a deepfake: “The trauma is akin to sexual assault.” “You find yourself, in spite of yourself, at the heart of a pornographic film.” Enora Malagré, journalist and author, has been the victim of a deepfake (or “hypertrucage”), a process that makes it possible to superimpose one video onto another in order, for example, to change a person’s face. The technique can be used to create malicious hoaxes and fake news, but also fake pornographic videos.

Enora Malagré was unaware that she was the target of a deepfake. It was the journalists of the program “Complément d’études,” who produced an episode on cyberstalking, who informed her. “I was shocked, traumatized,” she comments. She describes the minutes that followed this discovery: “I started looking at the picture. In fact, after 30 to 40 seconds, it feels like it’s you anyway. You know that it is not your body, but you come to see this body as your own.” Quickly gripped by “these violent images,” she describes what she felt: “It is a trauma and a shock which, I believe, is akin to sexual assault.”

In France, Enora Malagré is the first to speak out about this disturbing phenomenon, which can affect a public figure just as easily as a complete unknown. In the United States, Scarlett Johansson sounded the alarm in 2019. “The proliferation of deepfake videos is used to harass and humiliate women, whether or not they are in the public eye,” she said, before detailing the difficulties of initiating legal proceedings.
https://news.in-24.com/lifestyle/news/170538.html + https://www.elle.fr/Societe/News/Enora-Malagre-victime-de-deepfake-Le-traumatisme-s-apparente-a-une-agression-sexuelle-3970580 + https://mrdeepfakes.com/celebrities/enora-malagre

nov21

COUNTERING: LAWS

Are there effective legal approaches to trying to stem the spread?

There is quite a lot of action going on around the world right now to think about what we can do about deepfake image abuse in different countries, also on the state and federal level in the US. There are many states introducing legislation to criminalize the use of nonconsensual fake pornography. The key thing there is it’s criminal. The state would prosecute the offender; the individual would not have to bring that case themselves and pay for it as a civil trial.

In the UK, there’s a review going on into intimate image abuse laws, with deepfakes in the crosshairs. You’re seeing in South Korea a big push from fans of K-pop girl groups, who are one of the biggest groups targeted. That’s one of the most surprising findings of my reporting back in 2019: 25 percent of the victims were South Korean K-pop singers. In South Korea, there’s a lot of social action trying to get this explicitly outlawed.

There is still a good chance that if someone created something like this and was identified and reported to police, they could be charged with harassment or indecent communications, but we probably do need specific laws that acknowledge the specific harms that this new technology can cause. If you can identify who is creating this material on the internet, where anonymity is ubiquitous, chances are there may be recourse in the legal system. But identifying who they are is a big challenge. They may not be in the same jurisdiction, which makes it even harder. The law can do very little to stop the proliferation of the tools for people who really want to find them.
https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124


nov21

COUNTERING

Do these sites ever truly disappear? Can their coding or technology really disappear once they’ve been shared?

It’s a great question and, unfortunately, it’s one where the answer isn’t so optimistic. This bot on Telegram that I discovered was a kind of Frankenstein mutation of the same tool which was released in June of 2019. That tool went down because it got so much traffic after some reporting on it, and people just cloned the software and it sprung up in many different forms. It’s easily accessible in many different forms. You can access it as the raw code, you can access it as a super user-friendly web tool, and you can access it as a website as well.

The problem is you can’t regulate mathematics. People know how to replicate this now. In many cases, the software which is used to create these tools comes from perfectly legitimate uses of it and is being perverted by bad actors. Unfortunately, it is very difficult, near impossible, to ever really remove this stuff entirely. When one goes down, others spring up to take its place.

There are things we could do to help, to drive it underground. Internet service providers could help, potentially. Responsive action from hosting services, for example. Making sure app stores are all aligned. Ultimately, if someone wants to find the techniques and tools, they’re gonna find them somewhere. We can make a difference by making it as hard to find as possible. Friction is a big thing.
https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124


nov21

Is there any space for consensual deepfake porn? Are there creative consensual applications? In 2018, the porn company Naughty America announced that it would provide custom, consensual deepfakes for customers.

The question isn’t so much whether consensual deepfake pornography is possible. Of course, it’s possible. More pressing: Is it possible to create the tools for consensual deepfake pornography without them being inevitably misused in a way that causes more harm than good? That tool that I discovered framed itself as a way to put yourself into pornographic footage, but obviously it had no guardrails and I think it was disingenuous. I think they knew exactly what it was going to be used for. The Naughty America thing was a PR stunt, I think. Maybe there’s a way to have that service, but then do you need to have a know-your-customer style verification service? How do you confirm consent as having been granted? Who do you need the consent from—the performer, whose body you’re being swapped onto? If you want to scale that technology, is it possible to do that in a way where women or men are not going to be targeted and cause a lot of harm? I think it will be very hard, unless you’re doing a very bespoke service with contracts being signed and passports and video calls. There’s a lot of layers of security that will be needed to make sure that everything is OK.

There’s a really interesting question of whether making deepfake pornography without sharing it should be considered bad. Obviously, sharing this stuff, to many people, is the primary offense, right? But there’s a really interesting debate: Should there be a simple making offense? This is the idea that if you just make a piece of deepfake intimate imagery for your own consumption and you have no intention of sharing it, should that still be considered a criminal act?

There’s a guy called Carl Öhman, a philosopher of technology, who coined a term for this: the pervert’s dilemma. One way of looking at it is saying, well, you’re trying to regulate and police fantasy. People fantasize about other people all the time. The other side of it says: By bringing into existence a piece of footage of that nature, even just in the act of making it, you’re violating someone’s dignity in a way. What’s more, you’re bringing into existence something that could do a great amount of harm. I definitely am inclined to fall into the latter camp, which is that, if not explicitly made criminal, it should certainly be highly discouraged and is ethically very dubious.
https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124

nov21

concept of RP: it may not involve pornography!!!

There’s a video of me attempting an armed robbery. The victim’s head-mounted GoPro camera recorded it. He was a cyclist, minding his business as he made his way down the street. Out of nowhere, the video shows me riding up on a motorcycle and pulling out a gun, forcing him to stop in his tracks; I demand he give me his rucksack, but he claims he doesn’t understand what I’m saying.

You may think: “I’m not famous, and no one is ever going to want to see me act in a porno.” But as my deepfake shows, that’s not the only damaging scenario in which you could be deepfaked. I have it in my power – as do you and everyone reading this – to download free deepfaking software right now and create something seriously damaging. What if I made a video of you kissing a stranger, and threatened to send it to your spouse or children? Wouldn’t you pay me some blackmail money for that not to happen? Especially if your spouse knows you’ve been unfaithful in the past?
https://www.telegraph.co.uk/news/2021/10/30/deepfaked-committing-crime-should-worried/

Deepfakes first stepped onto the public stage via videos mapping celebrities’ faces onto porn performers’ bodies and have since impacted non-celebrities, sometimes as a doctored version of “revenge porn,” where, say, exes spread digitally created sex tapes that can be practically indistinguishable from the real thing.
https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124

CONCEPT:

JEZEBEL: You prefer to use the phrase “deepfake image abuse” instead of “deepfake porn,” can you explain?

HENRY AJDER: Deepfake pornography is the established phrase to refer to the use of face-swapping tools or algorithms that strip images of women to remove their clothes, and it comes from Reddit, where the term first emerged exclusively in this context of nonconsensual sexual image abuse. The term itself, deepfake pornography, seems to imply, as we think of with pornography, that there is a consensual element. It obscures that this is actually a crime, this is a form of digital sexual harassment, and it doesn’t accurately reflect what is really going on, which is a form of image abuse.


==========================Class Oct21=======================

oct21

Japanese police on Monday arrested a 43-year-old man for using artificial intelligence to effectively unblur pixelated porn videos, in the first criminal case in the country involving the exploitative use of the powerful technology. Masayuki Nakamoto, who runs his own website in the southern prefecture of Hyogo, lifted images of porn stars from Japanese adult videos and doctored them with the same method used to create realistic face swaps in deepfake videos. Nakamoto reportedly made about 11 million yen ($96,000) by selling over 10,000 manipulated videos, though he was arrested specifically for selling 10 fake photos at about 2,300 yen ($20) each. Nakamoto pleaded guilty to charges of copyright violation and displaying obscene images and said he did it for money, according to NHK. He was caught when police conducted a “cyber patrol,” the Japanese broadcaster reported. Photo-realistic images created using AI are increasingly common and have raised many legal and ethical questions concerning privacy, sexual exploitation, copyright, and artistic expression. “This is the first case in Japan where police have caught an AI user,” Daisuke Sueyoshi, a lawyer who’s tried cybercrime cases, told VICE World News. “At the moment, there’s no law criminalizing the use of AI to make such images.”
https://www.vice.com/en/article/xgdq87/deepfakes-japan-arrest-japanese-porn

On Monday (the 18th), the Kyoto Prefectural Police arrested a man who admitted responsibility for creating pornographic deepfakes and selling more than 2,500 doctored video files. Masayuki Nakamoto violated Japan's obscenity law, which prohibits the display of "indecent materials" showing genitals. Such videos are normally pixelated or blurred, and the accused reversed that pixelation. According to the Japanese newspaper Mainichi, this appears to be the country's first arrest of this kind related to pornography. Nakamoto is accused of violating the copyright of a Tokyo video production company, since he took copyrighted pixelated videos, sharpened them, and published them on his own website. He also used artificial intelligence (AI) to add images of genitals that were not present in the original footage. An image provided by the Kyoto Prefectural Police demonstrates the doctoring method performed with the TecoGAN artificial intelligence, the technology Nakamoto used in this case.
https://mundoconectado.com.br/noticias/v/21110/deepfake-homem-e-preso-por-aplicar-genitais-falsos-em-filmes-porno + https://news.yahoo.com/japanese-man-first-suspect-ever-191405154.html

The man used a tool called TecoGAN, which specializes in super-resolution (that is, improving image definition) and was trained on uncensored images. He reportedly sold more than 10,000 deepfakes on his website and earned 11 million yen (83,000 euros) in total. However, he is being prosecuted for only 10 of the images, sold at 2,300 yen (17 euros) each. The man pleaded guilty to infringing copyright and displaying obscene images, and said he did it for the money.
https://www.futura-sciences.com/tech/actualites/intelligence-artificielle-pornographie-ia-permet-enlever-flou-images-censurees-94345/


oct21

THE ISSUE OF “deepfakes” has been widely discussed in Taiwan after YouTuber “Xiao Yu” was found to be responsible for a Telegram group used to sell, commission, and circulate “deepfake” porn videos of women, who were mostly public figures. Xiao Yu, whose real name is Zhu Yu-chen, was subsequently taken into custody. Xiao Yu is primarily known for humor videos, though some of his viral stunts have caused controversy in the past. The public response to the incident was large enough that no less than President Tsai Ing-wen commented on the matter, drawing attention to the issue of deepfakes with a post on her Facebook. The Telegram group was previously detailed by an investigative report from Mirror Media that was originally released in May, which became widely circulated again after the arrest came to light.
https://newbloommag.net/2021/10/19/deepfake-arrest/

A Taiwanese YouTuber suspected of creating and selling deepfake porn videos featuring more than 100 politicians and influencers was on Monday released on bail after being arrested the previous day. Chu Yu-chen (朱玉宸), 26, who uses the name Xiaoyu (小玉) on YouTube, was arrested on Sunday in New Taipei City, along with two suspected accomplices: a 24-year-old YouTuber surnamed Yeh (耶), known as Shaiw Shaiw (笑笑), and a 22-year-old man surnamed Chuang (莊). A conviction for distributing obscene videos carries a maximum sentence of two years in prison, which can be converted or added to a NT$90,000 (US$3,222) fine, while a public insult conviction could result in a fine of up to NT$9,000. New Taipei City Deputy Chief Prosecutor Nieh Chung (聶眾) said that while the alleged crimes were serious, it was not necessary to detain the suspects.
https://www.taipeitimes.com/News/front/archives/2021/10/20/2003766430


sep21

A petition posted on the Cheong Wa Dae website in January called for tough punishment for people creating and uploading deepfake porn videos of Korean celebrities and drew around 390,000 signatures.

But increasingly, people's ordinary friends and acquaintances have become victims. Police in North Jeolla Province arrested a man in his 20s who created deepfake porn videos of a woman he met on the Internet and uploaded them on porn websites.
http://english.chosun.com/site/data/html_dir/2021/09/30/2021093000739.html



sep21
What Deepfake Technology Means for Women.
https://www.wnycstudios.org/podcasts/takeaway/segments/what-deepfake-technology-means-women

sep21

A horrifying new AI app swaps women into porn videos with a click. Deepfake researchers have long feared the day this would arrive.
https://www.technologyreview.com/2021/09/13/1035449/ai-deepfake-app-face-swaps-women-into-porn/ + https://onezero.medium.com/deepfake-porn-when-tech-ruins-womens-lives-3ae99b2c4bed

Ajder has seen a lot of disturbing things, but a few months ago, he came across something he’d never seen before. It was a site that allowed users to simply upload a photo of someone’s face and produce a high-fidelity pornographic video, seemingly a digital reproduction of that person. “That is really, really concerning,” he says. Ajder alerted Karen Hao, a journalist covering artificial intelligence. Last month, she wrote about the site in the MIT Technology Review, bringing attention to the specter of free, easily created deepfake porn. “[T]he tag line boldly proclaims the purpose: turn anyone into a porn star by using deepfake technology to swap the person’s face into an adult video,” wrote Hao. “All it requires is the picture and the push of a button.” The next day, following an influx of media attention, the site was taken down without explanation. But the site’s existence shows how easy and accessible the technology has become.

I found this website a few months ago, monitored it, and reported it when I found the functionality was evolving and becoming increasingly accessible. It was the first of its kind that provided a library of footage already. All you had to do was upload a picture of someone’s face, choose from their pre-selected videos, press the button, and it would generate the output.
https://jezebel.com/deepfake-porn-is-getting-easier-and-easier-to-make-1847839124

sep21

AHMEDABAD: A 46-year-old businessman with a unit in Odhav recently got entrapped in a sextortion racket when he accepted a [friend] request sent by a beautiful woman on social media. Within tw[o] .. exchanged phone numbers and a video call was made by the woman.
http://timesofindia.indiatimes.com/articleshow/86020397.cms

aug21

Launched in 2020, the site boasts that it developed its own “state of the art” deep-learning image translation algorithms to “nudify” female bodies, and that the technology is so powerful there is not a woman in the world, regardless of race or nationality, who is safe from being “nudified.” But it doesn’t work on men. When fed a photo of a cisgender man’s clothed body, the site gave him breasts and a vulva.
https://www.huffpost.com/entry/deepfake-tool-nudify-women_n_6112d765e4b005ed49053822

Leading credit-card companies have come under fire over their alleged links to a popular website that allows users to digitally “undress” women and girls. Mastercard and Visa were listed last week as approved payment methods on a “nudifying” website widely used to create fake naked images of people without their consent. It promises to “make men’s dreams come true”.
https://www.thetimes.co.uk/article/credit-card-giants-tied-deepfake-nudify-site-zjhzmrz3v


jun21
RP
The issue of deepfakes – convincingly real-looking fake videos and pictures – is once again in the spotlight after the Ambernath police arrested a 28-year-old Malad resident for allegedly generating objectionable content of a 25-year-old woman who refused to marry him. Police said that the victim, who is married and pregnant, stays with her family in Ambernath. She approached the police on May 19 this year to complain about several objectionable videos and pictures of her that were circulated on social media through multiple accounts.

The police said that the victim was in a panic as she had never posed for any of the pictures or videos, and the repeated emergence of such content on Facebook and Instagram was causing problems in her married life. The police registered a case against unknown persons and started investigating the matter. Details of all the pictures and videos were collected and, with the help of the Thane Police cyber cell, the Ambernath police sought Internet Protocol addresses and other data from Facebook and Instagram. The data, which was received after several days, included a cell phone number, and the police then obtained Call Detail Records and registration details of the number.
https://www.hindustantimes.com/cities/others/28yrold-mumbai-man-held-for-circulating-deepfakes-on-social-media-101625514850866.html


jun21
https://english.lokmat.com/photos/technology/legal-experts-warn-deepfake-pornography-could-become-the-next-sexual-pandemic/

jun21
South Korea
When compromising photographs of 21-year-old K-pop star Nancy from the girl group Momoland were taken without her knowledge in January 2021, these images were manipulated using deepfake technology. It was the first time the Korean general public became aware of the thus-far underground K-pop deepfake problem, and citizens quickly began circulating a petition addressed to their presidential office. Officials responded, explaining that they were working to track down creators through messaging platforms like Telegram and Discord, and Nancy’s entertainment agency released a statement saying they’d be taking legal action. In a digitally dystopian fashion, Nancy quickly returned to fans’ screens to promote her group’s latest release, smiling and dancing along to choreographed moves as though nothing was wrong. At the time of writing, her agency has yet to update fans on the case.
https://i-d.vice.com/en_uk/article/k78zzy/k-pop-deepfake-porn-idols-cyber-investigation


may21

A leading legal expert is warning of an "epidemic" of sexual abuse in which images of people's faces are merged with pornography and made available online. Deepfake pornography is where computer technology is used to map the faces of celebrities and private citizens on to explicit sexual material. Prof Clare McGlynn said it made it much easier for perpetrators to abuse and harass women.
https://www.bbc.com/news/uk-scotland-57254636


abr21
MINORITIES
This technology poses a particular threat to marginalized communities. If deepfakes cause society to move away from the current “seeing is believing” paradigm for video footage, that shift may negatively impact individuals whose stories society is already less likely to believe. The proliferation of video recording technology has fueled a reckoning with police violence in the United States, recorded by bystanders and body-cameras. But in a world of pervasive, compelling deepfakes, the burden of proof to verify authenticity of videos may shift onto the videographer, a development that would further undermine attempts to seek justice for police violence. To counter deepfakes, high-tech tools meant to increase trust in videos are in development, but these technologies, though well-intentioned, could end up being used to discredit already marginalized voices. 
https://www.brookings.edu/techstream/the-threat-posed-by-deepfakes-to-marginalized-communities/
 
out20
IMPORTANT (see sources)
Notably, however, it is important to state that the individuals were arrested for copyright infringement and the defamation of their celebrity victims. Though positive in that such damaging behaviour is being shut down, there is a distinct neglect of the impact on the celebrity victims, whose likenesses have now been widely spread with no guarantee that any associated images can be taken down completely. Moreover, with a growing number of scholars including deepfaking under the umbrella term of image-based sexual offending, sanctioning such actions under copyright law has the potential to understate their pervasive and damaging impact. We know from the literature underpinning the dissemination of private sexual images that victims may be impacted through depression, anxiety, and poor self-image.
https://www.drdeanfido.com/post/deepfake-pornography-arrests

mar21
The Real Threat of Deepfake Pornography: A Review of Canadian Policy
https://www.liebertpub.com/doi/abs/10.1089/cyber.2020.0272


fev21
In the context of the pandemic, this trend is even more worrying. Sophie Mortimer, who manages the British nonprofit Revenge Porn Helpline, told Technology Review that the helpline's caseload has nearly doubled since lockdown began. "Existing abusive relationships have worsened, and digital abuse has surged as people have become isolated and spent more time online," she warned.
https://mundo.sputniknews.com/20210219/los-deepfake-en-el-porno-un-nuevo-tipo-de-violencia-contra-las-mujeres-1107960405.html

fev21
Deepfake porn is ruining women’s lives. Now the law may finally ban it
https://www.technologyreview.com/2021/02/12/1018222/deepfake-revenge-porn-coming-ban/

jan21
Cases involving girls and men:
 https://www.philstar.com/entertainment/2021/01/28/2073722/deepfake-stars-who-fell-victim-edited-nude-photos-videos

jan21
In other words, the proliferation of these tools means that any woman who has a regular image or video posted online either by themselves or by another person, ones that were never sexually explicit, could potentially be the target of ‘revenge porn’, a type of  image-based abuse that includes the “nonconsensual sharing and creation of sexual images” through deepfakes. https://www.trtworld.com/magazine/deepfakes-and-cheap-fakes-the-biggest-threat-is-not-what-you-think-43046

jan21
A woman who has been the victim of deepfake pornography is calling for a change in the law. Last year, Helen Mort discovered that non-sexual images of her had been uploaded to a porn website. Users of the site were invited to edit the photos, merging Helen's face with explicit and violent sexual images.
Speaking to BBC Radio 5 Live's Mobeen Azhar, Helen said she wanted to see the creation and distribution of these images made an offence.
https://www.bbc.com/news/technology-55546372

jan21

There is a popular internet meme called “Rule 34”. It goes, “If it exists, there’s a porn of it.” There is no exception, it is said, to this “rule”. Not Pokémon, not Tetris blocks, not even unicorns. In 2016, Ogi Ogas, a computational neuroscientist at Harvard, published a study on whether the “rule” held up. It did—for the most part. The more obscure pornography could be difficult to access but “it’s out there if you want to find it”, he told The Washington Post. And if it isn’t, there’s the lesser-known “Rule 35”: “If there’s no porn of it, porn will be made of it.” A similar threat is playing out in India today. There are nearly a dozen websites hosting deepfake pornography featuring Indian women. Most of them are from the entertainment industry, including some of the best-known actors from films. India has banned over 800 adult websites since 2015 for allegedly hosting paedophile content but these deepfake videos are only a few clicks away. So far, there has been little discussion and no strategy on how to deal with these. https://lifestyle.livemint.com/news/big-story/deepfakes-when-seeing-is-not-believing-111609504596030.html


2018
But while the current targets are at least somewhat protected by fame — “People assume it’s not actually me in a porno, however demeaning it is,” Johansson said in a 2018 interview — abuse-survivor advocate Adam Dodge figures that non-celebrities will increasingly be targeted, as well. Old-fashioned revenge porn is a ubiquitous feature of domestic violence cases as it is, says Dodge, who works with victims of such abuse as the legal director for Laura’s House, a nonprofit agency in Orange County, California. And now with deepfakes, he says, “unsophisticated perpetrators no longer require nudes or a sex tape to threaten a victim. They can simply manufacture them.” https://knowablemagazine.org/article/technology/2020/synthetic-media-real-trouble-deepfakes


dez20
the table of the 25 most-viewed videos on YouTube, among the 50 videos examined for this study, also invites future reflection, calling for a deeper understanding of social phenomena in today's digital contexts. It leaves three important ideas: the protagonists of deepfakes are mostly men; the creators of deepfakes, in almost every case, create a profile whose name refers directly to the technique; and a high number of followers does not directly guarantee a video's virality. This raises novel questions: why are the protagonists men? Are there deepfakes of women politicians? Is the representation different or unequal? Are the users who maintain these accounts professionals or amateurs? Do they offer other videos?
Alfabetización moral digital para la detección de deepfakes y fakes audiovisuales
Víctor Cerdán Martínez; María Luisa García Guardia; Graciela Padilla Castillo


DEZ20
Pornhub's removal of as many as 10 million videos Monday — a content-removal earthquake on a scale the web has rarely seen before — sent tremors through a tech industry built on user-generated content. Driving the news: following a New York Times exposé of underage and nonconsensual content on Pornhub, Mastercard and Visa stopped providing service to the site.
https://www.axios.com/pornhubs-video-purge-legal-riddle-7ce822bf-03b6-4d72-bb2d-fe482e0b6fc1.html

dez20
PR
When Tech Goes Bad. Revenge Porn Explodes with Deepfakes
https://www.engineering.com/DesignSoftware/DesignSoftwareArticles/ArticleID/20902/When-Tech-Goes-Bad-Revenge-Porn-Explodes-with-Deepfakes.aspx

nov20
Women
Pornhub also has policies prohibiting people from posting deepfakes, although this is still where most deepfakes can be found. Former Sensity researcher Henry Ajder says: “they’ve basically just banned people from labelling deepfakes. People could still upload them and use a different name.” Companies may have policies in place that allow them to remove deepfakes and even punish users, but there are few legal precedents for companies or individuals to take action against people misusing deepfakes. Although some deepfakes may infringe on copyright laws, data protection laws, or be considered defamatory, they are not in themselves illegal. However, many countries also have laws that criminalize revenge porn. Yet, these laws do not always allow deepfakes to be classified as revenge porn. In some countries, like Britain, these laws explicitly exclude images that have been created by altering an existing image, which would include deepfakes. Additionally, images shared on the internet do not abide by international borders; therefore, it can be extremely difficult for governments to implement policies that would grant true justice and protection to citizens from malicious deepfakes. The inconsistency of governmental policies across borders and lack of legal precedents suggests that social media platforms are better equipped to police the misuse of deepfakes on the web.

 https://www.mironline.ca/what-the-rise-of-deepfakes-means-for-the-future-of-internet-policies/
Pornhub has pushed back against accusations that it allows child sexual abuse materials on its popular online pornography website. The article includes comments from various people whose lives were ruined as minors after their nude images were displayed without their knowledge on Pornhub. Mindgeek is based in Montreal but it operates globally and jurisdiction is difficult to determine because it hosts content outside of Canada, said RCMP spokeswoman Cpl. Caroline Duval. LINK


The author sent: For example, in Britain, revenge porn laws do not criminalize images that were created using existing images, such as the way deepfakes are created. So, someone would not be able to seek justice against revenge porn if they were deepfakes. In contrast, the law regarding revenge porn in Scotland includes deepfakes because it does not exclude images that were altered or otherwise created using an existing image. I hope that provides clarification.

nov20
WOMEN
Samantha Cole, staff writer at Vice, sheds light on the key question of gender dynamics shaping the majority of the videos today. “Deepfake’s origin was in making celebrity porn. There was Photoshop with static images and then deepfakes took it a step further using algorithms to add them to videos. The people I talk to aren’t machine learning experts but rather people interested in using this new technology. But what they miss out on is consent. Most of the deepfakes that exist online are porn.” As she states, creators making this porn are not engineers but rather people without technical expertise who utilize the tools.
https://immerse.news/prepare-dont-panic-for-deepfakes-c77f9f683f30

nov20
PORN and making money
According to MIT, the deepfake bot has a relatively simple operation: users can send any image of a woman through Telegram’s desktop or mobile application. The bot will recreate a naked picture in a few minutes. Unlike DeepNude, where the cost per nude was $50, the Telegram bot only needs $1.5 if you want to remove the watermark or speed up the process. Otherwise, the nude conversion is completely free. https://belatina.com/deepfakes-nudes-ai-technology-misogyny/ + Over the last month, Telegram has been caught up in a huge controversy: some users from Russia and neighboring countries created groups with bots equipped with an artificial intelligence capable of generating pornographic photos of well-known women, the DeepNudes. To date, the company has still not solved the problem and is accused of ignoring the warnings of several experts. https://www.tecmundo.com.br/software/206975-telegram-ainda-nao-bloqueou-bots-deepfakes-pornograficos.htm


nov20
PORNOG
Several adult content websites are using deepfake technology to show Indian film stars, including those in Bollywood, in explicit videos. https://timesofindia.indiatimes.com/india/adult-deepfakes-of-indian-film-stars-thrive-online/articleshow/79140509.cms

Nov20
RP: The lockdown period saw a rise in the number of people contacting the Revenge Porn Helpline, a support line for victims of intimate-image abuse. https://www.wort.lu/pt/sociedade/reino-unido-revenge-porn-aumenta-com-o-confinamento-5fa40e65de135b9236546302

out20
(IMPORTANTE) The implications of deepfake pornography seep into all facets of victims’ lives. Not only does deepfake pornography shatter these victims’ sexual privacy, its online permanency also inhibits their ability to use the internet and find a job. Although much of the scholarship and media attention on deepfakes has been devoted to the implications of deepfakes in the political arena and the attendant erosion of our trust in the government, the implications of deepfake pornography are equally devastating. https://search.proquest.com/openview/1c6183d8c774ed079d433b1599350e11/1?pq-origsite=gscholar&cbl=49232

out20
In our investigation for this story, Rolling Stone found more than two dozen examples of prominent TikTok creators being featured in deepfake porn. Most of them are originally posted on one of a handful of websites devoted exclusively to posting deepfakes, but they are also not difficult to find on social media platforms like Twitter and Reddit, even though both platforms have policies banning deepfakes. LINK
========================= TCM class ================

out20
Danielle Citron, professor of law at the Boston University School of Law and vice president of Cyber Civil Rights Initiative, says the psychological impact of appearing in a pornographic deepfake video cannot be overestimated, and some have also compared nonconsensual pornography to a form of digital rape. “When you see a deepfake sex video of yourself, it feels viscerally like it’s your body. It’s an appropriation of your sexual identity without your permission. It feels like a terrible autonomy and body violation, and it rents space in your head,” she says. According to Patrini, appearing in a deepfake porn video can have a significant and deleterious financial impact on a creator’s business model, citing an example of a prominent YouTuber who lost a brand partnership after someone posted a deepfake video of her on a porn site. https://www.rollingstone.com/culture/culture-features/tiktok-creators-deepfake-pornography-discord-pornhub-1078859/

out20
It is not the first time the technology known as "deepfake" has been used for malicious ends: in June 2019, the app "DeepNude" had this same premise, using artificial intelligence (AI) to create nude versions of images of celebrities on the internet.

out20
A bot available on Telegram uses "deepfake" technology to create fake nudes of women and share them on the internet. The bot can take any ordinary image and swap clothed parts of the female body for naked ones. So far, more than 100,000 fake images shared online have been identified, but the bot may have produced millions more that have not yet been traced. Research into the bot found that most of the victims are ordinary women, whose images were taken from normal posts on social networks such as Instagram and Facebook - some were posting photos in bikinis or swimsuits, others in long T-shirts, while - and this is the most terrifying part - some other victims were clearly underage. All of them, however, were women. In the case of the Telegram bot, the risk is doubly amplified: not only are your Instagram photos at risk, but anyone who inadvertently takes a photo of you, say, at some point in the day, can use it to create your "nude version". The situation becomes even worse given that the bot, which creates fake nudes from ordinary images sent to it through the messaging app, has already shared images of children. https://olhardigital.com.br/fique_seguro/noticia/bot-do-telegram-usa-deepfake-para-criar-mais-de-100-mil-falsos-nudes-de-mulheres/108953
+ https://sensity.ai/automating-image-abuse-deepfake-bots-on-telegram/?utm_medium=email&_hsmi=97887695&_hsenc=p2ANqtz-94P6tswOYy8qHktUZJyRpg5yB9cNsKix96Om-XTEGktZYh40MvMBsRGkLaEfxIw6gyyWGaFlkIqEhDlxqGuuqxtXthBuYkvwAEoXJ4aQMK3DxuNUc&utm_content=97887695&utm_source=hs_email
+https://www.cnet.com/news/deepfake-bot-on-telegram-is-violating-women-by-forging-nudes-from-regular-pics/
https://www.theverge.com/2020/10/24/21531725/italian-authorities-investigating-telegram-deepfake-bot-nudes

out20

Much of the non-consensual pornography that has already been produced is targeted at celebrities, inserting them into professionally produced pornographic videos. Targets have included British and American actresses such as Emma Watson, Daisy Ridley and Kristen Bell. https://www.express.co.uk/news/world/1346071/Technology-news-deep-fakes-non-consensual-pornography-synthetic-media


out20

Hayashida and Otsuki stand accused of creating and putting up deepfake videos online between December 2019 and July 2020, defaming the celebrities whose faces were used for the videos, and violating the copyright of the production companies that created the original adult videos. Police have built a case against Hayashida on suspicion of defamation of two celebrities and copyright violations against four production firms, while Otsuki is accused of the same charges in connection with two celebrities and three production companies. Both men have reportedly admitted to the allegations. According to the MPD's Safety Division, Hayashida made some 800,000 yen (approximately $7,600) by releasing the deepfake videos on a website he runs. He is quoted as telling investigators, "I wanted to make money," while Otsuki reportedly told police, "I published the videos to gain recognition from third parties." https://mainichi.jp/english/articles/20201002/p2a/00m/0na/027000c + Futamata has made ad revenue totaling more than 500,000 yen ($4,820) from the website, while Kubo has made more than 1 million yen, police said. http://www.asahi.com/ajw/articles/13944568



Copyright
Audio deepfakes: between copyright controversy and the reinvention of entertainment
Jay-Z reciting Hamlet or Ocasio-Cortez criticizing socialism: voice simulation is expanding and poses challenges tied to intellectual property as well as more creative content https://www.eldiario.es/tecnologia/audio-deepfakes-polemica-derechos-autor-reinvencion-entretenimiento_1_6083312.html


For women:
Set20 Addison Rae, victim of a deepfake: her image used (without her knowledge) for an adult film. https://www.webboh.it/addison-rae-deepfake/

ago20

Hundreds of explicit deepfake videos featuring female celebrities, actresses and musicians are being uploaded to the world’s biggest pornography websites every month, new analysis shows. The non-consensual videos rack up millions of views and porn companies are still failing to remove them from their websites.

Up to 1,000 deepfake videos have been uploaded to porn sites every month as they became increasingly popular during 2020, figures from deepfake detection company Sensity show. The videos continue to break away from dedicated deepfake pornography communities and into the mainstream. Deepfake videos hosted on three of the biggest porn websites, XVideos, Xnxx, and xHamster, have been viewed millions of times. The videos are surrounded by adverts, helping to make money for the sites. XVideos and Xnxx, which are both owned by the same Czech holding company, are the number one and number three biggest porn websites in the world and rank in the top ten biggest sites across the entire web. They each have, or exceed, as many visitors as Wikipedia, Amazon and Reddit. https://www.wired.co.uk/article/deepfake-porn-websites-videos-law


jun20
"Fiquei chocada, porque esse é o meu rosto", disse Bell, 39 anos, ao portal Vox. "Pertence a mim!... É difícil pensar que estou sendo explorada", desabafou. A atriz de 'Good Place' foi vítima de um deepfake, ou seja, um vídeo com imagens manipuladas por inteligência artificial que substitui as imagens e o áudio existentes de uma pessoa pelo rosto e voz de outra pessoa.https://revistamonet.globo.com/Celebridades/noticia/2020/06/atriz-de-frozen-fala-sobre-choque-ao-descobrir-que-seu-rosto-estava-sendo-usado-em-sites-pornograficos.html

jun20
“There’s a lot of talk about the challenges that come with the advancements in deepfake technology,” Martin said. “But I think what is often missed from the discussion is the impact to individuals right now. Not in a few years, not in a couple of months. Right now.” https://www.vox.com/2020/6/8/21284005/urgent-threat-deepfakes-politics-porn-kristen-bell

jun20

Last week, on a forum dedicated to cum tributes, people posted images of a 14-year-old YouTuber named Makenna and described in explicit detail the sexual acts they wanted to do to her. (Motherboard is not identifying the website to protect the privacy of the people whose images are still on it.) Makenna makes ASMR videos on her YouTube channel, which has more than 1.6 million subscribers. https://www.vice.com/en_us/article/4aywd3/tribute-porn-site-targeted-youtuber


maio20
"I don't know whether the first deepfake was porn. The second one certainly was," he jokes. "Everything, or almost everything, being done with deepfakes today is porn. Those of us who don't do porn are just the tip of the iceberg. I'm lucky that, so far, no one has asked me to make porn, though I won't say I wouldn't do it. If they paid me well, maybe. The moment the fantasies of people willing to pay for it could be fulfilled, the whole thing exploded. Everything has evolved very quickly, and the main forum for people who have made deepfakes is porn." https://www.elconfidencial.com/cultura/2020-05-27/deep-fake-imagenes-manipulacion-cine-politica_2609236/

Mai20 

A quarter of all deepfake pornography features K-pop stars. The ‘Nth room’ case shocked the public earlier this year when details of the inhumane crime were revealed through media reports and investigation results. Although the idea of digital sex crimes wasn’t new, the horrifying atrocities of what had been going on inside the anonymous chat rooms brought the issue to the surface like never before. Among the crimes are some lesser-known forms of digital sex crimes that had gone unnoticed by the public over the years but have come to light with the case. Through this two-part series, the Korea JoongAng Daily looks into the two types of crime that had not been properly addressed before: ‘friend fouling’ and deepfakes. https://koreajoongangdaily.joins.com/2020/05/17/features/deepfake-artificial-intelligence-pornography/20200517190700189.html

abr20 South Korea was shaken by the Telegram Nth Room scandal, which used deepfake videos while harassing and abusing minor girls, trapping them into recording abusive sexual video content. The main accused, Baksa aka Jo Joo Bin, had used deepfake technology to mask the faces of minor girls being forced into sexual acts with those of Korea's top idols, and sold the videos at a higher price in the space called Nth Room.
https://www.ibtimes.sg/deepfake-videos-new-threat-fight-misinformation-fake-news-regarding-coronavirus-42571

https://www.bizcommunity.com/Article/196/16/197251.html
“96% of these videos are of female celebrities having their likenesses swapped into sexually explicit videos - without their knowledge or consent.” (LINK)
on the targeting of women:
for actresses/actors: JAN20 Less discussed, however, is the impact deepfakes will have on a wholly different group of women: the actors whose bodies and sexual performances are used as the basis for manufactured pornography. LINK
JAN20 Professionals in the adult entertainment world complain that they too are victims of extortion and crimes stemming from this kind of digital manipulation LINK
Jan20 Troll armies, 'deepfake' porn videos and violent threats. How Twitter became so toxic for India's women politicians LINK



Fev20 The report by the U.S. Government Accountability Office said “deepfakes could influence elections and erode trust but so far have mainly been used for non-consensual pornography.”
The report said while researchers and technology companies have come up with detection tools for deepfakes, more still needs to be done. https://www.wftv.com/news/local/orange-county/report-online-deepfake-videos-mainly-involve-nonconsensual-pornography/JWUOAL5SBVBORH6JKI36D7F4NQ/

Deepfake technology will become so realistic that it will send innocent people to prison, warned Shamir Allibhai, CEO of the video-verification company Amber.

Allibhai believes that many types of content will be subject to crude manipulation through this kind of technology. https://renovamidia.com.br/tecnologia-deepfake-pode-enviar-inocentes-para-cadeia/
Most of the discussion around deepfakes centres on their political implications in the era of fake news, and on our perception of truth — but the majority of victims aren't politicians. Deepfakes are mostly weaponised against women and girls; a dangerous new frontier in so-called "revenge porn". https://www.abc.net.au/news/2019-08-30/deepfake-revenge-porn-noelle-martin-story-of-image-based-abuse/11437774?sf218411423=1&fbclid=IwAR2MyEb-OGaKcCeaTazCuuRSn50eJLXim8lm5dOWkUtL3bP9nCVoCefDdzA
(on revenge porn and the problem of our being unable to distinguish the real from the fake)
A company in the porn industry launched on Monday (20) a way to monetize deepfakes, videos in which artificial intelligence (AI) is used to swap people's faces. The producer Naughty America debuted a service in which its clients can fulfill a fantasy with the help of the technology: placing their faces over those of porn actors, and vice versa, and becoming adult-entertainment stars in their own films. LINK
MAR20 The explanation: In a press release, Pornhub nevertheless unveiled the list of the celebrities most searched for on the porn site, and the results are surprising. But how did these stars end up on Pornhub? http://www.letribunaldunet.fr/sexualite/porno-miley-cyrus-ariana-grande-selena-gomez-stars-les-plus-recherchees-pornhub.html

FEV20 Deepfake Porn Nearly Ruined My Life. Two years ago, Noelle Martin was emailed graphic videos of herself performing sex acts. Except, she hadn’t actually enacted any of the sexual scenes in the videos. The footage was a deepfake, but who would believe her? LINK + out21 https://newsnationusa.com/news/entertainment/movie-review/completely-horrifying-dehumanizing-degrading-one-womans-fight-against-deepfake-porn/
For brands/business:
https://www.thedrum.com/opinion/2020/01/10/what-do-deepfakes-mean-brands

Mai20 Risks of economic losses are more dangerous than political manipulation https://www.nojitter.com/security/real-business-impacts-deepfakes

