Over the past 20 years, technology has changed the face of the planet irrevocably. It's now estimated that the average person spends 20 hours a week online, and that's not including the time we spend messaging our friends on our increasingly advanced smartphones.
However, as has been the case with so many once great inventions, technology is now developing a much darker side.
The concept of artificial intelligence is nothing new. Hollywood movies like The Terminator predicted a world where humans would be overthrown by machines, and it seems they may have been onto something, with an AI robot named Sophia even saying that she wants to destroy all humans.
Almost anything that you can imagine is possible with technology, and AI’s darker side has now surfaced in a disturbing Reddit thread entitled “Deepfake”, created by a user known as “deepfakeapp”, which uses advanced facial recognition technology to swap celebrities’ faces onto the bodies of adult movie stars.
When this story first surfaced in December, celebrity victims of this disturbing trend already included Wonder Woman star Gal Gadot, Taylor Swift, Scarlett Johansson, and Game of Thrones actor Maisie Williams, according to Motherboard.
This technology also means that anyone, celebrity or not, could find themselves a victim of revenge porn.
While face swaps have been around for some time now and easily accessible on apps like Snapchat, the technology used by Deepfakes is so realistic that it’s hard to believe that what you are seeing on the screen is not a genuine leaked celebrity sex tape.
The Reddit user who started the Deepfake phenomenon subsequently shared this technology with other users as an app, enabling them to create their dream celebrity X-rated movies, provided, of course, that they are willing to spend 12 hours making one short clip.
“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” deepfakeapp told Motherboard.
“Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”
Pictured below is a still from a clip where an adult movie star’s face has been swapped with Gal Gadot’s.
The app has potentially devastating consequences for all of us. Those who are unfamiliar with the technology could easily be duped into believing that what they are seeing on screen is real – ultimately creating a world where no media can really be trusted.
Katy Perry has also been a victim of the app, with users uploading seemingly explicit clips of her to Reddit.
“I think we’re on the cusp of this technology being really easy and widespread,” Peter Eckersley, the chief computer scientist for the Electronic Frontier Foundation, said.
“You can make fake videos with neural networks today, but people will be able to tell that you’ve done that if you look closely, and some of the techniques involved remain pretty advanced. That’s not going to stay true for more than a year or two.”
We still live in a world where sex workers are shamed for their profession, so ordinary people who have their videos uploaded to explicit websites or social media without their consent have to go through the same stigma, something which can take a serious toll on them both mentally and physically.
In the video below, American YouTuber Chrissy Chambers explains how she was a victim of revenge porn:
A number of laws are now being put into place around the world to protect victims of revenge porn.
In the US, the majority of states now have laws against revenge porn, and these will inevitably become more extensive as technology continues to dominate our lives. In California, anyone who uploads explicit material of another person is guilty of “disorderly conduct”.
Pictured below is a warning from the Scottish Government about the consequences of distributing revenge porn.
Thankfully, Chrissy won the case against her ex-boyfriend, who uploaded explicit material of her to the internet without her consent.
“[This] should serve as a severe warning to those who seek to extort and harm with revenge porn: you cannot do this with impunity, and you will be held accountable for your actions,” she said after the case.
“To every victim of this insidious kind of attack, I am here to say: You can fight back, and win. You will heal and move on – and you will not have to take those steps alone.”
When it comes to explicit material, everyone involved must fully consent to it being made and distributed, regardless of whether it is real or fabricated. The “Deepfakes” trend therefore has potentially devastating consequences for celebrities and ordinary people alike.