Sexual Health

Deep Fakes Can Cause Deep Damage

Imagine a situation where the use of your image is completely out of your control. Deep fakes, fabricated video or audio recordings, make this possible. The concept of such software, which allows the editing of media, has been around for decades, but only recently has the technology become available to anyone and reached a standard where the results are believable.

Initially, advanced editing with artificial intelligence was restricted to Hollywood studios, organisations such as the CIA and the very rich, who could afford to pay for the technology; but this is no longer the case. The freely available application FakeApp allows anybody to superimpose one person's face onto another's body. It is also possible to literally put words in people's mouths by gathering samples of a person's voice and generating new sentences.

You may not be surprised by any of this, as the technology has been slowly advancing for years and it is possible that, subconsciously, people already saw it coming. However, the ways in which this technology is being used and manipulated are a major cause for concern.

The main use of deep fakes so far is within the porn industry. Anyone who dedicates even the shortest amount of time to finding their way around the app is able to put somebody's face, particularly a celebrity's, on a porn star's body and post the new version. Ariana Grande and Selena Gomez are recent victims of this practice. This is a severe case of defamation: many people who come across this footage online may never have heard of deep fakes and genuinely believe they are seeing a celebrity's sex tape. If an ordinary person's face is superimposed into one of these videos it could cause even more harm, as it is more believable and their reputation would be damaged far more quickly.

On the other hand, it is also damaging to the original performer, as it turns their work into something they never intended it to be. There is often no way of taking legal action, as the video's origin cannot be traced. Long-form YouTube documentary maker Shane Dawson recently covered deep fakes in a conspiracy theory series; in the video, he talked to Raquel Roper, a victim of deep fake porn whose adult content had Selena Gomez's face superimposed into it. She states: 'I just don't think that people realise how harmful it can be. It can really affect people in a lot of different ways'.

The porn industry is only one arena in which this technology can cause harm. Since it is now so hard to tell the difference between deep fakes and reality, a fake may not be spotted until the damage is already done and it is too late. A fabricated recording of a politician saying something controversial before a big vote, or someone using the technology for revenge, are only two further examples of the detrimental impact this form of artificial intelligence could have.