Proliferation of ‘deepfakes’ raises issues of consent and the authenticity of digital images

The fake news universe is gaining a new dimension in 2018: fake videos have recently surged in popularity across Reddit communities. Google Trends data shows peaks in searches for this content in mid-December 2017 and again in late January and early February 2018.

Called “deepfakes”, the videos in question are generated by technology similar to the “face swap” filters used on Instagram and other social networks.

The algorithm has been used to superimpose the faces of women, many of them well-known public figures, onto people in pornographic videos. The result can be highly realistic fake footage.

The major concern among experts and others, though, is that face-swap videos are spreading as a new form of virtual sexual violence against women. That is exactly what happened to Noelle Martin, who has been fighting deepfake pornography for six years: her photos were stolen from her personal social media accounts and used on porn sites.

The situation escalated even further, Martin told TNW: “It then moved to doctoring images of me into graphic pornography, on the cover of pornographic DVDs to fake images of me being ejaculated on.”

That’s why it’s so important to protect your images online. Facechex joined the movement to help make the internet safer and more secure for everyone.
