Feb 7, 2018
Deepfake pornographic videos banned by Twitter
Twitter is the latest platform to ban a new type of pornographic video that replaces the original actor's face with that of another person. The San Francisco-based company acted six hours after a Twitter account dedicated to publishing deepfake clips was publicised on a Reddit forum.

The development followed an announcement by Pornhub that it too would remove deepfake clips brought to its attention. Until now, the adult site had been a popular source for the material, with some deepfake videos attracting tens of thousands of views.

Not all of the clips generated have been pornographic in nature; many feature spoofs of US President Donald Trump, and one user has specialised in placing his wife's face in Hollywood film scenes.