Perhaps inevitably, the technology has become synonymous with pornography: graphic videos that appear to depict celebrities but are entirely fabricated have been banned by Twitter, Reddit and even Pornhub. Now, new research at Carnegie Mellon could take deepfakes to the next level with a technique called Recycle-GAN, which can take the detailed content of one video or performer and apply it to another while keeping the style of the latter intact.
It's easier to understand when you see it in action, so here's a quick (wholly safe for work) example taking the visual content from a film of Martin Luther King Jr. and applying it to a video of Barack Obama.
The first thing you'll notice is that Recycle-GAN, like other deepfake technology, is purely visual; it doesn't transfer sound. But the technique is impressive all the same, marking an evolution in the AI methods used to transfer content from one video to another.
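To give a rough sense of how this kind of video-to-video transfer is trained, here is a minimal sketch of a "recycle" consistency loss in the spirit of the Recycle-GAN paper: a frame is translated into the target domain, a temporal predictor guesses the next frame there, and the result is translated back and compared with the true next source frame. This is an illustrative simplification, not the researchers' code; the function names (`g_xy`, `g_yx`, `p_y`) are placeholders standing in for learned networks.

```python
import numpy as np

def recycle_loss(frames_x, g_xy, g_yx, p_y):
    """Simplified recycle-consistency loss (L1).

    frames_x : list of source-domain frames (numpy arrays)
    g_xy     : maps a source frame into the target domain (placeholder)
    g_yx     : maps a target-domain frame back to the source domain
    p_y      : predicts the next frame within the target domain
    """
    total = 0.0
    for t in range(len(frames_x) - 1):
        y_t = g_xy(frames_x[t])       # translate into the target style
        y_next = p_y(y_t)             # step forward in time in that domain
        x_next_rec = g_yx(y_next)     # translate back to the source domain
        # Penalize disagreement with the real next source frame.
        total += np.abs(frames_x[t + 1] - x_next_rec).mean()
    return total / (len(frames_x) - 1)

# Sanity check with identity mappings and a static clip: the
# reconstruction matches exactly, so the loss is zero.
identity = lambda frame: frame
clip = [np.ones((4, 4)) for _ in range(3)]
print(recycle_loss(clip, identity, identity, identity))  # → 0.0
```

In the actual system, `g_xy` and `g_yx` are adversarially trained generators and `p_y` is a learned temporal predictor; the recycle term is what pushes the transfer to respect motion over time rather than matching frames in isolation.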