New conspiracies over YouTube shooting are dangerously built on a real threat
The video in question, “Y Does the Youtube Shooter Looks Like An A.I. Computer Program?”, has garnered only around 86,000 views, but it is still quite a strange one, suggesting that Aghdam is actually an AI creation.
And this is where a conspiracy can often turn dangerous: when it contains a kernel of truth. These face-swapped, AI-generated videos — called deepfakes — are indeed a real thing and a real problem.
There’s a new trend on the interwebs called ‘Deepfakes’: a machine learning technique that can be trained to paste one person’s face onto another person’s body, complete with facial expressions.
The effect isn’t yet more convincing than conventional computer graphics techniques, but it could democratize Hollywood-level special effects fakery — and, potentially, lead to a flood of convincing hoaxes.
I’ll explain how DeepFakes works both programmatically and theoretically in this video. It’s essentially two autoencoders trained on two image datasets; the swap comes from reconstructing a face from dataset A using dataset B’s decoder.
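To make that description concrete, here is a minimal sketch of the architecture it refers to: two autoencoders that share one encoder but keep separate decoders, one per person. The layer sizes, image dimensions, and training loop below are illustrative assumptions (real deepfake models use convolutional networks, aligned face crops, and far longer training), and the random tensors stand in for actual face datasets.

```python
# Sketch: two autoencoders with a shared encoder and per-person decoders.
# The "face swap" is encoding person A's face and decoding it with
# person B's decoder. All sizes here are illustrative, not from a real model.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    def __init__(self, dim=64 * 64 * 3, latent=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(dim, 512), nn.ReLU(),
            nn.Linear(512, latent),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, dim=64 * 64 * 3, latent=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent, 512), nn.ReLU(),
            nn.Linear(512, dim), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = SharedEncoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Random noise standing in for aligned 64x64 face crops of two people.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.MSELoss()

# Each decoder learns to reconstruct its own person's faces from the
# shared latent space. Real training runs for many thousands of steps.
for step in range(20):
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad()
    loss.backward()
    opt.step()

# The swap: encode A's faces, decode with B's decoder. After real
# training this yields B's face wearing A's pose and expression.
fake = decoder_b(encoder(faces_a))
print(fake.shape)  # torch.Size([8, 3, 64, 64])
```

The key design point is the shared encoder: because both decoders read from the same latent space, that space ends up capturing pose and expression rather than identity, which is what makes the cross-decoding trick work.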
“I was going to tell a science fiction story about face-swapping, and mass blackmail. Then the news broke about unethical face-swapping videos, and software designed and marketed for creating them: and I realised the future had arrived faster than I thought.”