YouTube shooting surfaces new deepfake conspiracy theories

New conspiracies over YouTube shooting are dangerously built on a real threat

The video in question, “Y Does the Youtube Shooter Looks Like An A.I. Computer Program?”, has garnered only around 86,000 views, but it makes a strange claim: that Aghdam is actually an AI creation.

And this is where a conspiracy can turn dangerous: it contains a kernel of truth. These face-swapped, AI-generated videos, called deepfakes, are indeed a real thing and a real problem.

You can read the full article on Mashable

DeepFakes Explained

Video by Siraj Raval

There’s a new trend on the interwebs called ‘Deepfakes’: a machine learning technique that can be trained to paste one person’s face onto another person’s body, complete with facial expressions.

The effect isn’t yet more convincing than conventional computer graphics techniques, but it could democratize Hollywood-level special effects fakery — and, potentially, lead to a flood of convincing hoaxes.

In this video, I’ll explain how DeepFakes works, both programmatically and theoretically. The core idea is two autoencoders that share a single encoder: each is trained on images of one person, so the two identities end up in a common latent space, and the swap happens when you encode an image of person A and reconstruct it with person B’s decoder.
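The shared-encoder, two-decoder idea can be sketched with a toy linear model. This is a minimal illustration, not the pipeline from the video: the random `faces_a`/`faces_b` matrices stand in for real aligned face crops, and the dimensions and learning rate are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for two aligned face datasets: 200 flattened 8x8 "images"
# per person. A real deepfake model trains on thousands of face crops.
faces_a = rng.normal(size=(200, 64))
faces_b = rng.normal(size=(200, 64)) + 1.0  # shifted so the identities differ

dim, latent = 64, 16
W_enc = rng.normal(scale=0.1, size=(dim, latent))    # SHARED encoder
W_dec_a = rng.normal(scale=0.1, size=(latent, dim))  # decoder for person A
W_dec_b = rng.normal(scale=0.1, size=(latent, dim))  # decoder for person B

def step(x, W_dec, lr=1e-2):
    """One gradient-descent step of a linear autoencoder on batch x.

    Updates the given decoder and the shared encoder in place, and
    returns the mean-squared reconstruction loss before the update."""
    global W_enc
    z = x @ W_enc        # encode into the shared latent space
    x_hat = z @ W_dec    # decode with this identity's decoder
    err = x_hat - x
    grad_dec = z.T @ err / len(x)
    grad_enc = x.T @ (err @ W_dec.T) / len(x)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
    return float(np.mean(err ** 2))

# Train the two autoencoders in alternation; because both update the
# SAME encoder, the latent space becomes common to both identities.
history_a = []
for _ in range(500):
    history_a.append(step(faces_a, W_dec_a))
    step(faces_b, W_dec_b)

# The "deepfake" step: encode a face of person A, decode it as person B.
swapped = (faces_a[:1] @ W_enc) @ W_dec_b
```

Real implementations replace the linear maps with convolutional networks and add alignment and blending steps, but the swap itself is exactly this decoder substitution.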

Code for this video (with coding challenge):