Where there’s innovation, there’s masturbation — at least in one dark corner of the internet, where nearly 80,000 people have gathered to share fabricated videos of celebrity women having sex and Nicolas Cage uncovering the Ark of the Covenant.
The technology is relatively easy to use, which has created an enthusiast community on Reddit (since banned), where users compare notes and swap their latest work: “Emma Watson sex tape demo ;-),” “Lela Star x Kim Kardashian,” and “Giving Putin the Trump face” among them.
“We are truly fucked.” That was Motherboard’s spot-on reaction to deep fake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people).
And that sleazy application is just the tip of the iceberg.
As Julian Sanchez tweeted, “The prospect of any Internet rando being able to swap anyone’s face into porn is incredibly creepy. But my first thought is that we have not even scratched the surface of how bad ‘fake news’ is going to get.” Indeed.
Companies such as BaDoink are offering to assist Hollywood in getting deepfake content removed from the web, although they seem to agree that Deepfakes are here to stay and will be pretty much unstoppable.
“Given the rogue nature of Deepfakes, I don’t see how it can be effectively stopped,” agreed Alec Helmy, president and publisher of the adult industry publication XBiz.
“There’s an interesting world of artificial CGI porn that will be happening the next decade, where a fan can easily put any face on anybody in a porn scene,” agrees Grayson. “But I think of that for personal consumption rather than public humiliation.”
The Stop Enabling Sex Traffickers Act would open a crack in that prohibition. The bill would allow the government to prosecute websites which knowingly help or promote sex trafficking, and also allow users to sue those websites.
Ah well, Reddit shit its pants and banned /r/deepfakes, /r/deepfakeservice and /r/facesets – partly, maybe mostly, due to someone called Shane, and also the usual corp dick-wads thinking they can put a stop to this if Reddit removes the content.
If you want to see videos of people dying though, you can still go to Reddit for that \o/
Reddit’s new policy “prohibits the dissemination of images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission, including depictions that have been faked”.
Following on from the FakeApp v1.1 tutorial (which should be used as a primer) comes the newest version, 2.1. Make sure you follow the instructions carefully; if you need help, you can find it within the Reddit communities.
One-button video creation.
One-button dataset creation.
More streamlined, cohesive UI.
Packaged into one installer, potential for desktop/start menu shortcuts.
Abstracted out command prompts.
Packaged FFMPEG, removing the need for manual video-to-image conversion.
Text fields replaced with more intuitive drop-downs.
Split some videos with your two desired faces into two sets of a few hundred frames each with a tool like FFMPEG. If you use FFMPEG, the command you want is: ffmpeg -i scene.mp4 -vf fps=[FPS OF VIDEO] "out%d.png". After splitting, run both directories of split frames through the “Extract” tool to produce training data
Switch to the “Train” tool, and input the paths of the training data produced in step 1 (it should be in a folder called “aligned”) as well as the “models” folder that ships with this project (which you can move somewhere convenient)
Train until the preview window shows results you are satisfied with
Split the video to be faked into frames and run the “Convert” tool on them to create faked frames, which can then be re-merged into a deepfaked video
Copy and reuse the same encoders for faster results in future fakes
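The FFMPEG portions of the steps above can be sketched as a short command sequence. Everything here is a placeholder sketch: the filenames (personA.mp4, personB.mp4), the directory names, and the 30 fps frame rate are assumptions — substitute your own clips and each clip's real frame rate.

```shell
# Split the two source videos into frames (assumed 30 fps clips;
# use the actual frame rate of each video).
mkdir -p faceA faceB merged
ffmpeg -i personA.mp4 -vf fps=30 "faceA/out%d.png"
ffmpeg -i personB.mp4 -vf fps=30 "faceB/out%d.png"

# ...run "Extract" on faceA/ and faceB/, train, then run "Convert"
# on the target video's frames, writing the faked frames to merged/ ...

# Re-assemble the converted frames into a video at the same frame rate.
ffmpeg -framerate 30 -i "merged/out%d.png" -c:v libx264 -pix_fmt yuv420p result.mp4
```

Keeping the split and merge frame rates identical avoids the output video playing faster or slower than the source.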
-CUDA 8.0 must be installed, and its bin folder must be included in the PATH environment variable.
-At least a few GB of free space on disk to allow the app to create Temp files
-Run fakeapp.bat to launch the app
-The error RuntimeError: module compiled against API version 0xc but this version of numpy is 0xb is just a warning related to how the alignment libraries were installed; the app will run properly despite it appearing, provided no other errors occur
-It may take 30-45 seconds after pressing the Start button for the app to unpack and start the training/merging scripts the first time
-You can quit training at any time by focusing the training window and pressing “q”
-Paths to models/data must be absolute, not relative
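Regarding the CUDA bin/PATH requirement above: on Windows you can eyeball echo %PATH%, but a generic way to verify a directory is actually on your PATH is a small POSIX-shell check like the one below. The CUDA_BIN location shown is an assumption — point it at wherever CUDA 8.0 is installed on your machine.

```shell
# Print "yes" if the given directory appears as a PATH component, else "no".
in_path() {
  case ":$PATH:" in
    *":$1:"*) echo "yes" ;;
    *)        echo "no"  ;;
  esac
}

# Assumed CUDA 8.0 install location -- adjust for your system.
CUDA_BIN="/usr/local/cuda-8.0/bin"
in_path "$CUDA_BIN"
```

If it prints "no", add the directory to PATH and restart the app before launching fakeapp.bat again.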