Deepfakes are coming for American democracy. Here’s how we can prepare.

There are a number of plausible reasons why cheapfakes have outpaced deepfakes in the political domain. One is that, despite their crudeness, cheapfakes spread widely and can capture public debate and discourse.

On pure cost-benefit grounds, fakers may opt to get more bang for their buck by using existing, proven techniques for editing and manipulating media.

This is a snippet from an article by The Washington Post; you can read the full article here….

Deep Fake Yourself into Movie & TV Scenes & Viral GIFs with the Reface App

Deep fakes, the art of leveraging artificial intelligence to insert the likeness and/or voice of people into videos they don’t otherwise appear in, typically focus on celebrity parodies or political subterfuge.

This is a snippet from an article by Gadget Hacks; read the full article here…

This Open-Source Program Deepfakes You During Zoom Meetings, in Real Time

Avatarify runs on Skype and Zoom, and face-swaps your own face with a celebrity in live video calls.

Video conferencing apps like Zoom and Skype are usually boring and often frustrating.

With more people than ever using this software to work from home, users are finding new ways to spice up endless remote meetings and group hangs: looping videos of themselves looking engaged, adding wacky backgrounds, and now using deepfake filters to impersonate celebrities.

This is a snippet from an article by Vice; you can read the full article here…

Deepfakes for voice are here

Everyone reading this will likely be familiar with deepfakes, the sometimes humorous and oftentimes scary technology that makes it possible to alter an existing image or video by digitally replacing a person’s likeness.

But while deepfakes are well known, the audio equivalent — capable of simulating the voice of a real person — hasn’t received quite the same level of coverage.

This is a snippet from an article by Digital Trends, read the full article here…

Five top cyber threats that may decisively disrupt businesses in 2021

Deepfakes coming in at number 2!

Cyber threats, once considered a mere nuisance for big organizations, have become a pervasive menace. Estimates that cybercrime will cost the world $6 trillion by the end of 2020 stand as testimony to its spreading tentacles. Small or big ventures, financial or automation sector, no segment escapes its reach.

This is a snippet from an article by IEN Europe, read the full article here…

Realistic Deepfakes in 5 Minutes on Colab

And why we should educate the public about the existence of such technology.

Ladies and gentlemen, deepfake videos are so easy to create that anyone can make one. You do not need a PhD, you don’t have to train models for hours, and you don’t even have to take a course on Generative Adversarial Networks.

All you have to do is record a video of yourself and pick one photo of the person you want to impersonate. Machine learning will animate the person in the picture to match the motion in your video.

This is a snippet from an article by
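The workflow above — drive a still photo with the motion from your own video — is what systems like the first-order motion model do with learned, dense neural warps. As a purely illustrative sketch (none of these function names come from any real deepfake library), the simplest ingredient can be shown in a few lines of numpy: estimate how the driving face’s keypoints moved between two frames, then apply that same motion to the source photo’s keypoints.

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares affine transform mapping src_pts -> dst_pts.

    Both inputs are (N, 2) arrays of 2-D keypoints; returns a (2, 3)
    matrix A such that [x, y, 1] @ A.T approximates the destination point.
    """
    n = src_pts.shape[0]
    X = np.hstack([src_pts, np.ones((n, 1))])        # homogeneous coords, (N, 3)
    W, *_ = np.linalg.lstsq(X, dst_pts, rcond=None)  # solves X @ W ~ dst, (3, 2)
    return W.T

def transfer_motion(source_kp, driving_kp_t0, driving_kp_t1):
    """Move the source keypoints the same way the driving keypoints moved
    between frame t0 and frame t1 of the driving video."""
    A = estimate_affine(driving_kp_t0, driving_kp_t1)
    n = source_kp.shape[0]
    X = np.hstack([source_kp, np.ones((n, 1))])
    return X @ A.T

# Toy demo: the driving face shifted right by 2 units between two frames,
# so the source photo's keypoints should shift the same way.
t0 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
t1 = t0 + np.array([2.0, 0.0])
src = np.array([[5.0, 5.0], [6.0, 7.0]])
moved = transfer_motion(src, t0, t1)
```

A real system replaces the affine fit with a neural network that predicts a dense motion field and then renders photorealistic pixels, but the estimate-motion-then-reapply structure is the same.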

This Deepfake App Helps You Make Fun GIF Memes

Just when you thought the wave of deepfakes has come to an end, a new app is here to prove you wrong. Instead of creating offensive or explicit deepfakes, this app aims to use the technology on a lighter note by letting you put yourself into any GIF.

The app is named Doublicat, and it uses RefaceAI, a Generative Adversarial Network (GAN), behind the scenes. If that sounds familiar, that’s because it’s the same AI technology that was used to morph Elon Musk into Dwayne Johnson.

This is a snippet from an article by Beebom; you can read the full article here…
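The snippet above name-drops a Generative Adversarial Network. The adversarial idea — a generator trying to produce fakes that a discriminator can no longer tell from real data — can be shown with a deliberately tiny numpy sketch. This toy learns to mimic a 1-D Gaussian rather than faces; it is an illustration of the GAN training loop in general, not of RefaceAI’s actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples from N(4, 0.5). Generator G(z) = a*z + b turns
# standard normal noise into fakes; discriminator D(x) = sigmoid(w*x + c)
# scores how "real" a sample looks. Both are trained adversarially.
real_mu, real_sigma = 4.0, 0.5
w, c = 0.1, 0.0   # discriminator parameters
a, b = 1.0, 0.0   # generator parameters
lr = 0.05

for step in range(2000):
    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    real = rng.normal(real_mu, real_sigma, 64)
    z = rng.normal(0.0, 1.0, 64)
    fake = a * z + b
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    w -= lr * np.mean(-(1 - d_real) * real + d_fake * fake)
    c -= lr * np.mean(-(1 - d_real) + d_fake)

    # Generator step: push D(fake) toward 1 (non-saturating GAN loss).
    z = rng.normal(0.0, 1.0, 64)
    fake = a * z + b
    d_fake = sigmoid(w * fake + c)
    grad_fake = -(1 - d_fake) * w   # gradient of -log D(fake) w.r.t. fake
    a -= lr * np.mean(grad_fake * z)
    b -= lr * np.mean(grad_fake)

# After training, the generator's output distribution drifts toward the
# real data's mean of 4.0.
samples = a * rng.normal(0.0, 1.0, 10000) + b
```

Face-swapping GANs like the one behind Doublicat replace these two scalar functions with deep convolutional networks over images, but the tug-of-war between generator and discriminator is the same.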

Reddit bans impersonation content, including deepfakes

Reddit updated its policies on Thursday to ban impersonation on its platform. The ban encompasses everything from deepfakes to individuals making false claims about their identities. 

In a post about the ban, a Reddit admin explains that even though instances of impersonation are rare, they pose a threat when people impersonate a journalist or a politician, or set up fake domains posing as something else. The post attributes the changes to ensuring “appropriate rules and processes” ahead of the 2020 election.

This is a snippet from an article by Digital Trends; you can read the full article here…

Facebook global policy chief says addressing manipulated videos is a top priority

Deepfakes and other manipulated videos put the integrity of democratic elections at risk, a group of experts told the House Committee on Energy and Commerce Wednesday. What to do about it is a thorny question.

A hearing, titled “Americans at Risk: Manipulation and Deception in the Digital Age” and held by the subcommittee on Consumer Protection, focused on the wide range of online fraud and manipulation on the internet. Monika Bickert, the vice president of Facebook global policy management, was joined by three other experts on the topic.

This is a snippet from an article by CNET; you can read the full article here…

Now fake Facebook accounts are using fake faces

Artificially-generated faces of people who don’t exist are being used to front fake Facebook (FB) accounts in an attempt to trick users and game the company’s systems, the social media network said Friday. Experts who reviewed the accounts say it is the first time they have seen fake images like this being used at scale as part of a single social media campaign.

The accounts, which were removed by Facebook on Friday, were part of a network that generally posted in support of President Trump and against the Chinese government, experts who reviewed the accounts said. Many of the accounts promoted links to a Facebook page and website called “The BL.” Facebook said the accounts were tied to the US-based Epoch Media Group, which owns The Epoch Times newspaper, a paper tied to the Falun Gong movement that is similarly pro-Trump.

This is a snippet from an article by CNN; read the full article here…