Avatarify works with Skype and Zoom, swapping your face with a celebrity's in live video calls.
Video conferencing apps like Zoom and Skype are usually boring and often frustrating.
With more people than ever using this software to work from home, users are finding new ways to spice up endless remote meetings and group hangs: looping videos of themselves looking engaged, adding wacky backgrounds, and now, using deepfake filters to impersonate celebrities.
Everyone reading this will likely be familiar with deepfakes, the sometimes humorous and oftentimes scary technology that makes it possible to alter an existing image or video by digitally replacing a person’s likeness.
But while deepfakes are well known, the audio equivalent — capable of simulating the voice of a real person — hasn’t received quite the same level of coverage.
Cyber threats, once considered a nuisance only for big organizations, have become a pervasive menace. Estimates that cybercrime will cost the world $6 trillion by the end of 2020 stand as testimony to its spreading tentacles. Small or big ventures, the financial or the automation sector, no segment escapes its radar.
This is why we should educate the public about the existence of such technology.
Ladies and gentlemen, deepfake videos are so easy to create that anyone can make one. You do not need a PhD, you don't have to train models for hours, and you don't even have to take a course on Generative Adversarial Networks.
All you have to do is record a video of yourself and pick one photo of the person you want to impersonate. Machine learning will animate the person in the picture to match your video.
Just when you thought the wave of deepfakes had come to an end, a new app is here to prove you wrong. Instead of creating offensive or explicit deepfakes, this app aims to use the technology on a lighter note by letting you put yourself into any GIF.
The app is named Doublicat, and it uses RefaceAI, a Generative Adversarial Network (GAN), behind the scenes. If that sounds familiar, that's because it's the same AI technology used to morph Elon Musk into Dwayne Johnson.
Reddit updated its policies on Thursday to ban impersonation on its platform. The ban encompasses everything from deepfakes to individuals making false claims about their identities.
In a post about the ban, a Reddit admin explains that even though instances of impersonation are rare, they pose a threat when people impersonate a journalist or a politician, or when fake domains pose as something they are not. The post attributes the changes to ensuring “appropriate rules and processes” ahead of the 2020 election.
Deepfakes and other manipulated videos put the integrity of democratic elections at risk, a group of experts told the House Committee on Energy and Commerce Wednesday. What to do about it is a thorny question.
A hearing, titled “Americans at Risk: Manipulation and Deception in the Digital Age” and held by the subcommittee on Consumer Protection, focused on the wide range of fraud and manipulation on the internet. Monika Bickert, the vice president of Facebook global policy management, was joined by three other experts on the topic.
Artificially generated faces of people who don’t exist are being used to front fake Facebook (FB) accounts in an attempt to trick users and game the company’s systems, the social media network said Friday. Experts who reviewed the accounts say it is the first time they have seen fake images like this used at scale as part of a single social media campaign.
The accounts, which were removed by Facebook on Friday, were part of a network that generally posted in support of President Trump and against the Chinese government, experts who reviewed the accounts said. Many of the accounts promoted links to a Facebook page and website called “The BL.” Facebook said the accounts were tied to the US-based Epoch Media Group, which owns The Epoch Times newspaper, a paper tied to the Falun Gong movement that is similarly pro-Trump.