Microsoft Releases Tool to Combat Deepfakes
Microsoft has unveiled a new tool designed to help detect deepfakes in real time.
‘Deepfake’ is a term for photos or video that have been altered or generated by artificial intelligence (AI). This is in contrast to so-called ‘shallowfakes,’ media altered using traditional editing methods and software. As the technology continues to evolve and improve, deepfakes will become increasingly difficult to identify, far harder than spotting shallowfakes.
Security experts have been warning that deepfakes will have major ramifications across business, politics and life in general. A deepfake released at the right moment could have profound implications for an election, ruin a person’s career or leave someone vulnerable to blackmail. As a result, companies have been scrambling to come up with ways to reliably identify altered photos and videos.
Microsoft’s latest effort, Microsoft Video Authenticator, is a major step in that direction. Video Authenticator is designed to “analyze a still photo or video to provide a percentage chance, or confidence score, that the media is artificially manipulated,” write Tom Burt, Corporate Vice President of Customer Security & Trust, and Eric Horvitz, Chief Scientific Officer.
When analyzing video, the software scores each frame in real time by looking for artifacts the naked eye may miss, such as blending boundaries and subtle fading or greyscale elements.
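Microsoft has not published Video Authenticator’s internals or a public API, but the per-frame scoring it describes follows a familiar pattern: decode the video, run each frame through a detector, and report a confidence score for that frame. The sketch below illustrates that general workflow in Python with OpenCV; score_frame is a hypothetical placeholder for a trained detector, not Microsoft’s model.

```python
# Hypothetical sketch of frame-by-frame manipulation scoring.
# Video Authenticator's API is not public; score_frame stands in for
# any per-frame deepfake detector that returns a confidence in [0, 1].
import cv2  # OpenCV, used here only to decode video frames


def score_frame(frame) -> float:
    """Stand-in scorer, NOT a real detector.

    Returns a dummy confidence so the pipeline runs end to end. A real
    detector would be a trained model examining blending boundaries and
    subtle fading or greyscale artifacts in the frame.
    """
    return 0.0


def score_video(path: str):
    """Yield (frame_index, confidence) pairs for each frame in the video."""
    capture = cv2.VideoCapture(path)
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:  # end of stream
            break
        yield index, score_frame(frame)
        index += 1
    capture.release()


if __name__ == "__main__":
    # Usage: flag frames whose manipulation confidence exceeds a threshold.
    for i, confidence in score_video("clip.mp4"):
        if confidence > 0.8:
            print(f"frame {i}: likely manipulated ({confidence:.0%})")
```

Reporting a score per frame, rather than a single verdict per clip, lets a reviewer pinpoint exactly where in a video the suspected manipulation occurs.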
It’s safe to say that deepfakes will fuel the next digital arms race, pitting those who create and spread them against those working to detect and counter them. Fortunately, companies like Microsoft are pulling out all the stops to stay ahead of the threat.