Verifying Video Authenticity for the Canadian Armed Forces

Video captured in military environments must be protected from tampering, and the Canadian Armed Forces required a way to identify inauthentic footage. Using AI to analyse full-motion video (FMV), we built tools that keep tampering from going unnoticed, support declassifying certain objects within a video, and detect fake content.


The Canadian Armed Forces, and other security organizations that utilise video data, need the ability to protect videos from tampering and verify their authenticity. Malicious tampering performed after a video is captured can be difficult or impossible to detect and trace to its source. Version control must also be maintained when downgrading and declassifying videos. A solution was required to detect and protect full-motion video from tampering during capture, storage, retrieval, and processing.


We developed a set of three tools to:

  • create a type of digital signature to prevent video tampering and improve video authentication,
  • semi-automatically declassify FMV using computer vision to detect objects of interest or remove sequences of frames from a video feed, and
  • detect fake content in the rapidly evolving field of deepfake videos, where users are able to create videos of individuals or events that look real, but are not.
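The first tool's tamper-evident signature can be illustrated with a chained keyed hash over the frame sequence: each frame's signature covers the previous signature, so altering, inserting, or removing any frame invalidates every signature that follows. This is a minimal sketch, not the actual toolset; the function names and the hard-coded `SECRET_KEY` are illustrative assumptions (a real deployment would use managed key material).

```python
import hashlib
import hmac

SECRET_KEY = b"demo-signing-key"  # illustrative only; not how real keys are handled

def sign_frames(frames):
    """Chain-hash a sequence of frame payloads: each HMAC digest covers
    the frame bytes plus the previous digest, so any edit invalidates
    all later signatures."""
    digest = b""
    signatures = []
    for frame in frames:
        digest = hmac.new(SECRET_KEY, digest + frame, hashlib.sha256).digest()
        signatures.append(digest)
    return signatures

def verify_frames(frames, signatures):
    """Recompute the chain and compare it to the stored signatures."""
    return sign_frames(frames) == signatures

frames = [b"frame-0", b"frame-1", b"frame-2"]
sigs = sign_frames(frames)
tampered = [b"frame-0", b"frame-X", b"frame-2"]
```

Verifying `frames` against `sigs` succeeds, while `tampered` fails because the altered second frame breaks the chain from that point onward.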


The machine-learning-based tools not only protect video from malicious tampering, but also allow for downgrading and declassifying video by automatically removing classified objects or scenes while maintaining the video's authenticity. On the sample datasets we tested, our algorithms achieved 94% accuracy.
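Removing a classified object while keeping the rest of the frame intact amounts to obscuring a detected region. The sketch below shows the idea with a stdlib-only box blur applied to one rectangle of a toy greyscale frame; `blur_region` and its parameters are illustrative assumptions, standing in for whatever detector and filter the production pipeline uses.

```python
def blur_region(frame, top, left, height, width, k=1):
    """Box-blur only the pixels inside the given rectangle: each is
    replaced by the mean of its (2k+1) x (2k+1) neighbourhood."""
    rows, cols = len(frame), len(frame[0])
    out = [row[:] for row in frame]  # copy; pixels outside the region are untouched
    for r in range(top, top + height):
        for c in range(left, left + width):
            window = [
                frame[rr][cc]
                for rr in range(max(0, r - k), min(rows, r + k + 1))
                for cc in range(max(0, c - k), min(cols, c + k + 1))
            ]
            out[r][c] = sum(window) // len(window)
    return out

# A toy 4x4 greyscale frame with one bright "sensitive" pixel at (1, 1).
frame = [
    [0, 0, 0, 0],
    [0, 255, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
redacted = blur_region(frame, top=1, left=1, height=1, width=1)
```

After the call, the sensitive pixel is averaged into its neighbourhood while every pixel outside the rectangle is unchanged, which is what lets the rest of the frame remain verifiable.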

We developed a method to authenticate visual media that can be used to track valid edits and identify tampering. This method can generate keys for media that has been validated by our tampering-detection techniques. The other toolset we developed declassifies videos by blurring sensitive information, such as faces and text, and can automatically obscure this information in videos intended for release. We also created tools for detecting deepfake videos, copy-move forgery, and spliced images.
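Copy-move forgery, one of the cases mentioned above, pastes a patch of an image elsewhere in the same image. Its simplest signature is an exact duplicate tile appearing at two positions, as this hedged sketch shows; the function name and toy image are assumptions, and real detectors compare robust block features (e.g. DCT coefficients) so recompressed or slightly altered copies still match.

```python
def find_duplicate_blocks(image, block=2):
    """Flag identical block x block tiles that appear at different
    positions -- the exact-copy case of copy-move forgery."""
    rows, cols = len(image), len(image[0])
    seen = {}    # tile contents -> first position seen
    matches = []
    for r in range(rows - block + 1):
        for c in range(cols - block + 1):
            tile = tuple(tuple(image[r + i][c:c + block]) for i in range(block))
            if tile in seen:
                matches.append((seen[tile], (r, c)))
            else:
                seen[tile] = (r, c)
    return matches

# The 2x2 patch at (0, 0) has been copied to (2, 0).
image = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [1, 2, 9, 10],
    [5, 6, 11, 12],
]
print(find_duplicate_blocks(image))  # → [((0, 0), (2, 0))]
```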
