Deepfaking: How Bad Actors Are Using It

Deepfakes can be used to persuade, to spread disinformation, or just for fun. Bad actors, however, are finding ways to use artificial intelligence (AI) and deepfakes to heighten their cyber scams. Because AI resources and deepfake software are readily available, miscreants are using them to terrorize, extort, and scam their victims.

Sextortion
In this form of extortion, the scammer aims to convince victims that they possess explicit content of them and demands payment to keep it from being released on the internet or shared with friends and family. The cyber crook harvests images or audio of the victim from social media sites and other public forums, then fakes the person's appearance or voice in an embarrassingly lewd way.

Business Identity Risks
Deepfakes can be used to copy an executive’s image and/or voice to convince employees to transfer funds or share sensitive documents. In other cases, the faked image or voice is used to threaten and harass a victim by making it appear as if the person said or did something unethical.

Grandparent Scam
With AI, scammers can clone a young child’s voice and use it to call parents or grandparents, claiming to be in trouble and in need of money. The audio clips the crooks work from are often pulled from social media sites that host video, and only a small amount of audio is needed to clone a voice.

Cheapfakes
Cheapfakes are similar but rely on simpler methods, such as speeding up or slowing down audio to make a presenter appear intoxicated or disjointed. This is often used as a strategy to cause reputational damage. Photos and documents, such as a driver’s license, can be manipulated to show incorrect addresses or other details, and doctored invoices and other documents can then be used to file fraudulent insurance claims.
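
To see how low the technical bar is, the sketch below uses only Python's standard library to rewrite a WAV file's declared frame rate, which makes the same recording play back slower or faster. The file names and speed factor here are hypothetical placeholders, not taken from any real incident.

```python
# Minimal cheapfake sketch: rewriting a WAV file's declared frame rate
# changes playback speed (and pitch) with no AI involved at all.
import wave

SPEED = 0.8  # below 1.0 slows the audio down; above 1.0 speeds it up

with wave.open("original.wav", "rb") as src:  # hypothetical input file
    params = src.getparams()
    frames = src.readframes(src.getnframes())

with wave.open("altered.wav", "wb") as dst:  # hypothetical output file
    dst.setparams(params)
    # Declaring a lower frame rate makes players render the identical
    # samples more slowly, producing the slurred, "intoxicated" effect.
    dst.setframerate(int(params.framerate * SPEED))
    dst.writeframes(frames)
```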

Authentication Risks
AI and deepfake technology also put authentication strategies at risk: face and voice recognition systems can be compromised by synthetic media.

Proactive Steps
The first proactive step in avoiding this type of crime is employee education: raise awareness and provide a mechanism for reporting suspicious activity. Some deepfakes are difficult to spot, so give employees concrete tips. In audio, listen for gaps, odd pauses, or run-together sentences, and for a voice that sounds slightly off or lifeless. In video, long stretches without blinking, patchy skin coloring, or mouth movements out of sync with the audio can all indicate a fake. Strengthen your identity verification practices by requiring multi-factor authentication (MFA), and apply the principle of least privilege so that people have access only to the resources required to perform their jobs.
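
To illustrate why MFA raises the bar, the sketch below shows the time-based one-time password (TOTP, RFC 6238) math behind many authenticator apps, using only Python's standard library. The secret shown is a placeholder example; a real deployment should rely on a vetted MFA product rather than hand-rolled code.

```python
# Minimal TOTP sketch (RFC 6238): a six-digit code derived from a shared
# secret and the current 30-second time window.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(user_code: str, secret_b32: str) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(user_code, totp(secret_b32))

SECRET = "JBSWY3DPEHPK3PXP"  # placeholder base32 secret, not for real use
print("current code:", totp(SECRET))
```

Because the code changes every 30 seconds and depends on a secret the scammer never sees, a cloned voice or face alone is not enough to pass the check.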

Resources:
Deepfaking it: What to know about deepfake-driven sextortion schemes