Beware of Deepfake video technology scams

May 7, 2023 10:44

Banks and authorities have recently issued repeated warnings urging vigilance against calls asking to borrow money, even when the request is confirmed via video call.

Knowing that social network users have grown wary of money-transfer scams carried out over text messages, high-tech criminals are now exploiting face- and voice-splicing software (Deepfake AI) to create fake videos that look exactly like a victim's acquaintances and use them to commit fraud.

According to Vietnam Prosperity Joint Stock Commercial Bank (VPBank), criminals often hijack Zalo and Facebook accounts, or create accounts that mimic those of the victims' relatives, then collect images and voice samples and use Deepfake technology to create fake videos.

They then use the fake or stolen social media accounts to send messages asking to borrow money, requesting transfers, or claiming that a relative of the victim is in danger and urgently needs money...

The trick itself is not new, but it has become more sophisticated: the scammer now makes a video call and plays the fake video during it to "verify" the identity behind the fake account, lending credibility to the scam. These videos can convince viewers that they are seeing and hearing a real person speak.

VPBank warns that fake videos bear a close resemblance to the person being impersonated, making them hard to distinguish from the real thing. To identify a Deepfake video, users can look for signs such as whether the footage appears pre-recorded, and whether the answers relate directly to the questions asked or are generic and out of step with the actual conversation.

However, to cover up easily detectable flaws, scammers often produce videos with muffled audio and unclear images, resembling a video call over an unstable connection made in an area with a weak mobile or wifi signal...

Deepfake AI was originally created for entertainment, letting users insert their own faces and voices into videos of their favorite characters while keeping movements that look like real-life footage. Beyond such entertainment uses, however, criminals have exploited the technology to create fake videos of other people in order to commit scams or spread fake news online.

The use of Deepfake technology for fraud is predicted to increase in the near future. Moreover, fraud cases involving Deepfake videos are complicated and time-consuming to resolve, requiring the involvement of many parties, such as network operators and the police, before users can prove to the bank that they have been scammed.

Therefore, users should learn about this technology and the telltale signs that distinguish fake videos and images from real ones. Fake videos are often short, with blurry images, jerky movement, facial expressions out of sync with the speech, little body movement, muffled audio, and an unnatural voice with no pauses...

To avoid becoming a victim of Deepfake fraud, VPBank recommends that users of high-tech devices and applications limit the sharing of images, videos containing their real voice, and personal information on social networks.

Be especially vigilant with calls asking to borrow money. Carefully check the details of the receiving account, and call a phone number you already know to verify with the relative before transferring any money.

If you suspect, or have fallen victim to, Deepfake fraud, notify your relatives and acquaintances immediately to minimize the damage. At the same time, report the incident to the nearest police station, the bank's hotline, or the nearest bank branch for support.

In addition, the National Cyber Security Monitoring Center (NCSC) under the Department of Information Security (Ministry of Information and Communications) advises that if you share videos or clips online, you should distort the voice, replacing it with a robot or AI voice, so that bad actors cannot learn your real voice.

If you encounter fake information, images, or videos, notify others immediately and report them to the authorities at canhbao.ncsc.gov.vn or to the Chongluadao project at https://chongluadao.vn.

NCSC also provides information on recognizing cyber fraud at dauhieuluadao.com.

According to VNA
