
Deepfake Technology: A Threat to Celebrities and Private Users – Experts Warn of Cybercriminal Tactics

“A single mother from Warsaw became a millionaire thanks to MrBeast’s app” – that is the title of a video circulating online, presented as if it had been broadcast on the “19:30” news program on TVP1. In the video, journalist Marek Czyż encourages viewers to download a gambling application.

The short “report” also features the alleged lottery winner and MrBeast, a popular American YouTuber, while the imagery of PKO BP bank and its mobile banking app lends the whole thing added credibility. The video ends with a call to click the embedded link and “enter the game”.


Of course, TVP never broadcast a report about a “single mother” who won millions thanks to a famous YouTuber’s gambling app. The material was fabricated from start to finish, with cybercriminals using deepfake technology.

Videos created by AI make it possible to put almost any words into almost anyone’s mouth, including things they never said. The only requirement for producing such material is access to a sufficient number of authentic video and audio recordings of the person in question. It is on this basis that the AI generates the new footage.

– explain experts from ESET.

Fraudsters use the images of well-known people to attract the attention of potential victims, gain their trust and then extort money. In such a fake recording, a celebrity may, for example, encourage a risky investment or – as in the alleged material featuring Marek Czyż – urge viewers to install an application with malware embedded in it.

The ultimate goal of the fraudsters is usually to extract data or money from unsuspecting users, and exploiting the trust we place in public figures is one of their most cynical tactics.

Deepfake using the image of Marek Czyż. Photo: ESET

Deepfake technology poses a threat not only to celebrities. “It could destroy our reputation”

Just a few years ago, unmasking doctored videos and images was relatively easy, but over time it has become much harder. Today it is increasingly difficult to determine whether a given piece of material is a product of AI. Tools for creating deepfakes are at every cybercriminal’s fingertips.

– explains Kamil Sadkowski, an analyst at ESET’s antivirus laboratory.

The expert also points out that fraud exploiting celebrities’ images is only one of the problems arising from the development of deepfake technology. Another is the possibility of fabricating such recordings featuring private individuals.

Many people have published videos over the years that can now be used to steal our image and fabricate statements we never made. In this way, our loved ones, for example, can be deceived, and our reputation destroyed. It may soon turn out that blackmail and fraud via such videos become as common as the so-called “grandchild” scam

– warns Sadkowski. “That is why it is so important to carefully filter the content we post online and to verify information, even when it appears to come from people we know or trust,” he adds.

2024-01-08 11:48:00
