How crooks use deepfakes to scam your business

All the materials and tools needed to create deepfake videos – from source code to publicly available images and account authentication bypass services – are readily available and for sale on the public internet and underground forums.

According to researchers at Trend Micro, cybercriminals take advantage of this easy access to resources and use deepfakes to build on today’s criminal techniques like Business Email Compromise (BEC) to make even more money. Furthermore, deepfakes are used in web ads to make Elon Musk, security specialists and others appear to endorse products they are unrelated to.

“The increasing occurrence of deepfake attacks is significantly changing the threat landscape for organizations, financial institutions, celebrities, political figures and even ordinary people,” Vladimir Kropotov, Fyodor Yarochkin, Craig Gibson and Stephen Hilt of the security firm warned in research published Tuesday.

We’re told deepfakes are a particular concern for businesses as criminals begin to create entirely fake people, such as bogus job applicants.

Over the summer, the FBI said it had received an increasing number of complaints about the use of deepfake videos during interviews for tech jobs that involve access to sensitive systems and information.

Once they convince someone to hire them, deepfake actors can use their fake identities to trick unsuspecting customers or employees into sharing payment information, or use their newly granted network access to explore IT assets, steal corporate data, deliver ransomware or worse.

Just last month, a Binance PR executive claimed crooks created a deepfake “AI hologram” of him to defraud cryptocurrency projects via Zoom video calls.

“It turns out that a sophisticated hacking team used previous news interviews and television appearances over the years to create a ‘deep fake’ of me,” Patrick Hillman, chief communications officer at the crypto hyper-mart, claimed at the time. “Aside from the 15 pounds I gained during COVID being noticeably absent, this deep fake was sophisticated enough to fool several highly intelligent members of the crypto community.”

The Binance deepfake is notable because, according to Trend Micro, bypassing verification with deepfakes has been a hot topic on underground forums since 2021, with many forum users looking for ways to cheat online banking and digital financial verification, the security researchers say.

Additionally, deepfake production tools are bought and sold on nefarious online souks, or are available in the open on GitHub, along with bots that can facilitate the creation of deepfake videos, the researchers added, citing the Telegram bot RoundDFbot as an example.

Deepfake + existing scam = more money for crooks

According to Trend Micro, criminals use deepfakes for a variety of proven attack vectors and scams, and researchers expect to see more of them in the near future.

These include messenger scams and BEC, which have proved extremely profitable even without fake videos. Rogues can use deepfakes to impersonate executives or business partners and request money transfers, making these scams even more believable to the victims involved.

Criminals can also use stolen identities in combination with deepfake videos to open new bank accounts or create accounts for government services, the security researchers warn. Likewise, criminals can take over accounts that use video calls for identity verification.

“You can hijack a financial account and just withdraw or transfer money,” wrote Kropotov, Yarochkin, Gibson, and Hilt. “Some financial institutions require online video verification to enable certain features in online banking applications. Obviously, such checks could also be a target of deepfake attacks.”

While we have already seen deepfakes used in disinformation campaigns, particularly related to the Russian invasion of Ukraine, these fake videos can also be used in extortion attacks: imagine fabricated “evidence” being used to pressure organizations into paying a ransom, the researchers note.

Trend Micro also puts Amazon’s Alexa “on the target list of deepfake criminals.” However, Alexa is not alone. Any device that uses voice recognition – whether it’s to reorder cat food or open the door to a secure wing of a hospital – could be hijacked by deepfakes.

The good news is that organizations can take steps to protect themselves. At the top of that list is the use of multi-factor authentication, which the security firm says “should be the standard for any sensitive or critical account authentication.”

The researchers also suggest using three things to authenticate users: something the user has, something the user knows, and something the user is.
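
To make that three-factor idea concrete, here is a minimal, hypothetical sketch in Python of how a service might combine a knowledge factor (a password checked against a stored hash), a possession factor (a time-based one-time code from a device or app), and an inherence factor (a score from a separate biometric matcher) before granting access. The function names, field names, and thresholds are illustrative assumptions, not anything prescribed by Trend Micro.

```python
import hashlib
import hmac
import struct
import time

# Hypothetical illustration of combining the three authentication factors:
#   - something the user knows: a password, checked against a stored PBKDF2 hash
#   - something the user has:   a 6-digit TOTP code generated by a device/app
#   - something the user is:    a match score supplied by an external biometric verifier
# The user record layout and the 0.95 threshold below are made-up examples.

def verify_password(stored_salt: bytes, stored_hash: bytes, password: str) -> bool:
    """Knowledge factor: constant-time comparison of the supplied password's hash."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), stored_salt, 200_000)
    return hmac.compare_digest(candidate, stored_hash)

def verify_totp(shared_secret: bytes, code: str, window: int = 1) -> bool:
    """Possession factor: check a 6-digit time-based one-time password (RFC 6238 style)."""
    step = int(time.time()) // 30
    for offset in range(-window, window + 1):
        msg = struct.pack(">Q", step + offset)
        digest = hmac.new(shared_secret, msg, hashlib.sha1).digest()
        pos = digest[-1] & 0x0F
        value = (struct.unpack(">I", digest[pos:pos + 4])[0] & 0x7FFFFFFF) % 1_000_000
        if f"{value:06d}" == code:
            return True
    return False

def verify_biometric(match_score: float, threshold: float = 0.95) -> bool:
    """Inherence factor: accept a score reported by a separate biometric matcher."""
    return match_score >= threshold

def authenticate(user: dict, password: str, totp_code: str, match_score: float) -> bool:
    """Grant access only if all three factors check out."""
    return (
        verify_password(user["salt"], user["pw_hash"], password)
        and verify_totp(user["totp_secret"], totp_code)
        and verify_biometric(match_score)
    )
```

The point of the sketch is simply that no single factor, including a video call that could be deepfaked, is enough on its own; in practice the biometric score would come from a dedicated verifier or the device’s secure hardware rather than being passed in directly.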

Also, educate employees on what to watch out for when it comes to deepfake technology. “For verification of sensitive accounts (e.g. bank or company profiles), users should prioritize using biometric templates that are less accessible to the public, such as irises and fingerprints,” the researchers advised. ®

