If you grew up in the 1980s, you may remember Max Headroom, the fictional, computer-generated TV character with a grainy appearance and crackling sound. Max probably wouldn’t have been considered a ‘deepfake’ by his contemporaries – the term was not even coined until decades later – but he is an early example.
Max has come a long way
Fast-forward to today. Artificial intelligence (AI) has advanced by leaps and bounds, and the clarity and believability of the fakes it can produce have grown with it. Not to mention that the technology has moved into the mainstream, with software and apps readily available to create fake images and sounds – no experience necessary.
The greatest advancement with deepfakes has come from the development of Generative Adversarial Networks (GANs), a type of machine learning that uses a given data set to train two separate networks – a generator network and a discriminator network – to compete against each other to create fake data that looks real. For example, the generator network takes inputs such as audio and video recordings of an executive and uses them to create a realistic copy of that executive. The discriminator network is trained to determine whether a given sample is real or fake, effectively pushing the generator to continuously improve its attempts to mimic humans. While there are business benefits to be realized from the use of GANs – increased efficiencies and content creation, to name a few – the danger posed by the deepfakes they help create is very real.
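For readers curious what that adversarial loop looks like in practice, here is a minimal sketch, assuming Python with the PyTorch library. The toy one-dimensional data, network sizes, and training settings are purely illustrative, not a real deepfake system – production models are vastly larger and train on audio and video – but the generator-versus-discriminator competition is the same idea.

```python
# Minimal GAN sketch (illustrative only): a generator and a discriminator
# compete on a toy 1-D Gaussian data set. All sizes and values are arbitrary.
import torch
import torch.nn as nn

torch.manual_seed(0)
LATENT_DIM = 8

# Generator: maps random noise to a fake "sample".
generator = nn.Sequential(nn.Linear(LATENT_DIM, 16), nn.ReLU(), nn.Linear(16, 1))

# Discriminator: scores how likely a sample is to be real (1) vs. fake (0).
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(2000):
    # "Real" data: samples from a distribution the generator tries to imitate.
    real = torch.randn(64, 1) * 0.5 + 3.0
    fake = generator(torch.randn(64, LATENT_DIM))

    # 1) Train the discriminator to tell real from fake.
    d_opt.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    d_opt.step()

    # 2) Train the generator to fool the discriminator.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_loss.backward()
    g_opt.step()

# After training, the generator's fakes drift toward the real distribution.
print(generator(torch.randn(5, LATENT_DIM)).detach())
```

The key design point is the tug-of-war in the loop: each time the discriminator gets better at spotting fakes, the generator is pushed to produce more convincing ones – which is exactly why the output of mature GAN systems can be so hard to distinguish from the real thing.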
With the ability to create a true replication of anyone’s voice and image, the rise in deepfake incidents is not surprising. According to identity verification company Onfido’s 2024 Identity Fraud Report, there was a 3,000% increase in deepfake attempts from 2022 to 2023 alone.
Seeing shouldn’t always be believing
The ability to create dupes so convincing that an individual believes they are having a live conversation with someone they know and trust should be of significant concern to the finance and banking sector in particular.
In a widely reported incident earlier this year, a finance employee at a large global company unknowingly paid out $25 million to bad actors after receiving a request from the company CFO during a video call that turned out to be a deepfake. While it is shocking to think this could happen, the incident highlights the significant risk of fraudulent financial transactions and the high-quality technology driving that risk. In fact, there was a 700% increase in deepfake incidents aimed at the fintech sector from 2022 to 2023, according to a report from identity verification platform Sumsub.
From identity theft and account takeover fraud to the reputational risk that exists from processing fraudulent transactions, the magnitude of damage that can be done by a deepfake attack cannot be overstated. A recent report from Deloitte’s Center for Financial Services predicts that generative AI could enable fraud losses to reach $40 billion in the U.S. by 2027, up from $12.3 billion in 2023.
How to protect your company…and your customers
Banks and other financial institutions are already tasked by regulators with following certain Know Your Customer (KYC) protocols to prevent financial crimes, such as money laundering, but the increasing prevalence of deepfakes highlights the need to do more.
Investing in deepfake detection tools such as biometrics can be costly, but fortunately there are steps you can take that require little or no investment, such as:
- Ongoing and mandatory fraud awareness training to help employees recognize common schemes and suspicious behaviors and understand the importance of following all protocols all the time.
- Routinely reviewing internal controls and updating processes to reflect the evolving risk environment.
- Implementing dual authorization processes for individuals with account access, payments responsibilities and other financial roles.
- Validating every customer and vendor transaction request with a phone call to the number on record. Being asked to call a different number should always be a red flag.
Your banking partner can help
Financial institutions also play a role in ensuring their customers are aware of common fraud schemes and that they understand how to help mitigate the risk. Maintaining a close relationship with your banker so they understand you, your business and your transaction habits can go a long way toward catching fraud early.
It is equally important to look for a financial institution that challenges you to think critically about your processes so that you know they are as airtight as possible. Your financial institution should encourage you to:
- Enable receipt of fraud alerts when there is suspicious account activity;
- Enable system notifications for when account details have been changed;
- Utilize multi-factor authentication; and
- Confirm the authenticity of vendor payment change requests by calling the vendor at the number on record.
It is okay to stop, hit pause and double check the authenticity of a request – no matter who it is from. Your company’s security depends on it. For more information on how Synovus can help your organization mitigate business email compromise (BEC) fraud, complete a short form and a Synovus Treasury Consultant will contact you.
Aubrey LaBoda is Executive Director and Head of Treasury Management Sales at Synovus Bank.