No, helping strangers use their smartphones will not expose you to ‘AI biometric identity fraud’
IN SHORT: Warnings that strangers asking for help using a smartphone may actually be scammers conducting “AI biometric identity fraud” are just scaremongering. Even if well-intentioned, these warnings misrepresent how biometric security and impersonation scams actually work.
“STOP HELPING STRANGERS WITH THEIR PHONES,” warns a frantic message that has been widely shared on Facebook and WhatsApp, where it has been forwarded to Africa Check for verification.
The message warns of an increase in a “help-seeking scam”, in which a scammer approaches strangers and asks for help with a smartphone, adding: “Thirty minutes could ruin your life financially!!!”
The scam is described as “AI biometric identity fraud”. The message claims that scammers secretly record a victim’s biometric details – identifying characteristics of a person, such as fingerprints and appearance – and then use these details to create a “digital clone” of that person. This “clone” can supposedly be used later on to bypass the biometric security checks associated with digital banking and other financial services.
Except this is not true. There is no evidence that this particular method of fraud has become popular or is even in use. Similar forms of impersonation are possible but work very differently and don’t require physical interaction between scammers and victims.
Digital impersonation doesn’t work this way
If there is a sliver of truth to this warning, it is the increasing ability of technology to mimic characteristics like someone’s voice or appearance. However, the warning gets many things wrong about how this kind of fraud is actually carried out.
For example, take the following claims from the message: “When you touch the phone (fingerprint), read numbers or verification codes (voice), or face the screen while talking or operating it (facial movements), your three core biometric identities – *fingerprint, voice, and face* – may be stolen. Modern AI can create a digital clone almost identical to you.”
A scammer wouldn’t need to trick you into holding a phone in order to record some of your biometric details; a smartphone might actually make the process less convenient. This is because most modern smartphones use obvious icons or lights called “privacy indicators” or “recording indicators” to let users know when the phone’s microphone or camera is in use. A particularly tech-savvy scammer could disable these indicators, but no technical knowledge is needed to use an ordinary microphone or camera to record a person’s face or voice.
Some biometric data would also be useless to a scammer because of how biometric security works. Smartphones such as the iPhone store fingerprint and facial recognition data only on the device itself, in dedicated secure hardware (Apple calls its version the Secure Enclave). This data is never sent to the manufacturer, a banking service, or any other party.
For example, imagine logging in to a banking app on a smartphone. The first time you log in, the app asks for the secure password associated with your digital banking account. The app can then save this password (safely, so that it is not accessible to everyone who picks up your phone) and reuse it to log you in next time. However, the app still needs a way to confirm that not just anyone is trying to access your banking details. This is where biometric security comes in.
The bank assumes that, if you have set up fingerprint ID on your phone, you have told your phone to trust only your own fingerprints. When you log into your banking app with a fingerprint, your phone checks it against the list of prints you’ve previously told it to trust, and sends your banking app a message that essentially says, “I trust this print!” It doesn’t tell your bank who the print belongs to, or even what it looks like, but this is enough to instruct the app to log you in using the secure password it saved earlier.
All this means that, even if a stranger somehow took a copy of your fingerprint, they would still need your smartphone itself to get into your banking app. They could not simply present the copy to your bank and gain access to your account.
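For technically curious readers, here is a minimal sketch of what that hand-off looks like in an Android app, using the platform’s BiometricPrompt interface. The function name and the onVerified callback are our own illustrative choices, not code from any real banking app. The key point is visible in the callback: the app is told only that authentication succeeded, and the fingerprint itself never leaves the operating system.

import androidx.biometric.BiometricPrompt
import androidx.core.content.ContextCompat
import androidx.fragment.app.FragmentActivity

// Minimal sketch: how an app typically asks the operating system to
// verify a fingerprint. The app never sees the print itself -- only a
// success or failure signal from the OS.
fun promptForFingerprint(activity: FragmentActivity, onVerified: () -> Unit) {
    val executor = ContextCompat.getMainExecutor(activity)

    val callback = object : BiometricPrompt.AuthenticationCallback() {
        override fun onAuthenticationSucceeded(
            result: BiometricPrompt.AuthenticationResult
        ) {
            // The OS matched the print against those enrolled on the device.
            // `result` contains no fingerprint data -- just confirmation.
            onVerified() // e.g. unlock the saved banking credential
        }

        override fun onAuthenticationError(errorCode: Int, errString: CharSequence) {
            // The user cancelled, or too many failed attempts were made.
            // No biometric data is exposed either way.
        }
    }

    val promptInfo = BiometricPrompt.PromptInfo.Builder()
        .setTitle("Confirm it's you")
        .setNegativeButtonText("Cancel")
        .build()

    BiometricPrompt(activity, executor, callback).authenticate(promptInfo)
}

This design is deliberate: apps can ask the operating system “is this the right person?”, but they never receive the biometric data used to answer the question.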
Deepfakes and the way fraud really works
There are certain kinds of fraud in which artificial intelligence (AI) is increasingly used for impersonation. Rather than trying to trick biometric security systems, fraudsters are usually trying to trick other humans.
For example, in 2024, a finance worker at a multinational company was tricked into making a HK$200 million (about US$25 million) payment to scammers after being instructed to do so on a video call by someone he thought was the company’s chief financial officer. In reality, the scammers had used sophisticated technology to mimic the faces and voices of not just the CFO but also other employees at the company, including the worker’s colleagues.
The Southern African Fraud Prevention Service’s information platform, Yima, describes the tactics of impersonation scams:
Scammers may pretend to be government officials, or representatives from banks, telecoms, retailers, insurers, etc. They may even impersonate colleagues, public figures, friends or family. They use emails, SMSs, messaging apps, social media platforms, and phone calls to impersonate legitimate organisations or people. Often, the caller ID or email domains are ‘spoofed’ to make them appear authentic but are actually fake.
Once again, physically interacting with a victim or asking them for help with a smartphone is neither a common nor an efficient way to impersonate them. It is possible to create convincing fake video or audio recordings of a victim using publicly available materials, such as videos posted on social media.
Banks and other organisations that require strict security have been working to combat this kind of impersonation. As a result, they may ask for additional confirmation that you are who you say you are when you sign in to an account on a new device or from an unusual location. Measures like multi-factor authentication add steps to the login process and make it much more secure.
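To illustrate why those extra steps help, here is a simplified sketch of a time-based one-time password (TOTP), the kind of six-digit code generated by authenticator apps, following the publicly documented RFC 6238 recipe. The secret key is shared only between the service and your authenticator app, so a scammer who has copied your face or voice, or even stolen your password, still cannot produce the current code.

import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec
import java.nio.ByteBuffer

// Illustrative sketch of a time-based one-time password (TOTP, RFC 6238).
// The code changes every 30 seconds, so a stolen password -- or a copied
// face or fingerprint -- is not enough on its own to log in.
fun totp(secret: ByteArray, timeMillis: Long = System.currentTimeMillis()): String {
    val counter = timeMillis / 1000 / 30              // 30-second time step
    val msg = ByteBuffer.allocate(8).putLong(counter).array()

    // Key the HMAC with the secret shared between you and the service.
    val mac = Mac.getInstance("HmacSHA1")
    mac.init(SecretKeySpec(secret, "HmacSHA1"))
    val hash = mac.doFinal(msg)

    // Dynamic truncation (RFC 4226): pick 4 bytes at an offset in the hash.
    val offset = hash.last().toInt() and 0x0f
    val binary = ((hash[offset].toInt() and 0x7f) shl 24) or
                 ((hash[offset + 1].toInt() and 0xff) shl 16) or
                 ((hash[offset + 2].toInt() and 0xff) shl 8) or
                 (hash[offset + 3].toInt() and 0xff)

    return "%06d".format(binary % 1_000_000)          // six-digit code
}

Because the code depends on both the shared secret and the current 30-second window, even an intercepted code becomes useless almost immediately.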
If you follow the best security practices recommended by your bank and other financial service providers, you can minimise your chances of becoming a target of such a scam. One important thing you can do is be cautious about any communications that supposedly come from friends and family members.
Impersonation scams often take the form of “urgent” messages or calls that ask for money or personal information. Even if you believe an urgent call is coming from a trusted family member, pause before sharing any sensitive information and make sure that you are talking to the real person.
You can learn more about impersonation scams on the Yima website.
And as long as you’re not giving away personal information, you don’t need to worry about briefly helping a stranger use their phone.
