Protect yourself from AI scams
Learn how to spot the use of AI in scams and how you can protect yourself.
AI voice cloning scams
Scammers can use AI voice cloning to impersonate someone you know – like a friend or family member – to convince you to send them money.
AI voice cloning has become so advanced that it can be hard to tell the difference between AI and the real person.
Example scam
You get a call from your 'sibling' asking for urgent help and an immediate money transfer. But the call was never from your sibling – a scammer used AI to mimic their voice.
View tips on AI voice cloning scams
Vocal inconsistencies
Listen for subtle signs that suggest the person on the other end is not real:
A monotone voice
Lack of human sounds (e.g. clearing the throat)
Unusual pacing and pauses in sentences
Repetitive phrasing
Sense of urgency
Whether AI is involved or not, nearly all scams create a sense of urgency so you act quickly. Always pause to question an out-of-the-blue call, even if it seems to come from a loved one.
Unknown number
Be wary any time you receive a call from an unknown number, even if the caller sounds like a family member or friend.
Unable to answer personal questions
Ask questions that only your loved one would know the answer to, e.g. 'When did I last see you and what did we do?' This can help you figure out whether the person you are speaking to is a scammer.
Unwillingness to explain further
Ask probing questions. Scammers are less likely to provide details and will continue to urge you to transfer money.
Strange behaviour
Watch for any uncharacteristic behaviour over the phone. If your ‘loved one’ sounds different, hang up and contact them on a different number.
Deepfake scams
AI can create realistic fake videos, known as deepfakes, showing people saying or doing things they never did.
Deepfake videos can be convincing because the AI studies photos and recordings of a person to create their deepfake version.
Example scam
A celebrity video appears online promoting an investment that promises huge returns. You invest, only to discover that the video, and the offer, were fake.
View tips on Deepfake scams
Too good to be true?
If something seems too good to be true, it probably is. Even if a video has gone viral, always question whether the opportunity is real, and cross-check the information you see with other sources.
Look for glitches
Keep an eye out for instances where the face or head in the video seems to shift or vibrate unnaturally. Deepfake videos can also have mismatched lip-syncing, where the audio doesn't perfectly align with the video.
Unnatural facial expressions
Watch closely for any unnatural blinking, eye movement or facial expressions. In deepfake videos, facial expressions may not fully match the emotion being conveyed.
Blurry edges
Look closely at the edges of the face. There may be some blurry edges and distortions where the face meets the background.
AI chatbot scams
Some scammers use AI chatbots to pose as real people – from online love interests to fake customer service agents.
They're designed to build trust and eventually ask for money or personal information.
Example scam
You've been chatting online for months. They suddenly ask for money to buy a plane ticket to visit you – and then disappear. The person never existed: the images were AI-generated, and you were messaging an AI chatbot the whole time.
View tips on AI chatbot scams
Verify who they are
There are tools you can use to check whether the person you are talking to is real – such as reverse image searches and video calls – but the best way to verify them is to meet face-to-face.
Never send money to a stranger
Do not send money to anyone you meet online regardless of the urgency of the request.
Be wary of incoming direct messages
Get a direct message (DM) out of the blue from a celebrity? It's likely a scam. If you get a DM from a stranger online, be sceptical and do your research before responding. Ask yourself, 'Why would a celebrity contact me?'
AI phishing scams
Phishing emails and texts are getting smarter.
With AI, scammers can now create perfectly written, highly personalised messages that look just like they came from your bank or other trusted companies.
Example scam
You receive an email from your 'bank' informing you of suspicious activity and asking you to click a link to secure your account. The link leads to a fake login page designed to steal your details.
View tips on AI Phishing scams
Suspicious links
Be wary of clicking any unfamiliar links in emails or SMS. Even if the email seems legitimate, always keep an eye out for any inconsistencies or errors before clicking on a URL.
Fake domains
Scam email addresses don't always look obviously fake. If you do not recognise the email address, do not act on the email before verifying it by calling the sender on a trusted number.
Request for more information
Phishing scams always request some form of information from you. Never enter your details through links in emails or SMS. Instead, contact the sender through official channels to verify any request.
Multi-factor authentication (MFA)
Adding another form of authentication to your email accounts and Internet Banking makes it harder for a scammer to gain access.
AI is always evolving
AI is constantly evolving, and with it, so are AI scams. But who knows how to beat a scam? You do. That's who.
Stay alert, question anything that feels unusual, and double-check before acting on out-of-the-blue calls, emails or offers. If in doubt, pause and verify with a trusted friend or family member.
Sources
ABC News article: Eddie McGuire deepfake video financial scam
McAfee blog: A guide to deepfake scams and AI voice spoofing
The information in this communication is general in nature and is intended to raise awareness about common scam tactics and preventative measures. While the information may assist you in mitigating your exposure to scams and fraud, this is not guaranteed in any way. Examples are illustrative only and are subject to the assumptions and qualifications disclosed. Whilst care has been taken in preparing the content, no liability is accepted for any errors or omissions in this communication, and/or losses or liabilities arising from any reliance on this communication.