AI Voice Scams Are Exploding: Here’s How to Outsmart Criminals Imitating Your Loved Ones

AI voice scams are rising


Scammers use AI to clone a loved one’s voice (often from short clips online) and combine it with spoofed caller ID to sound convincing. They tell urgent, emotional stories (arrest, accident, kidnapping, lost wallet, etc.) to pressure targets into sending money quickly.


Why it works

Urgency + familiarity = panic. When the caller ID shows a known name and the voice sounds familiar, your brain can bypass skepticism and comply with the request before verifying it.


How to protect your family


1] Agree on a family safe word


Pick a private phrase only your family knows. Ask for it immediately whenever money is requested; hang up if it’s wrong.


2] Limit personal info online

Reduce public posts that reveal travel plans, family names, or voice clips; these details make scams easier to personalize.


3] Strengthen banking safeguards

Enable real-time transaction alerts and set transfer limits that require extra confirmation, so rushed payments are slowed down.


4] Screen suspicious calls with tech

Use reverse phone lookup tools and AI call-blocking to check or block dubious numbers before you engage.


Key red flags to watch for


  • Urgent secrecy (“don’t tell anyone,” “need it now”)
  • Refusal to verify identity
  • Pressure to send money quickly instead of confirming via trusted channels

If in doubt, hang up and call the person back on their real number.


Related:


Personal Cybersecurity: How Individuals Defend Themselves in the Age of Digital War
