
How phone scammers are using AI to imitate voices

Introduction

Phone scams have long been a menace, but advances in artificial intelligence (AI) are making them increasingly sophisticated and believable. Scammers have begun using AI to clone voices, an alarming tactic that can fool even wary listeners. In a recent report, CBS's Carter Evans explored the frightening implications of this technology through the experience of Jennifer De Stefano, who was targeted with a cloned version of her daughter's voice.

Jennifer De Stefano will never forget the frantic call that seemed to come from her 15-year-old daughter, Brianna. A distressed voice, indistinguishable from Brianna's, cried out: "Mommy! Help me! Help me!" Moments later, an aggressive man came on the line, claiming to have her daughter and demanding a ransom of one million dollars. When Jennifer said she could not possibly pay that amount, the scammer lowered the demand to fifty thousand dollars. Jennifer paid nothing, later discovering that the call was a sophisticated scam built on AI voice-cloning technology.

Americans lost nearly $9 billion to fraud last year alone, a staggering 150% increase in losses over just two years. Notably, younger people report fraud and financial losses at higher rates than older people, but older victims often have more money at stake.

Cybersecurity expert Pete Nicoletti demonstrated the alarming capabilities of voice cloning by recreating Evans' voice from old news reports available online, showing how convincingly the technology could mimic him. To illustrate the risk, the cloned voice was used to call Evans' mother and ask for sensitive information, including her driver's license number. She fell for the AI-generated deception, underscoring just how persuasive the technology can be.

Nicoletti emphasized that in today's digital landscape, people can no longer take voices, photos, or videos at face value, because AI tools have evolved to the point where they can be manipulated and weaponized against them. Safeguarding one's voice is becoming as important as protecting one's fingerprints. Soon, scammers may even be able to hold these conversations with victims in real time, making the deception still harder to detect.

Given these advances in AI-driven scams, protecting yourself is imperative. Immediate steps include agreeing on a safe word known only to family members, setting social media accounts to private to limit the information available to scammers, and, if you receive a suspicious call, hanging up and calling the person back directly to verify the situation.


Keywords

  • Phone scams
  • Artificial intelligence (AI)
  • Voice cloning
  • Cybersecurity
  • Fraud
  • Safe word
  • Social media privacy

FAQ

1. What are AI voice scams?
AI voice scams involve the use of artificial intelligence technology to clone the voices of individuals, creating realistic impersonations that scammers use to deceive victims.

2. How much money did Americans lose to scams last year?
Americans lost nearly $9 billion to fraud last year, representing a 150% increase in losses over a two-year period.

3. What steps can I take to protect myself from AI voice scams?
Create a safe word known only to family members, make your social media accounts private, and verify any suspicious call by hanging up and calling the person back directly to confirm the situation.

4. Why are younger individuals more affected by fraud?
Younger people tend to experience fraud and financial losses more frequently than older individuals, potentially due to increased online interactions and less awareness of scam tactics.

5. What role does social media play in these scams?
Scammers can use publicly available information from social media to create more convincing impersonations and manipulate their victims effectively.