How criminals are using deepfake audio

News & Politics


Introduction

The emergence of deepfake voice programs has raised new concerns about the potential for scams as artificial intelligence technology advances. These systems are becoming increasingly adept at generating speech that closely resembles a specific person's voice. Social media is teeming with examples of deepfakes, including one particularly remarkable creation in which AI-generated audio of Morgan Freeman is nearly indistinguishable from the real Hollywood star.

One notable line from that clip is: "I am not Morgan Freeman, and what you see is not real." The statement underscores how easily deepfake AI can be exploited, especially when people are not face-to-face and have no way to verify the identity of the person they are interacting with.

Criminals with a sufficiently large voice sample of a target can create convincing audio using commercially available AI software. Scammers then use these fake voices to contact the target's loved ones, often under the guise of an emergency, pleading for financial assistance. Because the imitations are so realistic, victims are frequently deceived.

Businesses are also at risk. A prominent example dates to January 2020, when a bank in the UAE lost a staggering $35 million after a branch manager was tricked into transferring funds by a caller impersonating a company director. As AI technology continues to improve, detecting these deepfake audio scams is likely to become increasingly challenging.

Keywords

deepfake, audio, scams, artificial intelligence, Morgan Freeman, impersonation, financial assistance, emergency, realistic imitations, AI technology

FAQ

Q: What are deepfake audio programs?
A: Deepfake audio programs are AI-driven technologies that can create speech that mimics specific individuals, making it difficult to distinguish from the real person's voice.

Q: How are criminals using deepfake audio?
A: Criminals use deepfake audio to impersonate individuals and scam their loved ones, often posing as someone in need of emergency financial assistance.

Q: Can deepfake audio be used against businesses?
A: Yes. Businesses are at risk, as demonstrated by incidents in which company representatives were tricked into transferring large sums to scammers impersonating executives.

Q: How convincing are deepfake audio scams?
A: Deepfake audio scams can be highly convincing, leading victims to believe they are communicating with real individuals, often resulting in financial losses.

Q: Will it become easier or harder to detect deepfake scams over time?
A: With ongoing advancements in AI technology, it is expected that detecting deepfake scams will become increasingly difficult.