AI Voice Cloning Is Now Being Used in Romance Scam Phone Calls, Researchers Warn
Security researchers and law enforcement officials report that romance scammers are increasingly using AI voice cloning to conduct convincing phone calls, moving beyond fake profile photos to real-time audio impersonation.
What Researchers Found
Security researchers and FBI analysts have identified a growing subset of romance scam operations that incorporate AI voice cloning into their workflow. Rather than avoiding phone calls (the traditional scammer approach), these operations conduct calls using synthetic voices generated from short audio samples. The technique allows scammers to speak with targets in real time, dramatically increasing the perceived authenticity of the relationship.
The Washington Times reported in March 2026 that AI-generated voices are now present in a meaningful share of the romance scam complaints filed with the FTC, with some victims describing calls that "sounded completely real."
How Voice Cloning Is Used in Scams
The typical deployment begins after a text-based relationship has been established. The scammer proposes a voice call, an act that traditionally signals legitimacy. Instead of a real person, the target hears a synthetic voice generated in real time from a cloned voice model. The scammer types responses into a system that converts text to speech in the cloned voice, introducing a slight delay that the scammer attributes to a "bad connection."
Voice cloning requires only a few seconds of audio from the target persona — easily obtained from public social media videos if the scammer is impersonating a real person, or generated from scratch if the persona is entirely fabricated.
Beyond the Profile Photo
The emergence of AI voice calls represents an escalation beyond profile photo fraud. Historically, the request for a live call was the simplest way to verify whether a match was genuine — a real person could get on a call; a scammer operating at scale could not. AI voice cloning removes this safeguard.
The next reliable verification layer is a live video call, which is significantly harder to fake in real time, though AI video synthesis is advancing. Beyond video, a reverse face search that confirms the person has a consistent public identity remains the most accessible off-platform verification step.
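To make the idea of image-based verification concrete: production reverse face search services match face embeddings, not raw pixels, but the underlying notion of comparing compact image fingerprints can be illustrated with a toy average hash. Everything below is an illustrative sketch; the 8x8 grid size and the distance threshold are assumptions, not anything a real service documents.

```python
# Toy illustration of image fingerprinting (NOT real reverse face search,
# which compares learned face embeddings). An average hash reduces an
# image to 64 bits; near-identical images produce near-identical bits.

def average_hash(pixels):
    """pixels: 64 grayscale values (an 8x8 thumbnail, row-major).

    Returns a 64-bit fingerprint with a 1 bit wherever the pixel
    is brighter than the image's mean brightness.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two fingerprints.

    A small distance suggests the two images are the same picture,
    possibly recompressed or lightly edited.
    """
    return bin(a ^ b).count("1")
```

A service doing this at scale would index fingerprints of public photos and return close matches; a profile photo with zero matches anywhere on the public web is itself a signal worth noting.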
How to Verify When a Call Feels Real
When a voice call feels convincing but other signals are ambiguous, several verification steps remain reliable:
- Request a live video call rather than just an audio call. Ask the person to wave or perform a specific gesture in real time to confirm it is not a pre-recorded video.
- Run a reverse face search on their profile photo. A voice can be cloned from anyone; the profile photo is the identity anchor.
- Search the person's name and claimed employer independently before the conversation reaches a financial request.
- Note the call delay. AI voice synthesis in 2026 still introduces a 0.5–2 second response latency that exceeds normal conversational rhythm.
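The latency cue in the last bullet can be roughed out in code. This is a minimal sketch, not a detection tool: the 0.5-second threshold comes from the latency range described above, while the "most gaps are slow" fraction is an assumption, chosen because a consistently long gap is more telling than one slow reply.

```python
# Minimal sketch: flag a call where response gaps consistently exceed
# the delay typical of type-to-speech voice cloning. Thresholds are
# illustrative assumptions, not calibrated values.

SUSPECT_MIN = 0.5    # assumed lower bound of TTS round-trip delay, seconds
MIN_FRACTION = 0.8   # assumed share of slow gaps needed to raise a flag

def flag_synthetic_latency(gaps, threshold=SUSPECT_MIN,
                           min_fraction=MIN_FRACTION):
    """Return True if most response gaps exceed the threshold.

    gaps: response delays in seconds, measured from the end of your
    utterance to the start of the other party's reply.
    """
    if not gaps:
        return False
    slow = sum(1 for g in gaps if g >= threshold)
    return slow / len(gaps) >= min_fraction
```

In practice no one times a call with a stopwatch; the point is that the pattern to notice is a uniform, repeated lag on every reply, not a single slow answer.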
Any contact that has passed a voice call test but still introduces investment links, crypto platforms, or payment requests should be treated as a red flag regardless of how natural the call felt.
Verify a face before you trust it
Upload a photo to 221B and search the public web for matching faces. A real person leaves traces; a fake one usually does not.
