221B News · 3 min read

Survey: 84% of U.S. Online Daters Say AI Has Made It Harder to Trust Their Matches

New survey data reveals a deepening crisis of trust on dating platforms, with the overwhelming majority of users saying they cannot reliably tell whether a profile photo is real or AI-generated.


Key Survey Findings

A survey of active online daters in the United States published in early 2026 found that 84% of respondents said AI deepfakes had made it harder to trust their matches. A further 61% reported having encountered at least one profile they believed was fake or AI-generated in the past 12 months.

The numbers represent a significant shift from similar surveys conducted in 2022 and 2023, when the equivalent figures were 41% and 53% respectively — suggesting that user awareness of AI fakery is growing rapidly alongside the technology itself.

Why Trust Has Collapsed

Survey respondents identified three main factors driving distrust: AI-generated profile photos that look professional but feel "off" in ways users could not articulate; matches that were reluctant or unable to do a live video call; and profiles with sparse post histories and no mutual connections.

A separate study from a university research group found that participants judged AI-generated faces to be real more often than they judged genuinely real faces to be real, confirming that the human ability to spot AI images has not kept pace with generator quality.

What Users Are Doing About It

In response to deepening distrust, a growing share of online daters report using external verification tools before agreeing to meet a match. Requesting a live video call before meeting was the most commonly cited method (71% of respondents), followed by reverse image search (38%) and a Google search of the person's name (34%).

Reverse face search — which searches for the same face across public web pages rather than matching pixel-for-pixel — was used by 12% of respondents, a figure that has tripled since 2024.
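The distinction above, matching a face rather than matching pixels, usually comes down to comparing embedding vectors produced by a face-recognition model: two photos of the same person land close together in embedding space even when the images themselves share no pixels. A minimal sketch, using random toy vectors in place of a real embedding model (the 128-dimension size, the `0.6` threshold, and both helper functions are illustrative assumptions, not any particular tool's API):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_face(emb_a: np.ndarray, emb_b: np.ndarray, threshold: float = 0.6) -> bool:
    """Treat two embeddings as the same face when similarity clears a threshold."""
    return cosine_similarity(emb_a, emb_b) >= threshold

# Toy vectors standing in for the output of a real face-embedding model.
rng = np.random.default_rng(0)
face = rng.normal(size=128)
# Same person, different photo: the embedding shifts only slightly.
same_person_new_photo = face + rng.normal(scale=0.1, size=128)
# An unrelated person: an independent vector, near-orthogonal in high dimensions.
different_person = rng.normal(size=128)

print(is_same_face(face, same_person_new_photo))  # True
print(is_same_face(face, different_person))       # False
```

This is why a match can be found across entirely different photos of one person, and also why an AI-generated face, which exists in no other photo anywhere, returns nothing.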

The Verification Gap

Consumer researchers describe a "verification gap" between what users want — reliable confirmation that a profile is attached to a real person — and what available tools deliver. Reverse image search catches stolen photos but misses AI-generated faces. Platform verification features confirm liveness but not identity claims like name, age, or profession.
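The first half of that gap is mechanical: classic reverse image search relies on near-duplicate matching, often via perceptual hashes that survive re-saving and light edits, so a stolen photo matches its source but a freshly generated face matches nothing. A minimal sketch of the idea using a simple "average hash" over synthetic image arrays (the function names and the 64x64 toy images are illustrative assumptions, not a production search pipeline):

```python
import numpy as np

def average_hash(img: np.ndarray, size: int = 8) -> int:
    """Perceptual 'average hash': downscale to size x size, 1 bit per cell."""
    h, w = img.shape
    small = img.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    bits = (small > small.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes; small means near-duplicate."""
    return bin(a ^ b).count("1")

rng = np.random.default_rng(1)
photo = rng.random((64, 64))
# A stolen, re-saved copy: tiny pixel noise, same underlying image.
stolen_copy = np.clip(photo + rng.normal(scale=0.01, size=(64, 64)), 0.0, 1.0)
# A brand-new image (e.g. an AI-generated face): no shared structure at all.
unrelated = rng.random((64, 64))

d_copy = hamming_distance(average_hash(photo), average_hash(stolen_copy))
d_new = hamming_distance(average_hash(photo), average_hash(unrelated))
print(d_copy, d_new)  # the copy's distance is far smaller than the new image's
```

The re-saved copy hashes almost identically to the original, while the unrelated image disagrees on roughly half its bits, so duplicate detection fires on theft but stays silent on a face that has no original to match.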

Reverse face search tools that check whether a face has a public footprint across the web sit in this gap: they do not confirm identity, but they can confirm whether the face has any verifiable public history — a signal most AI-generated faces will fail.

Tags

online dating trust AI 2026 · dating app fake profiles statistics · AI deepfake dating survey · catfishing statistics 2026 · online dating safety research

Verify a face before you trust it

Upload a photo to 221B and search the public web for matching faces. A real person leaves traces; a fake one usually does not.

Upload photo to search →