Made with 🧠 and 🫀 by Youssef Bouksim


Social Proof

Users look to others' behavior to guide their own decisions. Testimonials, ratings, user counts, and activity indicators all leverage social proof — but only when they feel authentic. Fabricated proof backfires.

5 min read · Marketing · Product · UX

In the 1970s, Robert Cialdini was trying to understand why people do what they do. One of his clearest findings was this: when people are uncertain, they do not deliberate -- they look sideways. They check what others are doing and use that as evidence for what they should do. He called it social proof, and documented it across everything from TV laugh tracks (we laugh more when a studio audience is laughing) to restaurant queues (we assume a busy restaurant is worth waiting for) to tip jars (the ones with money already in them receive more tips).

The logic is evolutionary and basically sound: if everyone around you is running, you should probably run too, even before you have seen what they are running from. Collective behaviour is usually better evidence than individual judgment. The problem is that this shortcut -- “other people are doing it, so it must be right” -- is trivially easy to fake. And once you understand how powerful social proof is, the temptation to manufacture it is enormous.

In product design, social proof shows up everywhere: user counts, star ratings, testimonials, press logos, “trending now” labels, “X people are viewing this” notifications, and “most popular” badges. Some of this is honest and valuable. Some of it is fabricated or heavily curated to mislead. The mechanism is the same in both cases. The ethics are not.

✦ Six forms of social proof in product design
✓ User numbers. “10 million users,” “trusted by 50,000 teams.” These work because scale implies validation -- that many people cannot all be wrong. They are honest when accurate and current. They become misleading when the number is outdated, inflated, or represents signups rather than active users.
✓ Ratings and reviews. Star ratings aggregate many individual opinions into a single signal. They are among the most trusted forms of social proof because they feel harder to fake than copy. The dark patterns here include selectively displaying only positive reviews, incentivising reviews (which biases the sample), and suppressing negative ones.
✓ Testimonials. Specific quotes from named, real people. The most persuasive testimonials are specific (“it saved me 6 hours a week”), come from someone the reader identifies with, and include enough context to feel credible. Generic testimonials (“great product!”, no last name, no company) are perceived as fabricated by most readers and do more harm than good.
✓ Expert proof. Endorsements from recognised authorities -- press coverage, certifications, awards, academic partnerships. These borrow credibility from an institution rather than from users. They are most effective when the authority is genuinely relevant to what is being evaluated (a cybersecurity certification on a security product) and least effective when it is not (a generic business award on a niche SaaS tool).
✓ Peer activity signals. “3 people bought this in the last hour.” “27 people are viewing this.” “Trending in Morocco.” These create real-time social proof that is highly persuasive -- and highly abused. Many e-commerce and travel platforms display fabricated or algorithmically inflated activity numbers.
✓ Social network proof. “Your colleague Sara uses this.” “Followed by people you know.” These are among the most effective forms because they are personal -- you trust the specific people referenced more than anonymous crowds. The privacy implications mean this form needs careful handling.
“We view a behaviour as more correct in a given situation to the degree that we see others performing it.”
— Robert Cialdini, Influence, 1984

Testimonials -- the most used, most abused form of social proof

A testimonial is only as convincing as it is specific and credible. The same product can be represented by a testimonial that actively undermines trust or one that genuinely persuades. The difference is in the details -- how specific the claim is, who is making it, and whether it reads as real.

Before — Weak testimonials
yourapp.com/reviews
What our customers say
4.9 out of 5
“This product is amazing! It completely changed how I work. Highly recommended to everyone!”
SM
Sarah M.
Verified buyer
“Great product, very easy to use. The team is super helpful. Love it!”
JD
John D.
Verified buyer
“5 stars. Would recommend to anyone. Does exactly what it says.”
AK
Alex K.
Verified buyer

No specific claims, no full names, no context. Most users read testimonials like these as fabricated — and they are often right.

After — Strong testimonials
yourapp.com/reviews
What our customers say
4.9 from 2,841 verified reviews
“We cut our weekly reporting time from 6 hours to 40 minutes. Our analyst was spending half her Monday on dashboards — now she spends it on actual analysis.”
MS
Mia Santos
Head of Growth, Fintech startup, Lagos
“I was sceptical — every tool promises this. Switched from Notion and had my first automated report in 22 minutes. That was four months ago. We have not looked back.”
JP
James Park
Solo founder, SaaS tool, Seoul
“Setup took longer than I expected — about a day to connect all our sources. But since then, zero issues. The time savings have been significant for a team our size.”
AC
Ana Costa
Operations Manager, Agency, Lisbon

Full names, titles, locations. Specific numbers. The 4-star review with a real limitation makes all three feel more trustworthy.

The third testimonial is 4 stars and includes a complaint about setup time. This is counterintuitive -- why would you put a critical review on your marketing page? Because it signals that the reviews are real. A page of exclusively 5-star glowing testimonials reads as curated. A page that includes a 4-star review with a specific criticism reads as honest -- and ironically increases trust in the positive reviews alongside it.


Live activity signals -- the most frequently faked social proof

Booking.com, hotel sites, and e-commerce platforms pioneered “X people are viewing this right now” notifications. The implication: this is popular, you are competing, decide quickly. In many cases, these numbers are real -- or at least based on real data. In many others, they are algorithmically inflated, fabricated, or persist long after the actual activity has ended.

The two product pages below show the same item with the same stock level. The bad version uses every dark pattern in the live-activity playbook. The good version uses the same mechanism honestly.

Before — Fake urgency
shop.yourapp.com/headphones
HOT ITEM
SoundPro X3 Headphones
$129.99
$79.99
47 people are viewing this right now
Offer ends in: 11:47
Only 3 left in stock — 12 sold in the last hour!
847 sold this week

Viewers, countdown, ‘only 3 left’ — all on the same product. Most numbers are fabricated or recycled; the countdown resets on refresh.

After — Honest activity signals
shop.yourapp.com/headphones
4.6 / 2,841 reviews
SoundPro X3 Headphones
$79.99
Free shipping / Ships in 2 days
1,240 people bought this in the last 30 days
8 remaining — we restock every Tuesday
30-day returns / 2-year warranty

Real 30-day purchase count. Stock level shown only when actually low, with a restock day. No fake scarcity. The genuine 4.6 rating does the persuasion work.

The honest version uses fewer signals and is more persuasive to anyone paying attention. A real 30-day purchase count (1,240) is a strong signal -- it is specific, it is a real time period, and it implies sustained demand rather than a single spike. A stock level shown only when genuinely low, with a restock date, is credible. Compare that to “only 3 left” which has been “only 3 left” for weeks -- experienced online shoppers know this pattern and discount it entirely.
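The honest signals above can be computed directly from real data rather than invented. A minimal sketch in Python, assuming order timestamps and a stock count are available -- the function names, the 10-unit low-stock threshold, and the restock day are illustrative, not a prescription:

```python
from datetime import datetime, timedelta

def recent_purchase_count(purchase_times, days=30, now=None):
    """Count real purchases in a trailing window. The number moves on
    its own as orders come in and age out -- it is self-correcting."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=days)
    return sum(1 for ts in purchase_times if ts >= cutoff)

def stock_badge(stock, low_threshold=10, restock_day="Tuesday"):
    """Show a scarcity message only when stock is genuinely low.
    Returning None means: render no badge at all."""
    if 0 < stock <= low_threshold:
        return f"{stock} remaining — we restock every {restock_day}"
    return None
```

The key design choice is the `None` branch: when scarcity is not real, the honest version renders nothing, instead of falling back to a static "only 3 left".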


Applying this to your work

Social proof is among the most powerful persuasion tools available -- which is exactly why it requires the most care. Every fabricated review, every inflated user count, every fake countdown timer trains users to distrust all social proof, including yours. The platforms that have abused these mechanisms most aggressively have also produced the most sceptical users.

The bar for honest social proof is simple: would the signal change if you stopped controlling it? A real star rating from real users goes up and down based on product quality. A fabricated one does not. A real “X people bought this” counter reflects actual sales data. A fake one is a static number chosen to feel impressive. Real social proof is self-correcting. Fake social proof is fragile -- it only works until users notice it does not change.

✓ Apply it like this
→ Use specific, verifiable numbers -- "2,841 reviews" and "1,240 bought in 30 days" are more convincing than round numbers like "thousands of happy customers."
→ Show full names, job titles, and locations in testimonials -- specificity signals authenticity. An initial and a grey avatar signal fabrication.
→ Include some criticism -- a handful of 3-4 star reviews mixed in makes the positive ones more believable. A page of only 5-star reviews raises suspicion.
→ Only show scarcity signals when they are genuinely true -- "8 remaining" is persuasive when it is real; users who see it reset learn to ignore it entirely.
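The "include some criticism" advice can even be enforced in the code that picks featured reviews. A hedged sketch, assuming each review carries a star count and text; the function name is illustrative, and text length is used as a deliberately crude stand-in for specificity:

```python
def pick_testimonials(reviews, n=3):
    """Select n reviews to feature: mostly 5-star, but always including
    at least one 3-4 star review, so the set reads as genuine."""
    # Longer reviews tend to contain the specific claims that persuade
    # (a crude heuristic -- a real system would score specificity better).
    positive = sorted((r for r in reviews if r["stars"] == 5),
                      key=lambda r: -len(r["text"]))
    critical = sorted((r for r in reviews if 3 <= r["stars"] <= 4),
                      key=lambda r: -len(r["text"]))
    if critical:
        return positive[:n - 1] + critical[:1]
    return positive[:n]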
✗ Common mistakes
→ Fabricated activity signals -- "47 people viewing this" when the number is generated by an algorithm or simply made up. Experienced users recognise these and distrust everything else on the page.
→ Countdown timers that reset -- a sale that "ends in 11:47" but resets every time you visit is a lie, not a frame. It destroys trust the moment the user notices.
→ Incentivised reviews without disclosure -- rewarding reviews with discounts biases the sample, and most platforms require the incentive to be disclosed.
→ Outdated user counts -- "10 million users" from a press release three years ago, displayed as present-tense fact, is misleading if the active user base is far smaller.

Cialdini, R. B. (1984). Influence: The Psychology of Persuasion. William Morrow.
Luca, M. (2011). Reviews, reputation, and revenue: The case of Yelp.com. Harvard Business School Working Paper.
Chevalier, J. A., & Mayzlin, D. (2006). The effect of word of mouth on sales: Online book reviews. Journal of Marketing Research, 43(3), 345--354.