Users look to others' behavior to guide their own decisions. Testimonials, ratings, user counts, and activity indicators all leverage social proof — but only when they feel authentic. Fabricated proof backfires.
In the 1970s, Robert Cialdini was trying to understand why people do what they do. One of his clearest findings was this: when people are uncertain, they do not deliberate -- they look sideways. They check what others are doing and use that as evidence for what they should do. He called it social proof, and documented it across everything from TV laugh tracks (we laugh more when we hear others laughing, even when we know the laughter is canned) to restaurant queues (we assume a busy restaurant is worth waiting for) to tip jars (the ones with money already in them receive more tips).
The logic is evolutionary and basically sound: if everyone around you is running, you should probably run too, even before you have seen what they are running from. Collective behaviour is usually better evidence than individual judgment. The problem is that this shortcut -- “other people are doing it, so it must be right” -- is trivially easy to fake. And once you understand how powerful social proof is, the temptation to manufacture it is enormous.
In product design, social proof shows up everywhere: user counts, star ratings, testimonials, press logos, “trending now” labels, “X people are viewing this” notifications, and “most popular” badges. Some of this is honest and valuable. Some of it is fabricated or heavily curated to mislead. The mechanism is the same in both cases. The ethics are not.
“We view a behaviour as more correct in a given situation to the degree that we see others performing it.”
— Robert Cialdini, Influence, 1984
A testimonial is only as convincing as it is specific and credible. The same product can be represented by a testimonial that actively undermines trust or one that genuinely persuades. The difference is in the details -- how specific the claim is, who is making it, and whether it reads as real.
No specific claims, no full names, no context. Most users read testimonials like these as fabricated — and they are often right.
Full names, titles, locations. Specific numbers. The 4-star review with a real limitation makes all three feel more trustworthy.
The third testimonial is 4 stars and includes a complaint about setup time. This is counterintuitive -- why would you put a critical review on your marketing page? Because it signals that the reviews are real. A page of exclusively 5-star glowing testimonials reads as curated. A page that includes a 4-star review with a specific criticism reads as honest -- and ironically increases trust in the positive reviews alongside it.
Booking.com, hotel sites, and e-commerce platforms pioneered “X people are viewing this right now” notifications. The implication: this is popular, you are competing, decide quickly. In many cases, these numbers are real -- or at least based on real data. In many others, they are algorithmically inflated, fabricated, or persist long after the actual activity has ended.
The two product pages below show the same item with the same stock level. The bad version uses every dark pattern in the live-activity playbook. The good version uses the same mechanism honestly.
Viewers, countdown, “only 3 left” — all on the same product. Most numbers are fabricated or recycled; the countdown resets on refresh.
Real 30-day purchase count. Stock level shown only when actually low, with a restock day. No fake scarcity. The genuine 4.6 rating does the persuasion work.
The honest version uses fewer signals and is more persuasive to anyone paying attention. A real 30-day purchase count (1,240) is a strong signal -- it is specific, it is a real time period, and it implies sustained demand rather than a single spike. A stock level shown only when genuinely low, with a restock date, is credible. Compare that to “only 3 left” which has been “only 3 left” for weeks -- experienced online shoppers know this pattern and discount it entirely.
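The honest version's display rules can be sketched in code. This is a minimal illustration, not an implementation from any real platform -- the function names, threshold, and data shape are all hypothetical. The key property is that every signal is derived from real data, and scarcity is only shown when it is true and verifiable:

```typescript
// Hypothetical helper for choosing which demand signals to show on a
// product page. Names, types, and the low-stock threshold are illustrative.

interface ProductStats {
  purchasesLast30Days: number; // from real order data, fixed time window
  stockLevel: number;          // actual inventory count
  restockDay?: string;         // e.g. "Thursday", if a restock is scheduled
}

const LOW_STOCK_THRESHOLD = 5;

function demandSignals(stats: ProductStats): string[] {
  const signals: string[] = [];

  // A real 30-day purchase count is specific and implies sustained
  // demand rather than a single spike.
  if (stats.purchasesLast30Days > 0) {
    signals.push(`${stats.purchasesLast30Days} bought in the last 30 days`);
  }

  // Show stock only when it is genuinely low, and pair the claim with a
  // restock date so it stays credible and checkable.
  if (stats.stockLevel > 0 && stats.stockLevel <= LOW_STOCK_THRESHOLD) {
    const restock = stats.restockDay
      ? ` (restocking ${stats.restockDay})`
      : "";
    signals.push(`Only ${stats.stockLevel} left${restock}`);
  }

  return signals;
}
```

With 1,240 purchases and 40 units in stock, this renders only the purchase count; the scarcity line appears only when inventory actually drops below the threshold, which is precisely why shoppers can keep trusting it.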
Social proof is among the most powerful persuasion tools available -- which is exactly why it requires the most care. Every fabricated review, every inflated user count, every fake countdown timer trains users to distrust all social proof, including yours. The platforms that have abused these mechanisms most aggressively have also produced the most sceptical users.
The bar for honest social proof is simple: would the signal change if you stopped controlling it? A real star rating from real users goes up and down based on product quality. A fabricated one does not. A real “X people bought this” counter reflects actual sales data. A fake one is a static number chosen to feel impressive. Real social proof is self-correcting. Fake social proof is fragile -- it only works until users notice it does not change.
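The distinction can be made concrete: a self-correcting signal is recomputed from the underlying data every time it is shown, so it moves when the data moves; the fabricated version is a constant. A toy sketch, with all names and numbers purely illustrative:

```typescript
// Contrast a rating derived from real reviews with a static, chosen number.
// Hypothetical example; names and values are not from any real product.

function liveRating(reviews: number[]): number {
  // Recomputed from actual review scores (1-5), rounded to one decimal.
  // It goes up and down with product quality: self-correcting.
  if (reviews.length === 0) return 0;
  const sum = reviews.reduce((a, b) => a + b, 0);
  return Math.round((sum / reviews.length) * 10) / 10;
}

function fabricatedRating(): number {
  // A static number picked to feel impressive. It fails the test
  // "would the signal change if you stopped controlling it?" -- it cannot.
  return 4.9;
}
```

Add a bad review and `liveRating` drops; `fabricatedRating` stays frozen at 4.9 forever, which is exactly the fragility users eventually notice.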
Cialdini, R. B. (1984). Influence: The Psychology of Persuasion. William Morrow.
Luca, M. (2011). Reviews, reputation, and revenue: The case of Yelp.com. Harvard Business School Working Paper.
Chevalier, J. A., & Mayzlin, D. (2006). The effect of word of mouth on sales: Online book reviews. Journal of Marketing Research, 43(3), 345--354.