
Hawthorne Effect

People modify their behavior when they are aware of being watched or measured. In UX research, this means test participants behave differently than real users. In products, visibility features like activity feeds, dashboards, and streaks deliberately harness this effect to drive engagement.

5 min read Β· Product Β· UX Β· Research

Between 1924 and 1932, researchers at the Hawthorne Works factory in Illinois ran a series of experiments on worker productivity. They varied lighting levels, break schedules, and work hours, expecting to find optimal physical conditions. Instead, they found something unexpected: productivity increased regardless of what they changed. Brighter lights? Productivity went up. Dimmer lights? Productivity went up. More breaks? Up. Fewer breaks? Also up.

The conclusion was that the workers were not responding to the physical changes. They were responding to the attention. They knew they were being observed, and that awareness -- not the experimental conditions -- was driving the behavioral change. Henry Landsberger later named this the Hawthorne effect: people modify their behavior when they know they are being watched.

In product design, the Hawthorne effect operates in two directions. First, it is a research bias: usability test participants behave differently than real users because they know they are being observed. Second, it is a design tool: features that make user activity visible -- contribution graphs, streak counters, activity feeds, public profiles -- deliberately harness the effect to change behavior.

✦ Three things to know

βœ“ Observation changes behavior even when there are no consequences. Fitbit users take more steps on days they wear the device. GitHub contributors code more when their contribution graph is visible. Duolingo users practice more when their streak is tracked. In none of these cases is anyone watching in real time. The mere awareness of being measured -- even by a machine -- is enough to change behavior.

βœ“ Usability testing is always distorted by the Hawthorne effect. Test participants are more methodical, more patient, more verbal about confusion, and more likely to complete tasks than real users. They are performing for the observer. This does not mean usability testing is useless -- it means the findings are directional, not absolute. Always supplement test findings with real usage analytics.

βœ“ The effect fades over time without reinforcement. The initial behavioral change from being observed diminishes as the novelty wears off. This is why activity dashboards need periodic nudges (weekly summaries, milestone celebrations) to maintain the effect. A contribution graph that is never referenced loses its motivational power.
β€œThe mere awareness of being measured changes the thing being measured. In research, this is a problem. In product design, it is a feature.”

The contribution graph -- how visibility drives behavior

GitHub's contribution graph is one of the most effective applications of the Hawthorne effect in product design. It turns coding activity into a public, visible artifact. Developers do not just code -- they code to maintain the appearance of consistency. The green squares become a performance for an audience, even when no one is watching.

[Demo: the same developer profile on devplatform.io, shown in two states]

Private activity -- no observation, no behavioral pressure. The profile shows only "Total commits this year: 47," visible to no one but the developer. Activity is private and hidden: there is no visual record, no pattern to maintain, no audience to perform for. The developer contributes when they feel like it -- 47 commits in a year.

Public contribution graph -- observation drives consistent activity. The same developer, but now activity is public and visual: a contribution graph of the last 12 weeks, a 47-day streak badge, and 312 contributions this year. The graph and streak counter create a performance for an audience. 312 contributions vs 47 -- the Hawthorne effect in action.

The contribution graph works because it creates a permanent, visible record. Every gap in the graph is conspicuous. Every streak is an investment the developer does not want to lose. The observation is not from a person -- it is from the graph itself, and from anyone who might view the profile. The behavioral change is real: developers with public contribution graphs contribute more consistently than those without them.
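
To make the mechanics concrete, here is a minimal sketch of how a streak counter like the one in the demo above might be computed. The data model (ISO-8601 timestamps per contribution) and the function names are assumptions for illustration -- this is not GitHub's actual algorithm.

```typescript
// Minimal streak computation, assuming contributions arrive as ISO-8601
// timestamps. Illustrative only -- not GitHub's actual algorithm.

const DAY_MS = 24 * 60 * 60 * 1000;

// Normalize a date to its UTC calendar day, e.g. "2024-05-02".
const toDay = (d: Date): string => d.toISOString().slice(0, 10);

function currentStreak(contributionTimestamps: string[], today = new Date()): number {
  // One entry per calendar day, regardless of how many contributions it held.
  const activeDays = new Set(contributionTimestamps.map((t) => toDay(new Date(t))));

  // Walk backwards from today. Today itself may still be empty (the user
  // simply has not contributed yet), so an empty today does not break a streak.
  let cursor = new Date(toDay(today));
  if (!activeDays.has(toDay(cursor))) {
    cursor = new Date(cursor.getTime() - DAY_MS);
  }

  let streak = 0;
  while (activeDays.has(toDay(cursor))) {
    streak += 1;
    cursor = new Date(cursor.getTime() - DAY_MS);
  }
  return streak;
}

// Two consecutive active days ending today -> streak of 2.
console.log(currentStreak(
  ["2024-05-01T09:12:00Z", "2024-05-02T18:30:00Z"],
  new Date("2024-05-02T20:00:00Z"),
)); // 2
```

Note the design choice in the grace rule: an empty "today" does not end the streak, because punishing the user before the day is over would undercut the motivational loop the graph exists to create.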


The Hawthorne effect in user testing -- what observation hides

Every usability test is contaminated by the Hawthorne effect. The participant knows they are being watched, recorded, and evaluated. This awareness changes how they interact with the product in specific, predictable ways. Understanding these distortions is essential to interpreting test findings accurately.

How observation distorts user testing

Metric                In a usability test                   Real analytics data
Task completion       92% -- participants try harder        61% -- users abandon freely
Time on task          3.2 min -- more methodical            1.1 min -- less patient
Error rate            8% -- more careful                    23% -- clicking faster
Satisfaction score    4.6 / 5 -- social desirability bias   3.8 / 5 -- no observer to please

The gap between testing and reality is not a flaw in usability testing -- it is the Hawthorne effect operating as expected. Participants in a test setting are performing. They try harder, take more time, make fewer careless errors, and rate their experience more positively because an observer is present. The practical response is not to stop testing, but to treat test findings as a ceiling rather than a prediction. If something does not work in a test, it definitely will not work in reality. If something works in a test, it might work in reality -- but probably less well.
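
One way to operationalize the ceiling rule is to build the discount into how you report metrics. In the sketch below, the 0.7 discount factor and the MetricReading shape are hypothetical placeholders, not empirically derived constants; calibrate the discount against your own history of test-vs-analytics gaps.

```typescript
// Treating usability-test metrics as ceilings, not predictions. The 0.7
// discount is a hypothetical starting point -- calibrate it against your
// own history of lab results vs. production analytics.

const pct = (x: number): string => `${Math.round(x * 100)}%`;

interface MetricReading {
  name: string;              // e.g. "task completion"
  labValue: number;          // measured in the usability test (0..1)
  productionValue?: number;  // measured in analytics, once available
}

function interpret(reading: MetricReading, discount = 0.7): string {
  const ceiling = reading.labValue;
  if (reading.productionValue === undefined) {
    // No analytics yet: report the ceiling and a discounted planning estimate.
    return `${reading.name}: ${pct(ceiling)} is a ceiling; plan for ~${pct(ceiling * discount)} in production.`;
  }
  const gap = ceiling - reading.productionValue;
  return `${reading.name}: lab ${pct(ceiling)} vs production ${pct(reading.productionValue)} (gap ${pct(gap)}).`;
}

// Example using the task-completion numbers from the table above.
console.log(interpret({ name: "task completion", labValue: 0.92, productionValue: 0.61 }));
// -> task completion: lab 92% vs production 61% (gap 31%).
```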


Applying this to your work

The Hawthorne effect gives you two practical tools. First, as a research correction: always supplement usability testing with real usage analytics. The test tells you what is possible when users are paying attention. The analytics tell you what actually happens when they are not. Second, as a design tool: making activity visible changes behavior. Streaks, contribution graphs, public profiles, and weekly summaries all harness the Hawthorne effect to drive engagement.
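
As one illustration of the reinforcement loop, here is a hedged sketch of a weekly summary job. The fetchWeeklyActivity and sendEmail functions are hypothetical stand-ins for your own analytics store and mail service; only the shape of the loop matters.

```typescript
// Sketch of a weekly activity-summary job that reinforces the effect over
// time. fetchWeeklyActivity and sendEmail are hypothetical stand-ins.

interface WeeklyActivity {
  userId: string;
  contributions: number;
  streakDays: number;
}

async function fetchWeeklyActivity(): Promise<WeeklyActivity[]> {
  // Stand-in: pull last week's per-user activity from your analytics store.
  return [{ userId: "alex", contributions: 9, streakDays: 47 }];
}

async function sendEmail(userId: string, subject: string, body: string): Promise<void> {
  // Stand-in: hand off to your mail service.
  console.log(`to=${userId} | ${subject} | ${body}`);
}

async function sendWeeklySummaries(): Promise<void> {
  for (const a of await fetchWeeklyActivity()) {
    // Frame the summary around the user's own record: self-observation,
    // not comparison with other users (see the ethical boundary below).
    const body =
      `You made ${a.contributions} contributions this week.` +
      (a.streakDays > 0
        ? ` Your streak is at ${a.streakDays} days -- keep it going.`
        : ` Start a new streak with a single contribution tomorrow.`);
    await sendEmail(a.userId, "Your week in review", body);
  }
}

sendWeeklySummaries();
```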

The ethical boundary is between empowerment and surveillance. Showing users their own activity data (step counts, contribution graphs, learning streaks) helps them self-motivate. Showing managers per-employee activity data (click counts, time on page, idle time) is surveillance that erodes trust and autonomy. The mechanism is identical. The power dynamic is not.

βœ“ Apply it like this
β†’ Supplement usability tests with real analytics data. Test findings show what happens when users are trying; analytics show what happens when they are not.
β†’ Use visible activity records (streaks, contribution graphs, progress bars) to motivate users through the Hawthorne effect -- people perform better when their activity is tracked.
β†’ Send periodic activity summaries (a weekly email, a monthly review) to maintain the effect over time. Without reinforcement, the behavioral change fades.
β†’ Make activity data visible to the user themselves first. Self-observation is empowering; external observation without consent is surveillance.
βœ— Common mistakes
β†’ Treating usability test results as ground truth. A 92% completion rate in testing may translate to 61% in production. Use test findings as directional, not absolute.
β†’ Surveillance features disguised as productivity tools. Showing managers per-employee click data crosses from motivation into monitoring.
β†’ Public leaderboards that shame low performers. Visible activity should motivate through personal progress, not comparison-driven anxiety.
β†’ Activity tracking without transparency. Users should always know what is being measured, who can see it, and how to opt out (see the sketch below).
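
The last two points imply a concrete data model: tracking should be an explicit, user-controlled setting rather than an implicit default. A minimal sketch, with assumed type and field names rather than any real product's schema:

```typescript
// Transparency-first tracking settings: the user always knows what is
// measured, who can see it, and how to opt out. All names are assumptions
// for illustration, not a real product's schema.

type Audience = "only-me" | "followers" | "public";

interface TrackingSetting {
  metric: "contributions" | "streak" | "time-on-page";
  enabled: boolean;      // the opt-out switch
  visibleTo: Audience;   // self-observation first; wider sharing is opt-in
  description: string;   // plain-language statement of what is measured
}

const defaultSettings: TrackingSetting[] = [
  {
    metric: "contributions",
    enabled: true,
    visibleTo: "only-me",
    description: "Counts your commits per day to draw your activity graph.",
  },
  {
    metric: "time-on-page",
    enabled: false, // surveillance-adjacent metrics stay off until consented
    visibleTo: "only-me",
    description: "Measures how long you spend on each page.",
  },
];
```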

References

Landsberger, H. A. (1958). Hawthorne Revisited. Cornell University Press.
McCambridge, J., Witton, J., & Elbourne, D. R. (2014). Systematic review of the Hawthorne effect. Journal of Clinical Epidemiology, 67(3), 267–277.
Adair, J. G. (1984). The Hawthorne effect: A reconsideration of the methodological artifact. Journal of Applied Psychology, 69(2), 334–345.