People modify their behavior when they are aware of being watched or measured. In UX research, this means test participants behave differently than real users. In products, visibility features like activity feeds, dashboards, and streaks deliberately harness this effect to drive engagement.
Between 1924 and 1932, researchers at the Hawthorne Works factory in Illinois ran a series of experiments on worker productivity. They varied lighting levels, break schedules, and work hours, expecting to find optimal physical conditions. Instead, they found something unexpected: productivity increased regardless of what they changed. Brighter lights? Productivity went up. Dimmer lights? Productivity went up. More breaks? Up. Fewer breaks? Also up.
The conclusion was that the workers were not responding to the physical changes. They were responding to the attention. They knew they were being observed, and that awareness -- not the experimental conditions -- was driving the behavioral change. Henry Landsberger later named this the Hawthorne effect: people modify their behavior when they know they are being watched.
In product design, the Hawthorne effect operates in two directions. First, it is a research bias: usability test participants behave differently than real users because they know they are being observed. Second, it is a design tool: features that make user activity visible -- contribution graphs, streak counters, activity feeds, public profiles -- deliberately harness the effect to change behavior.
"The mere awareness of being measured changes the thing being measured. In research, this is a problem. In product design, it is a feature."
GitHub's contribution graph is one of the most effective applications of the Hawthorne effect in product design. It turns coding activity into a public, visible artifact. Developers do not just code -- they code to maintain the appearance of consistency. The green squares become a performance for an audience, even when no one is watching.
The contribution graph works because it creates a permanent, visible record. Every gap in the graph is conspicuous. Every streak is an investment the developer does not want to lose. The observation is not from a person -- it is from the graph itself, and from anyone who might view the profile. The behavioral change is real: developers with public contribution graphs contribute more consistently than those without them.
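The streak mechanics that make a gap so costly can be sketched in a few lines. This is a hedged illustration, not GitHub's implementation: the function name `streaks` and the sample data are hypothetical, and a "streak" here simply means consecutive calendar days with at least one contribution.

```python
from datetime import date, timedelta

def streaks(contribution_dates, today):
    """Compute (current, longest) streaks from a collection of dates.

    A streak is a run of consecutive calendar days with activity --
    the unit a contribution graph makes visible and costly to break.
    """
    days = set(contribution_dates)

    # Current streak: walk backwards from today while each day has activity.
    current = 0
    d = today
    while d in days:
        current += 1
        d -= timedelta(days=1)

    # Longest streak: for each day that starts a run, walk the run forward.
    longest = 0
    for d in days:
        if d - timedelta(days=1) not in days:  # start of a run
            run = 1
            nxt = d + timedelta(days=1)
            while nxt in days:
                run += 1
                nxt += timedelta(days=1)
            longest = max(longest, run)
    return current, longest

activity = [date(2024, 6, 1), date(2024, 6, 2),
            date(2024, 6, 8), date(2024, 6, 9), date(2024, 6, 10)]
print(streaks(activity, today=date(2024, 6, 10)))  # (3, 3)
```

Note that the current streak resets to zero the moment "today" has no contribution: that hard discontinuity is exactly the investment the developer does not want to lose.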
Every usability test is contaminated by the Hawthorne effect. The participant knows they are being watched, recorded, and evaluated. This awareness changes how they interact with the product in specific, predictable ways. Understanding these distortions is essential to interpreting test findings accurately.
The gap between testing and reality is not a flaw in usability testing -- it is the Hawthorne effect operating as expected. Participants in a test setting are performing. They try harder, take more time, make fewer careless errors, and rate their experience more positively because an observer is present. The practical response is not to stop testing, but to treat test findings as a ceiling rather than a prediction. If something does not work in a test, it definitely will not work in reality. If something works in a test, it might work in reality -- but probably less well.
The Hawthorne effect gives you two practical tools. First, as a research correction: always supplement usability testing with real usage analytics. The test tells you what is possible when users are paying attention. The analytics tell you what actually happens when they are not. Second, as a design tool: making activity visible changes behavior. Streaks, contribution graphs, public profiles, and weekly summaries all harness the Hawthorne effect to drive engagement.
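The "ceiling, not prediction" correction can be made operational. A minimal sketch, with all names and figures hypothetical: compare each task's completion rate from moderated testing against the same flow's completion rate in production analytics, treating the test number as an upper bound and the gap as a rough measure of how much observation inflated performance.

```python
def hawthorne_report(test_rates, production_rates):
    """Compare per-task completion rates: usability test vs. analytics.

    test_rates / production_rates: dicts mapping task name -> completion
    rate in [0, 1]. The test rate is read as a ceiling; 'gap' is how far
    unobserved, real-world behavior falls below it.
    """
    report = []
    for task, ceiling in sorted(test_rates.items()):
        observed = production_rates.get(task)
        if observed is None:
            continue  # no analytics for this flow yet
        report.append({
            "task": task,
            "test": ceiling,
            "production": observed,
            "gap": round(ceiling - observed, 3),
        })
    return report

# Hypothetical data: 90% completed checkout under observation, 60% in the wild.
tests = {"checkout": 0.90, "search": 0.95}
prod = {"checkout": 0.60, "search": 0.93}
for row in hawthorne_report(tests, prod):
    print(row)
```

A large gap (checkout here) flags a flow where participants likely performed for the observer; a near-zero gap (search) suggests the test result generalizes. A negative gap would mean production beats the test ceiling and is worth investigating on its own.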
The ethical boundary is between empowerment and surveillance. Showing users their own activity data (step counts, contribution graphs, learning streaks) helps them self-motivate. Showing managers per-employee activity data (click counts, time on page, idle time) is surveillance that erodes trust and autonomy. The mechanism is identical. The power dynamic is not.
Landsberger, H. A. (1958). Hawthorne Revisited. Cornell University Press. · McCambridge, J., Witton, J., & Elbourne, D. R. (2014). Systematic review of the Hawthorne effect. Journal of Clinical Epidemiology, 67(3), 267–277. · Adair, J. G. (1984). The Hawthorne effect: A reconsideration of the methodological artifact. Journal of Applied Psychology, 69(2), 334–345.