
This is one of my favourite discussion papers. Systems and processes in large, complex organisations can be difficult to access, hard to use, and not fit for purpose. People then find ways to adapt and “finish the design” of these dysfunctional systems.
At only two pages, this very brief paper hits hard.
“People are the most adaptable element in any complex work system” (p.338).
The authors argue that the tragedy of human adaptability is that:
“adaptability makes dysfunctional work systems and practices appear to be performing better than they actually are. In the end, these invisible adaptations, filling gaps as they appear, can leave frontline workers “twisted like pretzels” around poorly designed or frankly dysfunctional systems as they contort themselves to get their work done” (p.338).
They discuss how people’s adaptations are generally invisible, both to outsiders and to those within the field of practice, and work so well that they may be “hidden in plain sight”, fading into the background.
Since adaptability is finite, these adaptations may produce unsustainable routines and carry unforeseen risks, while people burn out, breeding frustration, cynicism and learned helplessness.
The invisible adaptations that patch up gaps as they appear can leave managers with a false sense of security that things are tracking well, since they’re not hearing or seeing bad news. These adaptations may also produce inefficiencies or create new failure modes. The authors give a hypothetical example of how requiring ever more complicated, hard-to-remember passwords that must be frequently changed may actually reduce security.
Here, people resort to writing down their passwords at their workstations so they don’t forget the constantly changing passwords (the example is set in a hospital emergency department). In that example, “workers adapt to reach higher-level goals (getting their work done) by sacrificing lower-level ones (system security)” (p.338).
They also discuss the “less-than-enthusiastic” introduction of new technology into workplaces, like incident reporting systems or health information systems (in healthcare). On the latter, the authors argue that in many cases physicians need a combination of incentives, training or champions to persuade them to use the technology (rather than being inherently convinced of its underlying value).
And while these disgruntled physicians may be seen as outliers or troublemakers, they may “actually be the ones using the system best; that is, not using it much at all when there is a good chance they can obtain the required information more easily elsewhere or get by without it, or when the cost of thoroughness is high relative to the benefits of efficiency” (p.338).
In these cases, while we may be quick to denounce work-arounds or slow adopters of systems, we should instead be crediting people for surfacing information about critical flaws or more efficient ways of working.
Finally, they state that people shouldn’t have to adapt to support systems; systems should be adapted to support people. In the case of technology and systems, the intellectually lazy response is to call for more or better training to fix deficiencies, which “provides the appearance of action without raising the issue of potentially costly redesign or retooling” (p.338).
They make the apt observation that while training and awareness can be important when there are genuine gaps in knowledge about the technology, they aren’t helpful when the issues relate to poor usability and limited usefulness.
Authors: Robert L. Wears, Zachary Hettinger, 2014, American College of Emergency Physicians.

Study link: https://doi.org/10.1016/j.annemergmed.2013.10.035
Link to the LinkedIn article: https://www.linkedin.com/pulse/tragedy-adaptability-ben-hutchinson