Policies to promote public health and welfare often fail or worsen the problems they are intended to solve. Evidence-based learning should prevent such policy resistance, but learning in complex systems is often weak and slow. Complexity hinders our ability to discover the delayed and distal impacts of interventions, generating unintended “side effects.” Yet learning often fails even when strong evidence is available: common mental models lead to erroneous but self-confirming inferences, allowing harmful beliefs and behaviors to persist and undermining implementation of beneficial policies.
Here I show how systems thinking and simulation modeling can help expand the boundaries of our mental models, enhance our ability to generate and learn from evidence, and catalyze effective change in public health and beyond.
. . . .
Faced with the overwhelming complexity of the real world, time pressure, and limited cognitive capabilities, we are forced to fall back on rote procedures, habit, rules of thumb, and simple mental models. Although we strive to make the best decisions we can, bounded rationality means that we often fall systematically short.
. . . .
Among the most damaging misperceptions is the tendency to attribute the behavior of others to dispositional rather than situational factors; that is, to character and especially character flaws rather than the system in which they are embedded—the “fundamental attribution error.” The atrocities at Abu Ghraib were blamed on a few bad apples, whereas decades of research, from Milgram’s obedience studies and the Stanford prison experiment on, demonstrate that “it’s not the apples, it’s the barrel.”50 Despite overwhelming evidence that our behavior is molded by pressures created by the systems in which we act, problems such as the failure of patients to stay on their medications, recidivism among drug users, and childhood obesity are persistently attributed to the undisciplined personal habits, poor attitude, or low intelligence of these “others.”51 The focus becomes scapegoating and blame, and policy centers on controls to force compliance. Blame and attempts to control behavior provoke resistance and patient drop-out, strengthening the erroneous belief that these people are unreliable incompetents requiring still greater monitoring and control.52 Recognizing the power of system structure to shape behavior does not relieve us of personal responsibility for our actions. To the contrary, it enables us to focus our efforts where they have the highest leverage—the design of systems in which ordinary people can achieve extraordinary results.53
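The self-confirming loop described above—blame leads to tighter control, control provokes resistance and drop-out, and the resulting “noncompliance” reinforces the blame—can be illustrated with a minimal simulation. This is a hypothetical sketch, not a model from the article: the functional forms and all parameters (`base_adherence`, `resistance`, the control adjustment rate) are illustrative assumptions chosen only to make the reinforcing dynamic visible.

```python
# Hypothetical sketch of the self-confirming blame/control feedback loop:
# control provokes resistance (lower adherence); the observed low adherence
# is attributed to patients' character, so control is ratcheted up further.
# All parameters are illustrative, not empirical estimates.

def simulate(steps=20, base_adherence=0.9, resistance=0.5):
    """Return the trajectory of (control intensity, observed adherence)."""
    control = 0.1  # initial monitoring/control effort, bounded 0..1
    history = []
    for _ in range(steps):
        # Control provokes resistance: adherence falls as control rises.
        adherence = base_adherence * (1.0 - resistance * control)
        # The shortfall is blamed on the patients, not on the control
        # itself, so control is increased in response (capped at 1).
        control = min(1.0, control + 0.3 * (1.0 - adherence))
        history.append((round(control, 3), round(adherence, 3)))
    return history

trajectory = simulate()
# Control ratchets upward while adherence falls, each step "confirming"
# the erroneous belief that ever-stricter monitoring is needed.
```

Nothing in the sketch requires the patients to be unreliable: `base_adherence` is high and fixed, yet the loop still drives adherence down, which is the point of the attribution error.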
Poor inquiry skills
Learning effectively in a world of dynamic complexity requires dedicated application of the scientific method. Unfortunately, people are poor intuitive scientists. We do not generate alternative explanations or control for confounding variables. Our judgments are strongly affected by the frame in which information is presented, even when the objective information is unchanged. We suffer from overconfidence in our judgments (underestimating uncertainty), wishful thinking (assessing desired outcomes as more likely than undesired outcomes), and confirmation bias (seeking evidence consistent with our preconceptions). Scientists and professionals, not only “ordinary” people, suffer from many of these judgmental biases.27,54
. . . .