A summary of the different behavioural change models
On this page:
Nudge, system-thinking, and peripheral- and central-thinking models
The theory of planned behaviour and related models of rational decision-making
Motivated reasoning and cognition, and identity protection theories
This information is sourced from the Ministry for the Environment's Behavioural Insights Tool Kit.
A deeper understanding of decision-making processes can help you identify the underlying motivations and barriers for behaviour and consider these in your work.
Human behaviour is incredibly complex and countless theories and models have been developed to explain it. Theories and models try to explain aspects of what shapes our decisions and behaviour, including the ways we process information, how we make decisions and what information is considered when we do. None are perfect or tell the whole story, but they can help us understand the underlying influences behind decision-making and behaviour.
Consider which might apply best for the context you are working with and the type of behaviour you are trying to change. It’s often best to borrow ideas from each for a hybrid approach.
Nudge, system-thinking, and peripheral- and central-thinking models
These similar models of decision-making and behaviour focus on the difference between decisions made slowly through conscious, rational thought and those made quickly and automatically, based on unconscious mental shortcuts, or heuristics.
Richard Thaler and Cass Sunstein argue in their book, Nudge: Improving decisions about health, wealth, and happiness, that we are influenced unconsciously, whether that influence is intentionally planned or not. Any regulation, communications campaign, funding decision, institutional arrangement or other policy intervention has to connect with our unconscious decision-making processes.
Since the 1970s, researchers have demonstrated that most of our decisions and actions are guided unconsciously. Careful, rational thought requires too much time and mental energy to process the sheer volume of information we receive every moment. To cope, we use a variety of heuristics – or mental shortcuts – to process information and respond faster and more efficiently. These shortcuts are good at providing rapid, approximate answers, but they are imperfect and can be biased by the contexts around us.
The key to these theories is understanding which system of thinking people are using at a given moment, how the two systems interact, and the impact they have on decision-making. Even decisions that we believe were made rationally may have been predetermined by unconscious factors – our thinking is often better described as rationalisation after the fact than as rational decision-making.
When to use it:
Strategies from these models work best for influencing quick decisions at the moment of action. Use them to target small, specific, relatively easy actions in the short term.
However, these strategies can also shape the outcomes of apparently rational decision-making. Whether we're aware of it or not, small, unconscious judgements feed into our conscious, rational thinking processes.
Strategies for change:
Knowing how a decision is made helps determine what kinds of interventions are more likely to succeed. In particular, these models focus on designing the context in which decisions are made to trigger shortcuts that support positive behaviours. This shaping of contexts is often called 'nudge' or choice architecture, and is the basis for many modern behaviour change interventions.
Resources to learn more about these models:
- Cialdini, R. B. (2006) Influence: The Psychology of Persuasion. Harper Business.
- Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
- Petty, R. E., & Cacioppo, J. T. (1986). The Elaboration Likelihood Model of Persuasion. In Communication and Persuasion (pp. 1–24). New York: Springer.
- Thaler, R.H. and Sunstein, C.R. (2009) Nudge: Improving decisions about health, wealth, and happiness. New York: Penguin Books.
The theory of planned behaviour and related models of rational decision-making
These models all describe a process where our values, attitudes and beliefs, our perceptions of social norms, and our confidence or ability to act influence our behaviour. In these models, decisions are generally seen as conscious, rational processes.
The models focus more on the types of evidence or arguments we consider when making a decision than on how that information is processed. Generally speaking, we base our decisions on what outcomes we believe our actions will have and how much we value those outcomes. These theories also acknowledge the influence of deeper values and of barriers to action.
For example, a farmer considering whether to build a wetland on her property might think about how a wetland will affect her property value, water storage for the farm, the visual appeal of the landscape, the water quality in her catchment, the number of waterfowl coming to the farm and, of course, her bank balance. She will then come to a decision based on how she values those outcomes.
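The expectancy-value reasoning behind these models can be sketched as a simple weighted sum: each believed outcome of the action, weighted by how likely it seems and how much it is valued. The outcomes and all the numbers below are illustrative inventions for the farmer example, not figures from the toolkit or from Ajzen's work.

```python
# Expectancy-value sketch in the spirit of the theory of planned behaviour:
# attitude toward a behaviour ~ sum of (belief strength x outcome evaluation).
# All beliefs and weights here are made up to illustrate the wetland example.

beliefs = {
    # outcome: (belief strength 0..1, evaluation -3..+3)
    "improves water storage for the farm": (0.9, +2),
    "improves water quality in the catchment": (0.8, +3),
    "attracts more waterfowl": (0.7, +1),
    "improves visual appeal of the landscape": (0.6, +1),
    "takes productive land out of use": (0.5, -2),
    "costs money to build": (0.9, -2),
}

# A positive total suggests a favourable overall attitude toward building the wetland.
attitude = sum(strength * evaluation for strength, evaluation in beliefs.values())
print(f"Overall attitude score: {attitude:+.1f}")  # prints: Overall attitude score: +2.7
```

Changing any single weight – for example, a subsidy that offsets the construction cost – shifts the total, which is why these models frame behaviour change as changing beliefs about outcomes or the value placed on them.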
When to use it:
These models are most useful to explain major decisions or encounters with new ideas where we’re likely to stop and consider our options carefully. Use strategies from these theories to target actions that are complex, non-specific or require ongoing, enduring commitment because these are likely to be considered more deeply.
Strategies for change:
In these theories, behaviour change is a matter of changing people's attitudes, beliefs and perceptions of social norms. This can mean changing people's factual understanding by providing the right information about the behaviour itself, or shaping how people value its outcomes.
Resources to learn more about these models:
- Ajzen, I. (1985). From Intentions to Actions: A Theory of Planned Behavior. In J. Kuhl & J. Beckmann (Eds.), Action Control: From Cognition to Behavior (pp. 11–39). Berlin: Springer-Verlag.
- Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T
- Ajzen, I. (2011). The theory of planned behaviour: reactions and reflections. Psychology & Health, 26(9), 1113–27. https://doi.org/10.1080/08870446.2011.613995
Motivated reasoning and cognition, and identity protection theories
People define and express themselves based on their roles, their activities, their beliefs, and their relationships to other people and places. Because we all want to see ourselves as being good and doing right, our thinking is biased to protect ourselves from information that threatens that positive self-image. When confronted with threatening information, we can respond by ignoring the problem, challenging the evidence or resolving the conflict. The harder the problem is to resolve or the more threatening the message, the more likely we are to ignore or deny it. Conversely, we tend to accept information unquestioningly when it confirms our self-image and makes us feel positive.
For example, when we’re told that our actions are harmful to the environment, we’re unconsciously motivated to reject the evidence because it suggests that we’re doing harm and, therefore, are not good people. We are more likely to question the research methods, highlight any scientific uncertainties or dispute the impartiality of the information. When we’re told our actions are environmentally beneficial, we don’t look for methodological flaws or biases. Even minor activities like drinking coffee can trigger this effect: people who drink coffee regularly are less likely to believe science suggesting that coffee drinking is unhealthy and more likely to believe reports that it is beneficial.
This can explain why people reading the same information can reach radically different conclusions, or why we reject evidence that contradicts our worldview. Importantly, this view of behaviour suggests why simply giving more evidence and information is often counter-productive, and why people with more education are often more polarised: people with better information and mental skills use these resources to pick holes in opposing arguments more effectively.
When to use it:
These theories are most helpful for explaining when hearts rather than minds appear to be the main drivers for actions, where people argue about how to interpret evidence, or where people deny there is a problem at all. Use strategies from these theories to target situations where people have entrenched, value-based positions and strong political or emotional arguments.
Strategies for change:
The key strategy is to call attention to and praise aspects of people’s identities or their values that might be receptive to the message and show people how the decision or action is consistent with that aspect or value.
In situations where people debate the evidence of a problem and demand further research, resist the temptation to simply provide more and better information. Instead, think about why people might not want the evidence to be correct and try to resolve those underlying issues.
Resources to learn more about these theories:
- Druckman, J. N., & Bolsen, T. (2011). Framing, Motivated Reasoning, and Opinions about Emergent Technologies. Journal of Communication, 61(4), 659–688. https://doi.org/10.1111/j.1460-2466.2011.01562.x
- Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as motivated social cognition. Psychological Bulletin, 129(3), 339–375. https://doi.org/10.1037/0033-2909.129.3.339
- Kahan, D. M. (2013). Ideology, Motivated Reasoning, and Cognitive Reflection. Judgment and Decision Making, 8(4), 407–424. https://doi.org/10.2139/ssrn.2182588
- Kahan, D. M., Dawson, E. C., Peters, E., & Slovic, P. (2013). Motivated Numeracy and Enlightened Self-Government (Public Law Working Paper No. 116). New Haven, CT. https://doi.org/10.2139/ssrn.2319992
- Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498. https://doi.org/10.1037/0033-2909.108.3.480
- Lewandowsky, S., Oberauer, K., & Gignac, G. E. (2013). NASA Faked the Moon Landing—Therefore, (Climate) Science Is a Hoax: An Anatomy of the Motivated Rejection of Science. Psychological Science, 24(5), 622–633. https://doi.org/10.1177/0956797612457686