It is the nature of our cognitive systems that we alternate between heuristics and deliberative reasoning. Heuristics are reasoning 'shortcuts' based on patterns that speed up decision making in familiar circumstances. Deliberation takes more attention and energy, but it can go beyond immediately available information, enabling complex computations, comparisons, planning, and choice.
This 'dual mind' theory—brought to popular attention in books by Kahneman,3 Rugg,7 and Evans1—explains why the heuristics that evolved for survival in a dangerous hunter-gatherer world also cause systematic biases in our judgments. Says Kahneman: "Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information." In such circumstances, intuitive errors are probable, and hence deliberation is worth the investment.
The nature of decision making under pressure is highly relevant in a domain such as software engineering, in which developers often make precipitate decisions, for example, to meet a deadline under competing demands. There has been increasing interest in cognitive bias in software engineering—reflected in the systematic mapping study by Mohanani et al.,4 which covered 65 articles (and 37 biases). Nevertheless, the authors of the following paper rightly highlight the need for situated studies that examine cognitive bias 'in the wild' during software development activity.
The authors report on two empirical studies with professional developers: an observational study that traces cognitive biases (and the actions taken to reverse their effects) during everyday development work, and an interview study that probes developers' awareness of bias and the practices they use to avoid or manage it.
These two studies give a rich and situated glimpse into developers' everyday experience with bias—and recovery from it.
The paper gives an indication of the extent to which cognitive biases disrupt software development. An interesting element of the analysis is the distribution of the observed biases, with fixation biases (that is, anchoring problem-solving on initial assumptions) and convenience biases (that is, choosing seemingly quicker or simpler routes to a solution) being the most frequently observed. The insightful counterpoint is that fixation and convenience biases were also associated with the most frequently observed reversal actions—the implication being that persistent bias requires persistent correction until it is addressed.
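To make these two categories concrete, consider a hypothetical sketch (ours, not an example drawn from the paper's data): a developer anchors on the initial assumption that input is already sorted (fixation) and takes the seemingly quicker route of binary search (convenience); the 'reversal action' is the later correction once the assumption fails.

```python
# Hypothetical illustration of fixation/convenience bias (not from the paper's data).
from bisect import bisect_left

def find_event(events, timestamp):
    """Biased version: anchors on the unchecked assumption that `events`
    is already sorted, and takes the convenient shortcut of binary search."""
    i = bisect_left(events, timestamp)
    return i if i < len(events) and events[i] == timestamp else -1

def find_event_reversed(events, timestamp):
    """Reversal action: after the assumption fails on unsorted input,
    the shortcut is corrected with a scan that does not depend on ordering."""
    for i, t in enumerate(events):
        if t == timestamp:
            return i
    return -1
```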
It is worth pointing out that a reliance on heuristics need not lead to bias, as noted by Gigerenzer et al.2—whose work on 'fast and frugal heuristics' argues that "In contrast to the widely held view that less complex processing necessarily reduces accuracy, the analytical and empirical analyses of fast and frugal heuristics demonstrate that less information and computation can in fact improve accuracy." This observation is echoed by Chattopadhyay et al.: "… not all cognitive biases necessarily result in a negative outcome. Biases can lead to positive effects—participants taking fewer actions than anticipated." Understanding what distinguishes positive-outcome heuristics from negative-outcome ones is a matter for further study, as are the factors influencing bias.
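One well-known fast-and-frugal heuristic from Gigerenzer's program is 'take-the-best': when comparing two options, consult cues in descending order of validity and decide on the first cue that discriminates, ignoring everything else. A minimal sketch follows (our illustration; the cue names and values are invented):

```python
import random

def take_the_best(a_cues, b_cues, cue_order):
    """Take-the-best heuristic: consult binary cues in descending order of
    validity; the first cue that discriminates decides, and all remaining
    information is ignored. Cue values are 1 (positive), 0 (negative), or
    None (unknown). Guess if no cue discriminates."""
    for cue in cue_order:  # cues pre-ordered by validity, best first
        a, b = a_cues.get(cue), b_cues.get(cue)
        if a == 1 and b != 1:
            return "A"  # first discriminating cue decides
        if b == 1 and a != 1:
            return "B"
    return random.choice(["A", "B"])  # no cue discriminates: guess

# Invented example: which of two cities is larger?
cue_order = ["has_major_airport", "is_state_capital", "has_university"]
city_a = {"has_major_airport": 1, "is_state_capital": 0, "has_university": 1}
city_b = {"has_major_airport": 1, "is_state_capital": 1, "has_university": None}
print(take_the_best(city_a, city_b, cue_order))  # -> 'B' (second cue decides)
```

The design point is that the heuristic deliberately discards information by stopping at the first discriminating cue, yet in noisy environments this frugality can match or beat full-information models—the claim quoted above.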
The interview analysis shows that developers are aware of bias and of practices that can help avoid or manage it. Recognizing that bias arises from an inherent (and sometimes efficient) aspect of our cognitive systems should steer research away from treating developers as 'thoughtless, lazy, or ignorant' and from implying that the solution lies in the technology. Instead, the paper provides evidence that developers are aware of bias and can be engaged fruitfully in mitigation. Clearly, further research is needed into the interpretation and efficacy of different practices.
Work on mitigating bias has indicated that mitigation is not a 'one-and-done' process but requires continual renewal. Although there is some evidence of how high-performing teams employ socially embedded practices that reduce cognitive bias (for example, Petre5), and there have been proposals for 'debiasing' software development (for example, Ralph6), much more work is needed. These efforts emphasize the need for 'in the wild' studies to understand cognitive bias as it occurs in context and to identify the opportunities in existing practice.
1. Evans, J.T. Thinking Twice: Two Minds in One Brain. Oxford University Press, 2010.
2. Gigerenzer, G., Hertwig, R., and Pachur, T. Heuristics: The Foundations of Adaptive Behavior. Oxford Scholarship Online, 2011.
3. Kahneman, D. Thinking, Fast and Slow. Penguin, 2011.
4. Mohanani, R., Salman, I., Turhan, B., Rodriguez, P., and Ralph, P. Cognitive biases in software engineering: A systematic mapping study. IEEE Trans. Softw. Eng. 46, 12 (2020), 1318–1339.
5. Petre, M. Balancing bias in software development. Keynote address: ACCU 2016, Bristol.
6. Ralph, P. Toward a theory of debiasing software development. In Research in Systems Analysis and Design: Models and Methods. Lecture Notes in Business Information Processing 93. Springer, 2011, 92–105.
7. Rugg, G. Blind Spot: Why We Fail to See the Solution Right in Front of Us. HarperOne, 2013.