Cognitive Biases – Executive Briefing

Understanding and Navigating Cognitive Biases

There is all too often a reluctance to learn from the experiences of others; we tend to assume that our own particular challenges are unique. Our surveys and discussions with others using the Academic Unit Diagnostic Tool (AUDiT) emphasize the opposite: troubled units encounter similar difficulties. If you have used this tool to assess your own department and found more cells in the yellow and red columns than you would like, the next step is to consider points of potential intervention and reform – a task easier said than done.

Unit members may be reluctant to engage with any process of change if they don’t believe there are problems in the first place. One of the major barriers can be an unrecognized one: cognitive biases.

What are Cognitive Biases?

Cognitive biases are errors in thinking that are found throughout human interactions. They can drive us to assume the best in ourselves, and the worst in others; to retain information that reinforces our existing beliefs, and discount or ignore information that does not; to judge ourselves by our intent, and others only by their actions. Their effects are so quick that we often do not even realize anything has happened. Working to identify and counteract these flaws in our own thinking, and learning to recognize them in others, can improve relationships in our working environments.

When we observe errors in logic and cognition from a distance, it can be easy to identify the mistakes. Some might seem so obvious you can quickly convince yourself that you would never fall prey to them, and this in itself is a known bias born of overconfidence. The truth is that all of us are susceptible – no matter how self-aware we might feel, no matter how intelligent or well-educated. We see these biases at work in many of the situations that characterize troubled academic units, and in people’s reactions (or failure to react) to those problems. For example, rationalization and denial are at the heart of many issues in challenged units, and they can be persistent and intractable.

Cognitive biases affect people of all races, identity positions, and cultures. They affect people with bad intentions and good ones, and while they are especially pernicious when people are tired or distracted, they come into play even when they are not. Guarding against their effects takes hard work and a dedication to forming good habits, and if you are committed to overcoming them, accepting that you are vulnerable to them is an important first step.

Cognitive biases are not new – behavioral researchers have been studying such failures of reasoning for decades. Within a unit, members often differ in how they see its condition. Sometimes these differences reflect honest disagreements about the facts, or how heavily to weight them in assessing the status of the unit; other times they reflect an unwillingness to acknowledge the elephant in the room.

In these latter cases, the root of the differences may instead lie in denial driven by cognitive biases. The denial can manifest in any number of ways: As a department head, you might find yourself facing a host of faculty pointing fingers at one another; or they may all be pointing at you. Combatants may be furiously engaged in rationalizing their own behavior, because “so-and-so did something else just as bad!” or “I had to, to stand up for principle!”

It can be difficult to recognize cognitive biases in action: they subvert our thinking subtly, often without our even realizing it. They often arise from the beliefs we hold most strongly, and from natural egocentric human tendencies which we all exhibit – ones that lead us to see the world through our own filters and perceptions. These biases matter both because they can be a source of departmental dysfunction and because they can interfere with identifying and acting upon the problems a unit faces.

Drawing upon what is understood about these cognitive biases within the fields of social and behavioral psychology, we examine them through the lens of academia, and distill the known traits of several that are the most common – and the most counterproductive to a vibrant academic unit culture. While we list these as separate examples to make them easier to grasp, we also hope to make it clear that these are not entirely discrete phenomena, and that in many ordinary circumstances they operate in concert.

Sinister Attribution Bias

When we exhibit a Sinister Attribution Bias, we allow our personal feelings about another to shape our assumptions about the reasons for their actions: we attribute less admirable motives to those we do not like and excuse or rationalize the conduct of those we do. For example, if you don’t like Alex much and you are partial to Louise, when Alex is late for a meeting or the class he has to teach, you imagine him dismissively looking at the clock and shrugging his shoulders; when Louise is late, you are more likely to envisage her in heavy traffic or dealing with a pressing matter.

Fundamental Attribution Error

The Fundamental Attribution Error describes our tendency to credit ourselves for our successes and to blame external environmental factors for failures, while doing the opposite for others. This is illustrated in automobile accidents, where we often feel that an accident caused by someone else was due to that person’s ineptitude as a driver, while our own mishaps were the result of bad luck, poor road layout, adverse weather conditions, the actions of other drivers, or confusing signage. In academic contexts one can see this tendency manifested, for example, in data collection and research outcomes. When it’s your study that didn’t yield the results you were hoping for, it was simply that the “Data Gods were not feeling benevolent that week.” When it is your irritating coworker’s project, it is “probably because his methods were sloppy” or “her analysis was poorly done.”

Confirmation Bias

Confirmation Bias is one of the most common cognitive errors. It is the instinct to seek or acknowledge only the segments of information that support your already-existing beliefs, and to discount or reject data that go against them. So, you might remember previous hires from prestigious schools (perhaps like your own!) as being among the best hiring decisions the unit has made, arguing that the same institutions should also be emphasized in future hires, while forgetting the several unsuccessful hires from those kinds of schools, and neglecting some outstanding hires from less prestigious programs.

Anchoring Bias

Our first impressions are often the easiest to reaffirm, and some of the hardest to readjust. We have a tendency to fixate, or anchor, on the initial information presented during a conversation. Anchoring bias and confirmation bias often go hand in hand. Faculty might remember for years a single comment made in a faculty meeting, and project a colleague’s future behavior based on such an unrepresentative sample. Anchoring defines negotiations by shaping expectations and ranges. If you’re getting ready to negotiate a job offer or a promotion, learning more about anchoring is well worth the time.

Dunning-Kruger Effect

As Bertrand Russell once said, “the trouble with the world is that the stupid are cocksure while the intelligent are full of doubt.” The Dunning-Kruger Effect is observed when people who have little expertise or ability in a particular area assess their proficiency as being greater than it is. A major occupational hazard for academics arises when people who are experts in one field believe they are justified in speaking with authority on other topics, whether they possess the requisite expertise or not. Conversely, other academics are reflexively insecure and doubtful about their abilities, needing reassurance or recognition far beyond what other colleagues require.

Motivated Blindness

Many of us have encountered a case of potential Motivated Blindness in our lives – the tendency to overlook bad news when it suits us, or to fail to notice unethical behavior when it is not in our interests to do so. This can be especially destructive in academia if, for example, a coauthor is planning to selectively limit the data shown in a joint article. Doing so makes the conclusions stronger and more convincing, and while the other author knows it doesn’t tell the whole story, both really want the manuscript published, and no one wants to start an argument with a colleague, so the other author says nothing. Other manifestations of this bias can inhibit the kinds of frank and honest discussions a unit needs to have about its issues.

Egocentrism Bias

One final cognitive bias that we exhibit frequently, and that can afflict academic units, is Egocentrism Bias: the tendency to think your position is right, so naturally others will agree with you. This assumption can leave one unprepared for honest differences of opinion or (combined with other biases described above) prompt feelings that when people disagree it must be for questionable motives. Egocentrism often affects the judgments of faculty toward administrators, or vice versa, in academic units – and can be the source of serious conflict and misunderstandings.

How do we protect ourselves from Cognitive Biases?

Recognizing biases and the ways in which they pose challenges to healthy academic units is only the first step, but it is essential to overcoming them, engaging in intervention and repair, and fostering more open and informed discussions about a unit’s strengths and shortcomings.

Arming yourself with knowledge can help you to recognize cognitive biases in yourself or in others, and to begin to work against their effects. One of the simplest and most straightforward ways to avoid cognitive biases is to consciously train yourself to ask questions that challenge your own assumptions and those made by others. Sometimes, this means surrounding yourself with people you know will challenge you. Having someone on your team who is adept at playing “Devil’s Advocate” can help you make stronger decisions, because it prompts you to consider a wider range of factors and possibilities.

It is preferable to ask more questions to confirm understanding than to simply assume the information you have is correct. The more information you acquire and the more options you consider, the better equipped you will be to identify and choose the path you should take, rather than the one you want to take. The challenge lies in learning how to pose questions constructively, in a spirit of inquiry – and not deploy them as weapons to label, humiliate, or vanquish others. Of course, since one of the markers for cognitive biases in action is an unwillingness to accept questions, it can take some practice and tact to cultivate the mindset and the skill required to ask questions that advance—not escalate—any complex discussion.

There are many reasons people often experience self-doubt and hesitation when it comes to asking questions. Outside certain kinds of formal settings where it is expected (an academic presentation, let’s say), there are social norms against skeptical questioning, which is often seen as aggressive. In politics, questions – for example, from reporters – are often characterized as “disrespectful” or “hostile,” especially when they seek out uncomfortable or inconvenient facts. Given these larger social dynamics in our culture, questions can be viewed as power plays, acts of dominance, or as microaggressions. The intention – or perceived intention – behind the question, the context, and the relative positions, roles, and status of the questioner and questioned can all reinforce these perceptions. And so we often see, even in academic settings that are supposed to be about the free and open exchange of ideas, a certain laissez-faire tolerance of the views and opinions of others, even when we believe them to be seriously misguided, or even dangerous.

More prosaically, asking questions of others can be awkward, whether because of concerns about looking uninformed or foolish, an aversion to pestering others, or not wanting to appear to disagree. Consider the alternative, though: without asking questions to confirm information, intentions, and events, we tend to make assumptions, and this leads quickly to trouble. As we often say in our project group when trying to work through complex issues, “mind-reading is a highly imperfect form of communication.”

So, if questions framed poorly or posed with ill intent are counterproductive, what kinds of questions invite the type of self-reflection that can begin to uncover confirmation bias, self-deception, or an unwillingness to consider the possibility of being wrong? And, if we are to engage others in this fashion, what does that commit us to, in terms of reciprocity?

How one engages with other people about difficult issues depends on the respective roles of each person in the interaction: How you approach a subordinate will be very different from how you approach a peer or a supervisor. Are you approaching a group of people, or just one? Are you peers, or is there a power discrepancy between you? These factors will often affect how questions are perceived no matter how carefully they are worded.

In the context of a troubled department, members may react in diverse ways to avoid having to accept responsibility: attributing worse motives to others than to themselves; seeing in the actions of others unprofessional conduct but not recognizing it in themselves; selectively citing examples to make problems look more serious (or less serious) than they are; and so on.

One tendency that often leads people into trouble is to assume that there are always demonstrably right and wrong choices to make and outcomes to reach in dealing with difficult situations. Unfortunately, the world is rarely so simple, and many difficult situations have no clear resolution. Characterizations of the positions of others as right or wrong, correct and incorrect, are powerfully charged and often encourage defensiveness that hampers productive discussions. As a result, one of the least effective approaches is to begin any exchange with the expectation of convincing the other person, or persons, that they are “wrong” and you are “right” – even when (or especially when) you strongly believe that they are “wrong” and you are “right.” 

A more useful approach is to think in terms of “better” versus “worse” as opposed to “right” and “wrong.” Seek interventions that move things further along the spectrum towards the “better,” rather than seeking an ideal. If you can bring others far enough along to begin to see the possibility of flaws or holes in their positions, they may make the rest of that journey themselves by starting to consider other options, and they will be even more likely to do so if they can do it without appearing to “lose.”

Set an Example

As a leader, establishing a culture of encouraging questions can help to inoculate your unit against many of the most common and pernicious cognitive biases. Gathering more information and additional perspectives is almost never a bad thing in preparing to make decisions, and if that is the tone you set as the leader, then that is the model that the people around you will be more likely to adopt. Take care that your language does not exacerbate ideological or other divisions. When discussing how to improve and repair dysfunctional units, articulate what we can do together to move things along that spectrum towards “better,” rather than focusing on the actions or blame of individuals.

Another useful tactic is to incorporate “third point” perspectives to focus on, so that the lens of attention is not on any one person or group. If a subunit within your department has an inefficient or ineffective process, demanding an explanation for “how they could do something so stupid!?” is likely to elicit defensive reactions, increase reluctance to change, and even hinder acknowledgement that change is needed. Pointing to external data, a report, or even an environmental or institutional threat (e.g. competition from another unit), and using it to invoke common goals, can reinforce that this is a process among colleagues with shared interests.

An AUDiT review can serve this purpose by surfacing shared concerns that might otherwise be left unspoken, or by citing data that highlight objective conditions that are not in themselves subject to dispute – even if the choice of what to do about them might be.

Another effective approach can be to provide an example of another institution’s methods or systems, and to ask colleagues to explore their strengths and weaknesses. In some cases, the act of simply explaining such differences is powerful enough to demonstrate their benefits and drawbacks, and because this is (initially, at least) talking about others, it raises potential issues in a manner that doesn’t point fingers at anyone in particular or assign blame internally. Gathering data and information on how other institutions handle issues can also help illuminate local habits rooted in “that’s how we’ve always done it” mentalities.

Sometimes You Must Be Blunt

Of course, you can take all the measures in the world to be tactful and non-confrontational in how you approach these issues, and find that the message is still not getting through. On these occasions, it may become necessary to be more straightforward: Remember that it is possible to be direct without being rude or cruel. Take the time to think about precisely what you want to say, and the points you want to convey. Make sure to have data or materials with you to support your conclusions and ideas concretely, so that they cannot be dismissed as misinformed opinion.

The ultimate goal in most of these situations is to get people to step outside their box, and to see things from a different perspective, even if only briefly. A narrow perspective is one of the most common causes of virtually every kind of cognitive bias, and those biases then strengthen our conviction that the only correct perspective is our own. This is a vicious and damaging feedback cycle that can be challenging to interrupt. At times, all that can be done, at least in the short term, is to draw attention to a point of contention, an alternative option, or a means of improvement. This first step of realizing that the status quo is not inevitable can be a starting point for the investigation of further change.

Whether you are dealing with just one especially intractable individual, or a larger group of people who are misinformed, the idea is to get people moving in the same direction, toward a shared goal of “doing better.” The problems facing a unit leader in grappling with dysfunction can be myriad and daunting, so it is crucial to avoid the trap of trying to sort out who is “right” and who is “wrong.” These situations are rarely cut and dried, and even if there are relatively clear lines of division, pointing that out isn’t usually productive and can serve to deepen conflict.

The kind of leader who is most successful in these situations is one who works to maintain a “big picture” perspective in discussions and who can project the idea that, right or wrong, all are presumed to share the goal of moving the department back to a more vibrant and productive state. Doing so involves eliminating divisions of “us versus them”; dispelling the idea that throughout all the chaos, somebody was right and somebody was wrong; and finding a common interest for everyone to strive towards. When the conversations taking place start to become more about what “we” can do to arrive at a place that is better for all of us, rather than what “he” or “she” needs to do to stop mucking it up for everyone else, then you will know you are on a better track forward.

Understanding how cognitive biases can affect you personally is an ongoing process of self-evaluation and assessment. While critical self-reflection can help us to recognize these processes at work, they never go away completely. Complacency – thinking you are immune to these effects – can itself lure you into errors of cognition. Protecting yourself from bias requires an open mind, curiosity, and constant self-awareness.
