William Ronco, Ph.D., Biotech Leadership Institute
This second part of a four-part series describes the recurring, predictable patterns evident in science project teams’ ineffective decision-making.
Dawn Breaks On Marblehead?
We often use the “Dawn breaks …” expression here in Massachusetts to describe the behavior of our politicians as they—usually painfully slowly—begin to grasp key issues. Though it is humbling, we can use the same expression to describe scientists’ efforts to understand the recurring, predictable patterns of ineffective communications that afflict our project teams.
It is both alarming and useful that the destructive patterns science project teams follow when they make mistakes are predictable. It’s alarming because teams make the mistakes often enough, and the mistakes are clear enough, that it’s possible to predict and notice patterns at all. Yet it’s also helpful that patterns exist, because noticing and identifying them provides the necessary first step for addressing them.
The Six Destructive Patterns
1. Groupthink occurs when groups make decisions driven more by an urge to reach agreement than by their efforts to objectively analyze data and apply critical analysis.
It may seem ironic that scientists who take pride in their objectivity and analytical skills should fall prey to this affliction. It may also seem silly that intelligent people could be intimidated, swayed by a majority, or talked out of their own ideas. Groupthink, however, occurs in a wide range of groups, even those consisting of people with excellent educations, training, extensive experience, and free will.
The initial research that provided the case material for identifying the groupthink concept also came from people usually known for their objectivity and analytical skills. Psychologist Irving Janis first popularized the term “groupthink” in his studies of military, defense, and intelligence officers’ flawed group decision-making in the failure to anticipate the Japanese attack on Pearl Harbor and in the doomed Bay of Pigs invasion of 1961. Others have used the term to describe the behavior of groups and organizations in the Challenger space shuttle disaster.
2. Satisficing occurs when groups reach decisions that get a job done but fail to achieve optimal results. Economist Herbert Simon originally coined this term to describe human behavior that opts to take action based on simply getting a job done, not aiming for or achieving optimal results. Researchers have since applied the term to group behavior because it often describes that behavior accurately.
We’ve observed science project groups become vulnerable to satisficing when they face tight schedules or have limited time for the issues they’re attempting to address. Project teams’ discomfort with disagreement also seems linked to satisficing: not quite knowing what to do when someone expresses a point of view that differs from the majority’s, teams too often opt to make a decision just to be able to move on.
3. Abilene Paradox occurs when a group makes a decision that none of its individual members agrees with or intends to support. Group theorist Jerry Harvey coined the term after he and his family suffered through an uncomfortable drive to Abilene, a drive they all went along with but really didn’t want to take.
The trip to Abilene is an extreme version of satisficing, usually involving elements of groupthink as well. Though it occurs less frequently than groupthink, it’s worth including in this list because it makes a particularly clear case for group members to be wary, even suspicious, rather than relieved, when they reach a decision quickly and easily.
5. Stalemate occurs when groups give up working with differences of opinion. It’s one of the outcomes people fear, and that fear contributes to groupthink and satisficing.
The problem is not so much that groups reach a stalemate as that they do so without devoting adequate time and intelligence to working with disagreements. We’re often struck by how quickly project teams fall prey to this problem, how rapidly they “agree to disagree” rather than continuing to explore options, alternatives, new combinations of opinions, and information.
5. Rigid Individual Roles drive group participants to make sure the whole group hears their perspective and benefits from the insights of their discipline. Of course it’s useful and important for groups, especially science groups, to hear from and work with the input of their members’ different disciplines.
One problem that accompanies rigid individual roles, however, is the group member’s blindness to issues that impact the whole group. The statistician who focuses overwhelmingly on making sure that the project team hears the essential statistical insights is not likely to notice that the safety and regulatory participants are not fully engaged in the discussion.
6. Getting To Yes. Of course the Getting To Yes book and approach have helped many people devise much more effective resolutions in difficult negotiation situations. However, for many science project teams, it’s important to get beyond yes. Yes may be a good start, but it may also derail the team from discovering a better solution. “Getting To Optimal” doesn’t hold the cachet of “Getting To Yes,” but it more accurately describes the goal science project teams need to achieve.
Overall, we need to become as skilled at group decision-making and problem solving as we are at individual problem solving. We must become as competent at working with complex group disagreements as we are at individually analyzing complex data.
Part 1 of this series described four shifts in focus that clarify group problems. Be sure to check out parts 3 and 4, which outline strategies, skills, and teambuilding that improve project team performance.
Director of the Biotech Leadership Institute William Ronco, Ph.D. ([email protected]), consults on leadership, communications, team, and partnering performance in pharmaceutical, biotech, and science organizations.