WISER: GETTING BEYOND GROUPTHINK TO MAKE GROUPS SMARTER
AUTHORS: Cass R Sunstein and Reid Hastie
PUBLISHER: Harvard Business Review Press
PRICE: Rs 895
ISBN: 9781422122990
The idea of the devil's advocate is meant to formalize the commitment to the expression of dissenting viewpoints. In at least one well-known case, the approach appeared to work. As Irving Janis described, "during the Cuban missile crisis, President Kennedy gave his brother, the Attorney General, the unambiguous mission of playing devil's advocate, with seemingly excellent results in breaking up a premature consensus" - a consensus that might well have led to war. It is worthwhile to wonder whether and when similar assignments might have proved helpful with the presidents who followed Kennedy.
By their very nature, those assuming the role of devil's advocate are able to avoid the social pressure that comes from rejecting the dominant position within the group. After all, they are charged with doing precisely that. And because they are specifically asked to take a contrary position, they are freed from the informational influences that can lead to self-silencing.
Hidden profiles are a lot less likely to remain hidden if a devil's advocate is directed to disclose the information she has, even if that information runs contrary to the apparent consensus within the group. In groups that are at risk of hidden profiles, a devil's advocate should help a great deal. For groups that seek to get wiser, it would seem sensible to appoint devil's advocates.
So much for theory. Unfortunately, we cannot give a strong endorsement of this approach, because research on devil's advocacy in small groups provides mixed support. True, there is some evidence for the view that devil's advocates can be helpful. Many experiments do find that genuine dissenting views can enhance group performance.
But there is a difference between authentic dissent and a formal requirement of devil's advocacy; an assigned devil's advocate does far less to improve group performance. One reason is that any such requirement is artificial - a kind of exercise or game - and group members are aware of that fact. The devil's advocate can seem to be just going through the motions. And indeed, when an advocate's challenges to a group consensus are insincere, group members discount the arguments accordingly. At best, the devil's advocate facilitates a more sophisticated inquiry into the problem at hand.
Because arbitrarily selected devil's advocates are acting out a role and have no real incentive to sway the group's members to their side, they succeed in their assigned role even if they allow the consensus view to refute their unpopular arguments. Unlike a genuine dissenter, the devil's advocate has little to gain by zealously challenging the dominant view - and as a result such advocates often fail to vigorously challenge the consensus.
The lesson is that if devil's advocacy is to work, it will be because the dissenter actually means what she is saying. If so, better decisions should be expected. If not, the exercise will turn out to be a mere formality, with little corrective power. Considering the existing evidence, we suggest that it is a lot better for groups to encourage real dissent - by, for example, assigning roles to experts representing different knowledge sets or perspectives - than to appoint formal, artificial dissenters. Designating groups of dissenters composed of members who, before deliberation, sincerely favor different solutions can solve some hidden-profile problems.
Another method, related to the appointment of a devil's advocate but more effective according to existing research, is called red teaming. This method has been extensively applied to military teamwork, but it can be applied in a lot of domains, including business and government. Any implementation plan, if it has a high level of ambition, might benefit from red teaming.
Contrarian team
Red teaming involves the creation of a team that is given the task of criticizing or defeating a primary team's plans to execute a mission. There are two basic forms of red teams: those that play an adversary role and attempt to defeat the primary team in a simulated mission, and those given the same instructions as a devil's advocate, which is to construct the strongest case against a proposal or plan. Even an artificial role assignment can be powerful enough to produce substantial improvements, if the assignment is to more than one dissenter. It is as if having more than one dissenter provides social proof of the validity or at least the significance of the divergent views. Anxious people can also operate as the functional equivalent of red teams, and sometimes they enlist red teams to test worst-case scenarios.
Versions of this method are used in all branches of the military and in many government offices. (Important government regulations are sometimes evaluated with the help of informal red teams.) In industry, some firms offer red-teaming services to other companies, as in the case of "white-hat hackers" who are paid to attempt to subvert software security systems and to penetrate corporate firewalls.
Within law firms, there has been a long tradition of pretrials, or the testing of arguments with the equivalent of red teams. In important cases, such efforts can reach the level of hiring attorneys from a second firm to develop and present a case against advocates from the primary firm. Often these adversarial tests are conducted before mock juries so that the success of the primary and red-team arguments can be evaluated through the eyes of citizens similar to those who will render the ultimate courtroom verdict.
One size does not fit all, and the cost and feasibility of red teams will vary from one organization to another. But in many contexts, red teams are an excellent idea, especially if they are sincerely motivated to find mistakes and to exploit vulnerabilities and are given clear incentives to do exactly that.
Consensus is overrated. Leaders must encourage criticism, bad news, and suggestions about new directions, Sunstein tells Ankita Rai
In the book you write that group decision-making often fails to produce results because leaders conflate the pursuit of diverse ideas with the drive for consensus. Please explain.
Diversity of ideas is underrated. Consensus is overrated. Group members often have a lot of information, but groups as a whole don't obtain that information because their members do not tell their leaders what they know. One reason is that leaders sometimes value consensus more. They do not want, and tend to discourage, criticism, bad news, and suggestions about new directions.
The best leaders create a certain kind of culture, in which everyone in the group feels free to add new information and fresh perspectives. It can be fun to be part of that culture, to be sure, but firms (and governments) can benefit a lot if people do not silence themselves. The most successful businesses (and governments) make sure that they learn what they need - before they make decisions rather than after. In the technology industry, for example, companies prosper when they solicit a lot of ideas and creativity from their employees.
What is the biggest impediment to successful group decision-making?
Not learning enough. Many groups, for-profit and non-profit alike, start on a path that the leader really likes, or that a few people really like, and stay on that path even though it just isn't the right one. In a way, groups amplify the mistakes and biases of individual members.
Behavioural scientists have taught us a lot about individual mistakes. It turns out groups often aggravate those mistakes rather than correcting them. That's a real impediment to good decisions. Fortunately, leaders can do a lot to overcome that impediment.
Cass R Sunstein
Robert Walmsley University Professor, Harvard Law School
Reprinted by permission of Harvard Business Review Press. Excerpted from Wiser: Getting Beyond Groupthink to Make Groups Smarter. Copyright 2015 Cass R. Sunstein and Reid Hastie. All rights reserved.