
The Bathsheba Syndrome: When a Leader Fails

November 13, 2011

Another leader—no, an entire cadre of leaders—has been found to be a moral failure. Legal authorities have charged Jerry Sandusky, who retired as the defensive coordinator for the Penn State football team in 1999, with the sexual abuse of children whom he targeted through his involvement in the charitable organization The Second Mile. Additionally, a number of other administrators and leaders at Penn State University—the university’s president Graham Spanier, vice-president Gary Schultz, athletic director Tim Curley, and long-time football coach Joe Paterno—face charges or have been fired from the university because of their failure to take action when Sandusky’s crimes were brought to their attention. Time, research, and investigation will fully inform our judgment of who is guilty and who is innocent, but the indictment states that many at the university were aware of Sandusky’s crimes yet did not intervene as required by law and by moral standards.

Sandusky and the others join a long line of leaders who failed, in many cases miserably, to act in morally appropriate ways. University presidents find themselves in the news when someone discovers that they plagiarized their academic work. Executives’ decisions seem to reflect the expediencies of the situation rather than the application of principles of justice and ethics. Politicians do what they must to win, even if it means using dirty tricks and spreading innuendo and falsehoods. James MacGregor Burns (1978) argues that a leader’s most essential quality is his or her commitment to moral values, but it is an understatement to say that in many cases those in positions of authority do not act in ways that inspire ethical confidence.

The empirical evidence for the moral problems of leadership is quite old and is documented in history books, religious and philosophical texts, literature, and art. Consider, for example, the biblical story of David and Bathsheba. King David is smitten by Bathsheba, the wife of one of his soldiers, and he seduces her. David compounds his moral failure with one misdeed after another, until he eventually orders Bathsheba’s husband killed—David is corrupted by his power. A leader acting immorally is not, apparently, a new phenomenon in human societies.

Recent work in social psychology suggests that this Bathsheba syndrome (Ludwig & Longenecker, 1993)—the moral corruption of those who are powerful—has many causes, but two are particularly potent: the psychological impact of gaining power over others and the tendency for groups and organizations to look the other way when their leaders act immorally. In the last 10 years or so, dozens of researchers have confirmed that simply reminding people that they hold a position of power—even asking them to remember a time when they felt powerful in the past—triggers a number of psychological changes. Some of these changes help a leader lead more effectively, for gaining power increases one’s level of activity, augments energy levels, and increases awareness of environmental constraints and resources. But power has a dark side. Powerful people are proactive, but in some cases their actions are risky, inappropriate, or unethical. Simply being identified as the leader of a group prompts individuals to claim more than their share, for they believe the leadership role entitles them to greater rewards (Forsyth, Zyzniewski, & Giammanco, 2002). When individuals gain power, their self-evaluations grow more favorable, whereas their evaluations of others grow more negative. In research conducted in our lab we have found that individuals, when they feel powerful, are more likely to surround themselves with “yes-men”: when building a team they prefer to recruit individuals who agree with them from the outset rather than those who may challenge them.

Gerben A. van Kleef and his colleagues demonstrated the pernicious effects of power by arranging for two people to discuss an experience that had caused them emotional pain and suffering. During and after the conversation the researchers tracked, using both physiological measures and self-reports, participants’ feelings of compassion as they listened to their partner’s outpouring of emotional angst. As expected, people who did not describe themselves as powerful and influential grew more and more distressed as their partners became upset relating their experience—their emotions were relatively synchronized. Powerful people, in contrast, did not respond emotionally to their partner’s distress, and their levels of compassion declined as their partner became more troubled (see the Figure). These findings suggest that power may insulate the powerful from feeling troubled by the harm they inflict on others.

Figure. Compassion and Power: the relationship between power and compassion.

These studies of power’s impact on people provide insights into the psychologically corruptive effects of power. Other studies, in contrast, examine how groups and organizations respond when the powerful misbehave. A number of studies have shown, repeatedly, that leaders are not held to higher standards than others in the organization—but to lower standards. Hollander’s work on what he termed idiosyncrasy credit indicated that individuals who reached high levels of authority in organizations were granted more lenience in terms of their behavior—they were viewed as having earned the right to deviate from principles others must heed (see Hollander & Offermann, 1990). Abrams and his colleagues (2011) more recently reported evidence of a transgression credit effect that supports a double standard at work in groups and organizations: the same negative behavior earns rebuke and punishment when performed by an employee, but is ignored or even praised when enacted by a leader.

But no study in social psychology makes this point better than the field’s most famous, if controversial, piece of empirical work: Milgram’s (1963) study of obedience to authority. Milgram created an organization, in miniature, by assigning participants to the role of teacher in a feigned learning experiment. The subject’s task: deliver a painful electric shock to another subject each time he made a mistake. The shocks were not real, but the subjects thought they were. Milgram found that most people were highly obedient—they delivered the painful shocks—but he also discovered that participants rarely questioned the moral authority of the leader. Some refused to follow orders, but no one—not one—rose up and freed the other subject. People will rebel and overthrow a morally corrupt leader, but such actions are exceptions. As Bazerman and Tenbrunsel (2011) explain in their book Blind Spots (p. 81):

Across most major scandals of the last decade, many people—members of boards of directors, auditing firms, rating agencies, and so on—had access to the appropriate data and should have noticed and acted on the unethical behavior of others. Yet they did not do so, at least in part because of the psychological tendency not to notice bad data that we would prefer not to see.

Heffernan (2011) calls this tendency willful blindness.

The moral failures of individuals, groups, and organizations are exceptions—most people, and most organizations, act in morally commendable ways. But these exceptions point to the complexity of morality. Morality is in part a characteristic of an individual, for values, principles, ethical ideology, and personal beliefs shape our choices when we confront temptation and crisis. Morality is also, however, an interpersonal process, for even the most morally upright individual may be found wanting when he or she becomes ensnared in a group or organization that tolerates wrongdoing for the sake of results or reputation. It is true that leaders face more temptations than the rest of us because they often have special privileges, which may make them feel that they are above others and not subject to the same rules. But when subordinates treat leaders with such deference that they tolerate actions that should never be allowed, they make it easier for leaders to believe that they are outside the boundaries of our moral community.

References

Abrams, D. (2011). Transgression credit, or: Moral blindness – its part in our downfall. Paper presented at the annual meeting of the Society for Experimental Social Psychology, Washington, DC.

Bazerman, M. H., & Tenbrunsel, A. E. (2011). Blind spots. Princeton, NJ: Princeton University Press.

Burns, J. M. (1978). Leadership. New York: Harper.

Forsyth, D. R., Zyzniewski, L. E., & Giammanco, C. A. (2002). Responsibility diffusion in cooperative collectives. Personality and Social Psychology Bulletin, 28, 54-65.

Heffernan, M. (2011). Willful blindness. New York: Walker & Co.

Hollander, E. P., & Offermann, L. R. (1990). Power and leadership in organizations: Relationships in transition. American Psychologist, 45, 179–189.

Ludwig, D. C., & Longenecker, C. O. (1993). The Bathsheba syndrome: The ethical failure of successful leaders. Journal of Business Ethics, 12(4), 265–273.

Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371–378.

van Kleef, G. A., Oveis, C., van der Löwe, I., LuoKogan, A., Goetz, J., & Keltner, D. (2008). Power, distress, and compassion: Turning a blind eye to the suffering of others. Psychological Science, 19(12), 1315-1322.
