For example, Jonathan Haidt ("The Righteous Mind: Why Good People Are Divided by Politics and Religion") and his colleagues have run a series of experiments in which they ask people to imagine a variety of scenarios in which the subjects do disturbing things but harm no one in the process, such as “imagine eating your dead pet dog,” “imagine cleaning your toilet with your nation’s flag,” or “imagine someone having sex with a dead chicken before cooking it for dinner.” In almost every case, his subjects felt immediate disgust. But when asked why, they could seldom provide a reason. They just did.
What’s more, Haidt has run these experiments in a variety of settings, from the slums of Brazil to the universities of the U.S., and he has found that across cultures, people share similar moral instincts. Of course, he had to alter the scenarios to fit the cultures in which they were presented, but he and his colleagues have consistently found that people in different cultures react similarly to analogous scenarios. This suggests that our moral intuitions are prior to culture, not products of it; they are deeply embedded within our psyches, something with which we’re born. This is not to say that there’s no moral variation across cultures. There is. But it appears that most of us are born with similar moral intuitions that are then modified through our interactions with the societies in which we live and move and have our being. As David Brooks notes:
Jonathan Haidt, Jesse Graham, and Craig Joseph have compared these [inclinations] to the taste buds. Just as the human tongue has different receptors to perceive sweetness, saltiness, and so on, the moral modules have distinct receptors to perceive certain classic situations. Just as different cultures have created different cuisines based on a few shared flavor senses, so, too, have different cultures created diverse understandings of virtue and vice, based on a few shared concerns (Brooks, "The Social Animal," p. 286).
In fact, Haidt and his colleagues have developed a theory (Moral Foundations Theory) that identifies six moral intuitions: (1) care/harm, (2) fairness/cheating, (3) liberty/oppression, (4) loyalty/betrayal, (5) authority/subversion, and (6) sanctity/degradation. I've discussed these previously ("Miley Cyrus and Moral Outrage"), so I won't rehash them here. Briefly stated, what their research has found is that the blank slate is a myth; all of us are born with similar moral instincts, which are then modified by the cultures in which we live.
Why is this? It’s largely due to the way we think. As I've noted in previous posts ("Thinking Fast, Thinking Slow"), our minds engage in two types of thinking: one intuitive, one reflective; one instinctive, one deliberate; one fast, one slow. Most of the time, however, we react instinctively without reflecting on what we’re doing. And a lot of the time this is a good thing. As we listen to others speak, we don’t have to think about every word they utter. If we had to, we’d be incapable of carrying on conversations. Likewise, when we’re driving and someone cuts us off or slams on their brakes, we act instinctively. If we had to abstract away from what was happening, we wouldn’t last two seconds on our nation’s highways. This is not to imply that our reflective self never plays a role; it does, and it often keeps us in check. But reflective thinking is hard work and takes a lot of mental energy, which is why, a lot of the time, we operate on autopilot, unaware of how our unconscious guides our behavior.
Unfortunately, our intuitive self doesn't always act the way we'd like it to. We're a mix of selfish and moral instincts, and the former often carry the day. All is not lost, however. We may be guided more by our selfish instincts than our moral ones, but as Aristotle (and the Buddha) argued long ago, we can cultivate our moral instincts through repetition and practice. This isn't rocket science. Academics, artists, and athletes have known this for centuries. We may be born with innate intellectual, musical, or athletic ability, but if we don't practice, if we don't turn our talents into habits, into virtues, we'll never become great academics or artists or athletes.
So it is with the moral life. However, as Aristotle also pointed out, we can't know what practices to cultivate without first knowing what goals we seek. We first have to consider our ends and purposes before we can know what is right and good and just. Put more simply, we can’t know the right thing to do without a prior idea of what constitutes the good. Take health care, for instance. If the ultimate goal is to make a profit, then it’s perfectly legitimate to limit the time that doctors spend with their patients so that they can see as many as possible. But if the ultimate goal is to provide quality health care, then the right thing to do is to ensure that doctors spend as much time with their patients as needed. In short, our goals, what we consider to be good, drive what we consider to be right and just.
Moreover, as Michael Sandel reminds us ("Justice: What's the Right Thing to Do?"), as hard as we try, we can’t separate our beliefs about right and wrong from the social circles in which we are embedded; we cannot detach our notions of what is good from the moral claims of the communities to which we belong. Complete objectivity and complete neutrality are noble goals, but they’re quixotic ones, which is why we can’t derive principles of justice apart from a prior conception of the good. Unfortunately, most political and moral discourse doesn't focus on what is good but on what is just, which is why many of us talk right past one another.
But even recognizing that we should begin with what constitutes the "good" rather than what is "right" doesn't guarantee that we'll reach agreement on various hot-button issues. That is because our moral communities often disagree about what constitutes the "good." Nevertheless, it is the place to start. We need to attend to the various conceptions of the purposes and ends of the good life. Does this mean that by deliberating about the ends we serve we can resolve every issue that comes before us? Of course not, but as Sandel notes:
A just society can’t be achieved simply by maximizing utility or by securing freedom of choice. To achieve a just society we have to reason together about the meaning of the good life, and to create a public culture hospitable to the disagreements that will inevitably arise (Sandel, "Justice," p. 261).
Reasoning together requires, however, that we respect the opinions of others. Calling them silly, ignorant, morally bankrupt, or stupid accomplishes nothing other than making us feel good about ourselves (self-righteousness can be emotionally satisfying but is generally unhelpful in building a civil society). Instead, we need to take the approach advocated by Jonathan Haidt in his essay, "What Makes People Vote Republican?" In it, he begins by summarizing the typical explanations that psychologists, most of whom are political liberals, have offered for why some people are conservatives:
Conservatives are conservative because they were raised by overly strict parents, or because they are inordinately afraid of change, novelty, and complexity, or because they suffer from existential fears and therefore cling to a simple worldview with no shades of gray. These approaches all had one feature in common: they used psychology to explain away conservatism. They made it unnecessary for liberals to take conservative ideas seriously because these ideas are caused by bad childhoods or ugly personality traits. I suggested a very different approach: start by assuming that conservatives are as sincere as liberals, and then use Moral Foundations Theory to understand the moral matrices of both sides ("The Righteous Mind," p. 164).
What a novel idea. Begin by "assuming that conservatives are as sincere as liberals." Of course, such advice cuts both ways. Conservatives will also have to come to the table assuming that liberals are as sincere as conservatives. Will it happen? I doubt it, at least not among the partisans on both sides who are so convinced that they're right that they can't see past their own ideological blinders (it must be great being them). But I do have hope for the rest of us, who, somewhere along the way, learned that for now we only see through a glass darkly (1 Cor. 13:12).