I've found myself lately reading a number of books about how humans think - Thinking, Fast and Slow by Daniel Kahneman, Switch by Chip and Dan Heath, A Whole New Mind by Daniel Pink, The Righteous Mind by Jonathan Haidt, and more.
I didn't set out with any particular goal in mind, though I guess I'm searching for some clue, some hint, some idea of how I can do a better job of influencing (or, as my husband would say, "manipulating") others into acting as better stewards of our future.
One of the commonly described fallacies in human reasoning (and I have learned to use the term "reasoning" very loosely!) is called the Fundamental Attribution Error (or, to save typing, "FAE") and I seem to see it everywhere I look.
Originally named by Lee Ross at Stanford (though described much earlier), this is the human tendency to overestimate the role that people's personalities play in their observable behavior while discounting situational factors. In other words, instead of considering the circumstances that may have led to someone's actions, we infer basic personality traits as the cause.
I first came across the concept in a book 20+ years ago. I can't for the life of me remember the title or author, and I'm pretty sure she didn't use the FAE terminology. But she described how people might look at a woman who hadn't shaved her legs and infer something about her politics or sexual orientation. Of course, either or both of those could be true. But she could also belong to a church that didn't believe in shaving one's legs. Or maybe she was traveling, forgot her razor, and hadn't had time to shop (been there). Or maybe she was just weary of the time it takes to adhere to social convention (been there, too). Or was too young and her parents wouldn't let her shave (yup). Or just got back from camping in Vermont (and that).
Incidents of FAE amused me then. Now it just pisses me off.
Normally, the term is used when describing how one individual presumes to judge another's character from observing her behavior in one situation. But the form I encounter frequently in my job is when some broad assertion is made about the motivations or beliefs of everyone in a group, organization or category because of one statement, action, or position. Now, a basic principle of sustainability initiatives is that no one knows everything and we must bring all parties to the table to find truly sustainable alternatives to how we live and conduct business. How the heck can you do that when at least one party is confident that if others disagree with them, the others must be evil?
I was speaking last year to a pretty conservative businessman (as it happens, not an EMCer) about sustainability, which he supported for good solid business reasons. When I asked him whether he felt it was important to preserve resources for future generations, he said, "Yeah, of course. But those environmentalists [air-quotes] - they don't care about people, only spotted owls." OK, so there are tree-huggers who do feel that way. But I can assure you that some of us self-professed "environmentalists" think that people are PART of the environment. I don't know - maybe it was a good thing he said that, because it was a wake-up call that I need to be careful about accidentally evoking broad generalizations in how I communicate. Still...
The FAE is related to another common human foible: the tendency toward Manichean* outlooks. This is working from the assumption that there are only two positions: black or white. Good or evil. With us or against us. Aside from the fact that it presents a nearly impenetrable psychological wall that is very hard to argue against, it just REALLY pisses me off. (And I don't know about you, but when I'm angry, I am not at my best at listening, empathizing, or persuading.)
Here's one I experienced personally. I was talking to someone regarding eWaste, an issue about which I am passionate. As an industry, we know we have to create less waste, find new materials so that any waste that is created is less harmful to the environment and human health, try to prevent it from ending up in landfill or being taken apart on kitchen stoves or in open-air acid baths, redirect more of the many non-renewable materials back into the supply chain, and find ways for the informal workers who have been subsisting on eWaste to get economic benefits without poisoning their children. Whew. It's hard. So I dared to wonder out loud as to whether banning the shipment of eWaste to developing economies would really solve the core problem, given the growing volumes of locally created eWaste in those countries. Know what he said? He said, "Companies that question this approach just want to make money by poisoning children."
Give me a little credit - I kept my mouth shut. (No, really!)
It's not just corporate types and NGOs that do it - politicians do it constantly. Perhaps one of the most succinct and quintessential examples of the FAE is the not uncommon claim that "People on welfare are lazy".
And here's the thing. I am human, too. I, like you and everyone else, would love to believe that I'm above these errors. But I'd be lying to myself. So would you.
Do me a favor. If you catch me committing the FAE, call me on it. Make me use another common foible - "confirmation bias**" - to actually broaden my thinking. Say to me, "Imagine that this behavior you are demonizing had been exhibited by someone that you love and respect. What sequence of events might have led him to do that? Is it even remotely possible that something similar could be happening here?"
And I'll do the same to you.
It might not work. But then, I wouldn't be in sustainability if I weren't an optimist.
_________________________________________________________________________________
*A term used to describe dualistic philosophies; derived from Manichaeism, a dualistic religious philosophy founded by a third century Persian prophet.
**Confirmation bias is the tendency to favor interpretation of evidence in support of already existing beliefs.
It's funny, Katie, how easy it is to forget in the heat of battle. My best bet, I've found, is not to respond to something that pushes a button until the next day. As long as I don't let it fester overnight (which can blow it out of proportion), I can then manage to step back a little and realize that there are alternative explanations.
Actually, what really works well for me is when someone ELSE makes the same judgment that I first leapt to; then in playing devil's advocate, I realize the fallacies in my own thinking!
Posted by: Kathrin Winkler | January 15, 2014 at 06:44 AM
Great post! I try to adhere to my "examine their motives" philosophy, which makes me stop and think before I form an opinion or react to someone. Comes in handy when I can remember it.
Posted by: Katie | January 15, 2014 at 06:39 AM