How implicit bias shapes our thoughts on climate change, nutrition, and Reddi-wip


Quiz time! Here are three food-related news stories from the past couple of weeks. Read them and choose your reaction. Don’t think too hard, and don’t feel constrained by the multiple-choice format. Just react.

#1. Conagra subsidiary Reddi-wip announces a new blend of its product. The cream is free from bovine growth hormone; the flavorings are “all natural.” (Of course, as the label announces, it still contains sugar, high-fructose corn syrup, and carrageenan.) The company says it “adds a whoosh of wonder, celebration and joy to any occasion and any treat.” Your reaction?

a) Great. They’re learning.

b) Terrible. They’re using one OK ingredient to trick people into not noticing the rest of the label.

c) No big deal. Who cares about canned whipped cream? I don’t even know anybody who uses that crap.

#2. Low-calorie sweeteners, according to three recent studies, appear not to make people gain weight, as has been theorized. Reactions:

a) (In your best Donald Trump voice) Wrong!

b) Doesn’t matter. Artificial sweeteners are poison.

c) This is kind of hopeful. It might give people who need to cut down on sugar one less thing to worry about.

#3. Much of our discussion of global climate change is driven by mathematical models that predict the impact of rising carbon dioxide levels. But how reliable are the models? A group of scientists tweaked one such model so that it assumed a bit more instability in the huge current known as the Atlantic Meridional Overturning Circulation (AMOC). The result (though it would take a considerable amount of time to unfold): a new ice age, not at all what most of us were expecting. Your reaction?

a) The results are a politically motivated lie.

b) The results might be true, but they shouldn’t be publicized because they play into the hands of climate deniers.

c) Let’s start laying in blankets as well as building dikes. 

At this point, I’m supposed to tell you which answer was right, or at least which one you were supposed to choose. But I’m not going to.

In some cases, I don’t know which answer is right. Some, I can’t know. Has diet soda really been exonerated from the charge of causing weight gain? Maybe, but there are studies that disagree. Indeed, one of the new studies replicated the work of a previous researcher and got different results. At least one of the studies was funded by an organization that receives funding from big food companies—though that doesn’t necessarily mean it came to incorrect conclusions. The argument isn’t over, and probably won’t be any time soon.


In other cases, the reactions aren’t necessarily mutually exclusive. It’s not unreasonable to believe that artificial sweeteners are a poison, but that for some people they might be preferable to consuming sugar. And it’s not just possible, but downright likely that the makers of Reddi-wip are simultaneously responding to consumer desires and using the quality of their cream to distract attention from the other ingredients in their product. (On the other hand, sorry, but it’s not true that your friends don’t use Reddi-wip. They spray it directly into their mouths late at night. They just don’t tell you.)

If there’s no right answer, why the quiz? To me, the point is to think not about right answers, but about which one leaps immediately to mind. We’ve all heard of implicit bias: the idea that we can consciously hold one opinion (for instance, about a racial or ethnic group) but go on to act as if we held exactly the opposite point of view. I suspect there’s no practical way to eliminate implicit bias; we just have to learn to acknowledge and correct for it. When we’re addressing risky topics, we have to slow down and think, not just react, no matter how much we’d like to believe that we’re internally aligned and can trust our impulses.


And while you’re learning to get more mentally disciplined about race and ethnicity, it’s probably time to do the same thing with your thought processes about food—a field where everything is complex, and first reactions are almost always inadequate. Think of the climate-change example in the quiz. The questions it raises are deeply layered: Was the research conducted fairly and well? And because it’s more about testing a model than about accurately predicting the future, what should we carry away from it as we think about how to respond to climate change? The political side is a real concern, too. I don’t believe in keeping the truth from the public, but sometimes you have to be careful that the truth is delivered with the right kinds of explanations and caveats, or it can wreak havoc. If you can squeeze all those considerations into a first reaction, we need to start cloning you as quickly as possible: We need a lot more people like you.

As for the rest of us, we need to practice the art of thinking slow. The term, of course, comes from Daniel Kahneman and his remarkably revealing and helpful book Thinking, Fast and Slow. (If you haven’t read it, stop reading this piece immediately, obtain a copy, and get to it. You’ll thank me.)

To oversimplify a bit, Kahneman says that we have two pathways in our brains. One reaches quick-and-dirty decisions almost instantly. The other requires us to actually put the little gray cells to work: make calculations, seek evidence, consider logic. Our brains, being fundamentally lazy, prefer the fast channel, which means that we spend much of our time dealing with observations and conclusions that we’ve arrived at on autopilot.


The fast brain, contrary to what you might expect, generally works well, but it creates glitches. Kahneman, for instance, was drawn to his research by the observation that people routinely fell into characteristic errors in seat-of-the-pants statistical thinking. Here’s a classic example: Is a shy, withdrawn person more likely to be a librarian or a farmer? Most people choose librarian, which is statistically unlikely to the point of absurdity. There are ten times as many farmers as librarians, and thus a much higher likelihood that our wallflower plows fields rather than shelves books. If you guessed wrong, you more or less knew all the relevant facts, but you were misled by your picture of librarians. Something similar happens when you take the opinions you share with your group of friends and assume that the rest of the world shares them too, even though you’re well aware (or even proud) of the fact that your friends hardly represent the norm.

I’m not arguing that your implicit biases are incorrect. I’m not suggesting that there’s something wrong if you ultimately act in accordance with them. I’m certainly aware of the problematic relationship between biases and deeply held convictions. And I don’t believe slow thinking inevitably leads to paralysis, doubt, and inaction. Think of it this way: When you take an action based totally on your biases, you don’t quite know what you’re doing. When slow thought leads you to an awareness that things are complicated, you still don’t quite know. Maybe you’ll change your behavior. Maybe not. But at least you’ll know that your choice was arbitrary and tentative. It’s a completely unsatisfactory situation, as reality so often is. That’s how we make most of our decisions anyway, and there’s more integrity in acknowledging it.

It’s like all the journalists I knew back in Chicago used to say: “If your mother says she loves you, check it out.”

Here’s my favorite part. If you look up that quotation, you’ll probably learn that it was first uttered by A.A. Dornfeld, a legendary editor who was once played in the movies by Jimmy Stewart. I met Dornfeld one time and asked him about it.

“Never said it,” he told me.

I’m not sure exactly what that proves—but whatever it is, my first impression is that I believe it.

Patrick Clinton

Patrick Clinton is a long-time journalist and educator. He edited the Chicago Reader during the politically exciting years that surrounded the election of the city’s first black mayor, Harold Washington; University Business during the early days of for-profit universities and online instruction; and Pharmaceutical Executive during a period that saw the Vioxx scandal and the ascendancy of biotech. He has written and worked as a staff editor for a variety of publications, including Chicago, Men’s Journal, and Outside (for which he ran down the answer to everyone’s most burning question about porcupines). For seven years, he taught magazine writing and editing at Northwestern University's Medill School of Journalism.