I hugely admire Prof Dan Gilbert's research. You will see numerous references to his intriguing research on the nature of happiness on my website, and you can watch a fantastic TED talk given by him on my videos page. His latest research throws some light on a recent debate sparked by the publication of Steven Pinker's latest book, "Enlightenment Now." In this book, Pinker reviews the core values of the 18th-century Enlightenment, with its emphasis on Science, Reason, Progress and Human Rights, and asks what they achieved. He demonstrates how life has improved on every single measure as a result of this scientific revolution, including health, wealth, free time and leisure, education, human rights, peace and fair government.
Pinker does not suggest that progress is linear; it traces a jagged line. Nor does he suggest that we live in a perfect world; we still face huge problems like global warming. Nor does he suggest that progress is complete or a done deal. What he is suggesting is that life has improved for the vast majority of people globally, at varying rates, and will continue to do so if we remain loyal to the spirit of the Enlightenment. The book is a fascinating read and challenges many of the 'given' beliefs that we hold about the world today. Whilst I did not agree with all of Pinker's positions, it really made me think hard about why I disagreed. For me, this is the sign of a great book.
In my spare time I am a keen family historian, and when charting the lives of one's ancestors it is impossible to ignore the differences that unfold between the struggles of each generation. Very few people today die at the age of 36 from exhaustion, as my great-great-grandmother did. Having charted so many past lives, I find the basic tenet of the book rather self-evident. But this is not everyone's response. What was interesting in light of the publication of 'Enlightenment Now' was the huge reaction against the idea that society has progressed. People simply cannot accept that life is better today in a world of medicine, engineering, technology, human rights, travel, social media, mass literacy, etc.
Pinker discusses this reaction in the book, and I notice it too when discussing the book with peers and friends. People simply cannot accept the huge benefits that science has brought to humanity. Why is this? Although the world is far from perfect by virtually any measure -- whether poverty rates, violence, access to education, racism and prejudice, or any number of others -- it continues to improve. Why, then, do polls consistently show that people believe otherwise?
The answer, Daniel Gilbert says, may lie in a phenomenon called "prevalence-induced concept change." In a series of new studies, Gilbert, the Edgar Pierce Professor of Psychology, his postdoctoral student David Levari, and several other colleagues show that as the prevalence of a problem is reduced, humans are naturally inclined to redefine the problem itself. The result is that as a problem becomes smaller, people's conceptualizations of it become larger, which can lead them to miss the fact that they've solved it. The studies are described in a paper in the June 29 issue of Science.
"Our studies show that people judge each new instance of a concept in the context of the previous instances," Gilbert said. "So as we reduce the prevalence of a problem, such as discrimination for example, we judge each new behavior in the improved context that we have created."
"Another way to say this is that solving problems causes us to expand our definitions of them," he said. "When problems become rare, we count more things as problems. Our studies suggest that when the world gets better, we become harsher critics of it, and this can cause us to mistakenly conclude that it hasn't actually gotten better at all. Progress, it seems, tends to mask itself."
The phenomenon isn't limited to large, seemingly intractable social issues, Gilbert said. In several experiments described in the paper, it emerged even when participants were asked to look for blue dots. "We had volunteers look at thousands of dots on a computer screen one at a time and decide if each was or was not blue," Gilbert said. "When we lowered the prevalence of blue dots, we found that our participants began to classify as blue dots they had previously classified as purple."
Even when participants were warned to be on the lookout for the phenomenon, and even when they were offered money not to let it happen, the results showed they continued to alter their definitions of blue.
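The mechanism Gilbert describes, judging each new instance against recent instances rather than against a fixed standard, can be sketched as a toy simulation. Everything below is my own illustrative assumption, not the researchers' actual stimuli or model: hues run from 0.0 (clearly purple) to 1.0 (clearly blue), the judge's threshold is simply the moving average of recently seen hues, and a fixed borderline dot (hue 0.45) is probed occasionally to see how its classification shifts when blue dots become rare.

```python
import random

def simulate(p_blue, trials, window=50, seed=0):
    """Classify each dot as 'blue' if its hue exceeds the mean hue
    of the last `window` dots -- a purely context-relative judgment.
    Returns the fraction of borderline probe dots judged blue."""
    rng = random.Random(seed)
    recent = []               # hues of recently seen dots
    probe_blue_calls = 0
    probes_seen = 0
    for _ in range(trials):
        if rng.random() < 0.1:
            hue = 0.45        # fixed borderline probe dot
            probe = True
        else:
            # blue dots cluster at high hues, purple dots at low hues
            hue = rng.uniform(0.6, 1.0) if rng.random() < p_blue else rng.uniform(0.0, 0.4)
            probe = False
        threshold = sum(recent) / len(recent) if recent else 0.5
        judged_blue = hue > threshold
        if probe:
            probes_seen += 1
            probe_blue_calls += judged_blue
        recent.append(hue)
        if len(recent) > window:
            recent.pop(0)
    return probe_blue_calls / probes_seen

high = simulate(p_blue=0.5, trials=2000)   # blue dots common
low = simulate(p_blue=0.06, trials=2000)   # blue dots rare
print(f"borderline dot called 'blue' {high:.0%} of the time at 50% prevalence")
print(f"borderline dot called 'blue' {low:.0%} of the time at 6% prevalence")
```

When blue dots are common, the running average hue sits near the borderline dot, so it is usually called purple; when blue dots become rare, the average falls and the very same dot is almost always called blue. The concept of "blue" expands exactly because the context improved.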
Another experiment showed similar results using faces. When the prevalence of threatening faces was reduced, people began to identify neutral faces as threatening.
Perhaps the most socially relevant of the studies described in the paper, Gilbert said, involved participants acting as members of an institutional review board, the committee that reviews research methodology to ensure that scientific studies are ethical.
"We asked participants to review proposals for studies that varied from highly ethical to highly unethical," he said. "Over time, we lowered the prevalence of unethical studies, and sure enough, when we did that, our participants started to identify innocuous studies as unethical."
In some cases, Gilbert said, prevalence-induced concept change makes perfect sense, as in the case of an emergency room doctor trying to triage patients. "If the ER is full of gunshot victims and someone comes in with a broken arm, the doctor will tell that person to wait," he said. "But imagine one Sunday where there are no gunshot victims. Should that doctor hold her definition of 'needing immediate attention' constant and tell the guy with the broken arm to wait anyway? Of course not! She should change her definition based on this new context."
In other cases, however, prevalence-induced concept change can be a problem.
"Nobody thinks a radiologist should change his definition of what constitutes a tumor and continue to find them even when they're gone," Gilbert said. "That's a case in which you really must be able to know when your work is done. You should be able to see that the prevalence of tumors has gone to zero and call it a day. Our studies simply suggest that this isn't an easy thing to do. Our definitions of concepts seem to expand whether we want them to or not."

Aside from the obvious questions it raises about how we might go about fixing problems both large and small, the studies also point to issues of how we talk about addressing those problems.
"Expanding one's definition of a problem may be seen by some as evidence of political correctness run amuck," Gilbert said. "They will argue that reducing the prevalence of discrimination, for example, will simply cause us to start calling more behaviors discriminatory. Others will see the expansion of concepts as an increase in social sensitivity, as we become aware of problems that we previously failed to recognize."
"Our studies take no position on this," he added. "There are clearly times in life when our definitions should be held constant, and there are clearly times when they should be expanded. Our experiments simply show that when we are in the former circumstance, we often act as though we are in the latter."
Ultimately, Gilbert said, these studies suggest that there may be a need for institutional mechanisms to guard against prevalence-induced concept change. "Anyone whose job involves reducing the prevalence of something should know that it isn't always easy to tell when their work is done," he said. "On the other hand, our studies suggest that simply being aware of this problem is not sufficient to prevent it. What can prevent it? No one yet knows. That's what the phrase 'more research is needed' was invented for."
Materials provided by Harvard University.
David E. Levari, Daniel T. Gilbert, Timothy D. Wilson, Beau Sievers, David M. Amodio, Thalia Wheatley. Prevalence-induced concept change in human judgment. Science, 2018; 360 (6396): 1465 DOI: 10.1126/science.aap8731