Think You Know Better? Here is Your Chance


It never ceases to surprise me, in training sessions and lectures, when I share well-thought-through, systematic, peer-reviewed research findings and individuals simply dismiss them out of hand, never having even considered the subject before. I have never been quite that confident myself. So I thought I would share this great opportunity for people to test their own subjective assessments against leading research studies that have yet to be run: here you can make your predictions about what you think will happen even before the studies begin.

"We've all had the experience of standing up to present a novel set of findings, often building on years of work, and having someone in the audience blurt out 'But we knew this already!,'" says Prof. Stefano DellaVigna, a behavioral economist with joint appointments in the Department of Economics and Berkeley Haas. "But in most of these cases, someone would have said the same thing had we found the opposite result. We're all 20-20, after the fact."

DellaVigna has a cure for this type of academic Monday morning quarterbacking: a prediction platform to capture the conventional wisdom before studies are run.

Along with colleagues Devin Pope of the University of Chicago's Booth School of Business and Eva Vivalt of the Research School of Economics at Australian National University, he's launched a beta website that will allow researchers, PhD students, and even members of the general public to review proposed research projects and make predictions on the outcome.

Their proposal, laid out in an article in Science's Policy Forum, is part of a wave of efforts to improve the rigor and credibility of social science research. These reforms were sparked by the replication crisis -- the failure to reproduce the results of many published studies -- and include mass efforts to replicate studies as well as platforms for pre-registering research designs and hypotheses.

"We thought there was something important to be gained by having a record of what people believed before the results were known, and social scientists have never done that in a systematic way," says DellaVigna, who co-directs the Berkeley Initiative for Behavioral Economics and Finance. "This will not only help us better identify results that are truly surprising, but will also help improve experimental design and the accuracy of forecasts."

Because science builds on itself, people interpret new results based on what they already know. An advantage of the prediction platform is that it would help better identify truly surprising results, even in cases where there's a null finding -- results that rarely get published because they typically aren't seen as significant, the researchers argue.

"The collection of advance forecasts of research results could combat this bias by making null results more interesting, as they may indicate a departure from accepted wisdom," Vivalt wrote in an article on the proposal in The Conversation.

A research prediction platform will also help gauge how accurate experts actually are in certain areas. For example, DellaVigna and Pope gathered predictions from academic experts on 18 different experiments designed to determine the effectiveness of "nudges" versus monetary incentives in motivating workers to do an online task. They found that the experts were fairly accurate, that there was no difference between highly cited faculty and other faculty, and that PhD students did the best.
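
To make "forecast accuracy" concrete, here is a minimal sketch in Python, using entirely invented numbers and hypothetical group labels (not the actual study data), of how forecasters might be scored against the effects eventually observed, for instance by mean absolute error per group:

```python
from statistics import mean

# Hypothetical data: observed effect sizes from a handful of experiments,
# and each forecaster group's predicted effects for the same experiments.
# All numbers are invented purely for illustration.
observed = [0.12, 0.05, 0.30, 0.18]

forecasts = {
    "highly cited faculty": [0.20, 0.10, 0.25, 0.15],
    "other faculty":        [0.18, 0.08, 0.22, 0.20],
    "PhD students":         [0.14, 0.06, 0.28, 0.17],
}

def mean_absolute_error(predicted, actual):
    """Average absolute gap between predictions and observed effects."""
    return mean(abs(p - a) for p, a in zip(predicted, actual))

# Lower scores indicate more accurate forecasts.
for group, preds in forecasts.items():
    print(f"{group}: MAE = {mean_absolute_error(preds, observed):.3f}")
```

A simple error metric like this is what makes it possible to compare groups of forecasters at all; the actual platform may aggregate and score predictions differently.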

Understanding where there is a general consensus can also help researchers design better research questions, to get at less-well-understood phenomena, the authors point out. Collecting a critical mass of predictions will also open up a new potential research area on whether people update their beliefs after new results are known.

Making a prediction on the platform would require a simple 5-to-15-minute survey, DellaVigna says. The forecasts would be distributed to the researcher after data are gathered, and the study results would be sent to the forecasters at the end of the study.

Berkeley Haas Prof. Don Moore, who has been a leader in advocating for more transparent, rigorous research methods and training the next generation of researchers, says the prediction platform "could bring powerful and constructive change to the way we think about research results. One of its great strengths is that it capitalizes on the wisdom of the crowd, potentially tapping the collective knowledge of a field to help establish a scientific consensus on which new research results can build."

To sign up for the prediction website, click here.

Story Source:

Materials provided by University of California - Berkeley Haas School of Business.

Journal Reference:

Stefano DellaVigna, Devin Pope, Eva Vivalt. Predict science to improve science. Science, 2019; 366 (6464): 428 DOI: 10.1126/science.aaz1704

