“List is far too thoughtful to write something gimmicky or simple. . . . An entertaining and clear writer. His book is chock-full of compelling stories of businesses that failed and others that went big.”–The Wall Street Journal
“Skillfully done . . . Careful, comprehensive, and fun, The Voltage Effect excels in turning a seemingly boring niche topic into a fascinating book that’s relevant to all, from CEOs and policymakers to naturally curious people with a taste for learning how economics shapes our lives in the real world.”–ZME Science
“If you’ve ever wondered why so many promising solutions fail to achieve their desired impact, look no further. . . . A master class in how the quirks of human irrationality can make or break our ideas in the real world.”–Steven D. Levitt, professor of economics, University of Chicago, and co-author of Freakonomics
“Brilliant, practical, and grounded in the very latest research, this is by far the best book I’ve ever read on the how and why of scaling. If you care about changing the world, or just want to make better decisions in your own life, The Voltage Effect is for you.”–Angela Duckworth, CEO of Character Lab and New York Times bestselling author of Grit
“How many books are funny and wise, practical and profound? John List is a scientist, but he’s also a magician, and he’s changing the world. The Voltage Effect shows how. This is one of the best economics books I have ever read, and an instant classic in behavioral economics.”–Cass R. Sunstein, Robert Walmsley University Professor, Harvard University, and New York Times bestselling co-author of Nudge
“The Voltage Effect is the tool kit for the ambitious. Packed with proven principles and pro tips made real through behind-the-scenes stories in settings ranging from Silicon Valley to African NGOs, it fills the gap between startup books and management books to show how any idea can achieve its full potential.”–Scott Cook, co-founder of Intuit
“A must-read . . . Ideas from the ivory tower or Davos fail often, and fail badly, because they do not recognize the deeply political and historical nature of the problems they are meant to deal with or the social realities in which these problems are embedded. This thought-provoking and engaging book proposes an original framework for thinking about how good policy proposals can be applied at a scale large enough to do social good, and for avoiding predictable mistakes that prevent such scaling.”–Daron Acemoglu, professor at MIT and co-author of Why Nations Fail and The Narrow Corridor
“John List’s work in field experiments is revolutionary.”–Gary Becker, professor of economics and sociology, University of Chicago, Nobel Prize for Economics
Part One
CAN YOUR IDEA SCALE?
1
Dupers and False Positives
On September 14, 1986, First Lady Nancy Reagan appeared on national television to address the nation from the West Sitting Hall of the White House. She sat on a sofa next to her husband, President Ronald Reagan, and gazed into the camera. “Today there’s a drug and alcohol abuse epidemic in this country and no one is safe from it,” she said. “Not you, not me, and certainly not our children.”
This broadcast was the culmination of all the traveling the First Lady had done over the preceding five years to raise awareness among American youth about the dangers of drug use. She had become the public face of the preventative side of President Reagan’s War on Drugs, and her message hinged on a catchphrase that millions of people still remember, which she employed once again that evening on television. “Not long ago, in Oakland, California,” Nancy Reagan told viewers, “I was asked by a group of children what to do if they were offered drugs. And I answered, ‘Just say no.’”
Although there are different accounts of where this infamous slogan originated (with an academic study, an advertising agency, or the First Lady herself), its “stickiness,” to use the parlance of marketing, was undeniable. The phrase appeared on billboards, in pop songs, and on television shows; school clubs took it as a name. And in the popular imagination it became inseparable from what government and law enforcement officials saw as the crown jewel of the Reagan-era drug prevention campaign: Drug Abuse Resistance Education, or D.A.R.E.
In 1983, Los Angeles chief of police Daryl Gates announced a shift in his department¡¯s approach to the War on Drugs: instead of busting kids in possession of illegal substances, the new focus would be on preventing those drugs from getting into their hands in the first place. This was how D.A.R.E., with its iconic logo of red letters set against a black background, was born.
D.A.R.E. was an educational program built on a theory from psychology called social inoculation, which took from epidemiology the concept of vaccination (administering a small dose of an infectious agent to induce immunity) and applied it to human behavior. The approach of the program was to bring uniformed officers into schools, where they would use role-playing and other educational techniques to inoculate kids against the temptations of drugs. It certainly sounded like a great idea, and the early research on D.A.R.E. was encouraging. As a result, the government opened its taxpayer-funded faucet, and soon the program was scaled up in middle schools and high schools across the country. Over the next twenty-four years, 43 million children from over forty countries would graduate from D.A.R.E.
There was only one problem: D.A.R.E. didn’t actually work.
In the decades since Nancy Reagan urged the nation’s youth to “just say no” to drugs, numerous studies have demonstrated that D.A.R.E. did not in fact persuade kids to just say no. It provided children with a great deal of information about drugs such as marijuana and alcohol, but it failed to produce statistically significant reductions in drug use when these same kids were presented with opportunities to use them. One study even found that the program spurred participants’ curiosity about drugs and increased the likelihood of experimentation.
It is hard to overstate the cost of D.A.R.E.’s voltage drop at scale. For years, the program consumed the time and effort of thousands of teachers and law enforcement officers who were deeply invested in the well-being of our greatest natural resource: future generations. Yet all of this hard work and time, never mind taxpayer dollars, was wasted on scaling D.A.R.E. because of a fundamentally erroneous premise. Worse, it diverted support and resources away from other initiatives that might have yielded real results. Why D.A.R.E. became the disaster that it did is a textbook example of the first pitfall everyone hoping to scale an idea or enterprise must avoid: a false positive.
The Truth About False Positives
A first truth about false positives is that they are, in effect, “lies,” or “false alarms.” At the most basic level, a false positive occurs when you interpret some piece of evidence or data as proof that something is true when in fact it isn’t. For example, when I visited a high-tech plant in China that produced headsets, if a properly working headset got marked as defective due to human error, that was a false positive. When I was called for jury duty, a false positive would have occurred had we determined that an innocent suspect was guilty. False positives also show up in medicine, a phenomenon that gained attention during the pandemic, when some test results for the virus turned out to be unreliable, showing people had contracted the virus when in reality they had not. Unfortunately, false positives are ubiquitous across contexts; consider a 2005 study that found that between 94 and 99 percent of burglar-alarm calls turn out to be false alarms, and that false alarms make up between 10 and 20 percent of all calls to police.
In the case of D.A.R.E., the National Institute of Justice’s 1985 assessment involving 1,777 children in Honolulu, Hawaii, found evidence “favoring the program’s preventative potential,” and a subsequent study conducted soon after in Los Angeles among nearly as many students also concluded that D.A.R.E. led to a reduction in drug experimentation. These purportedly strong results drove schools, police departments, and the federal government to just say yes to expanding D.A.R.E. nationwide. Yet numerous scientific analyses over the following decade examining all of the known studies and data on the program yielded incontrovertible proof that D.A.R.E. didn’t actually have a meaningful impact. So what happened?
The simple answer is this: it is not uncommon for data to “lie.” In the Honolulu study, for example, the researchers had calculated that there was a 2 percent chance their data would yield a false positive. Unfortunately, subsequent research shows that either they underestimated that probability or they simply fell within that 2 percent. There was never any voltage in D.A.R.E.
How can something like this happen in the hallowed halls of science? First, I should clarify that when I say the data are “lying,” what I’m actually referring to is “statistical error.” For example, when you draw a sample of children from a certain population (i.e., children living in a single city in Hawaii), random differences among them might produce an “outlier group” that leads you to make a false conclusion. Had the researchers gone back to the original population of children in Honolulu and tested D.A.R.E. again with a new group of students, they would have likely found the program didn’t work. (A related kind of inference problem is when the results from one group don’t generalize to another; we take up this issue in Chapter 2.) Unfortunately, statistical failures of this sort happen all the time.
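For readers who like to see the mechanics, the sampling logic above can be made concrete with a small simulation. This sketch is not from the book: the 20 percent baseline rate of drug experimentation, the group size of 1,700 (roughly the Honolulu sample), and the 2 percent significance threshold are all illustrative assumptions. The point it demonstrates is simply that when a program truly has zero effect, random differences between two groups drawn from the same population will still produce a "statistically significant" result in roughly 2 percent of experiments, purely by chance.

```python
import math
import random

random.seed(42)  # fixed seed so the simulation is reproducible

def false_positive_rate(n=1700, trials=2000, alpha=0.02):
    """Simulate experiments in which the program has NO real effect.

    Each trial draws a 'treatment' and a 'control' group from the same
    population (both with a 20% true rate of drug experimentation), so
    any measured difference is pure sampling noise. We count how often
    noise alone clears a 2 percent significance bar.
    """
    p_true = 0.20  # assumed true rate, identical in both groups
    hits = 0
    for _ in range(trials):
        treated = sum(random.random() < p_true for _ in range(n))
        control = sum(random.random() < p_true for _ in range(n))
        p1, p2 = treated / n, control / n
        # Two-proportion z-test under the null of no difference.
        pooled = (treated + control) / (2 * n)
        se = math.sqrt(2 * pooled * (1 - pooled) / n)
        z = abs(p1 - p2) / se
        # Two-sided p-value via the normal CDF (expressed with erf).
        p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
        if p_value < alpha:
            hits += 1  # a "positive" result produced by noise alone
    return hits / trials

rate = false_positive_rate()
print(f"Share of null experiments flagged significant: {rate:.3f}")
```

Run enough null experiments and the share flagged significant settles near the chosen alpha of 0.02: the "2 percent chance" the Honolulu researchers calculated is exactly this kind of built-in error rate, and someone has to land inside it.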
As we saw with D.A.R.E., false positives can be very costly because they lead to misinformed decisions with downstream consequences: time and money that would have been better invested elsewhere. This is especially true when the “lie” or error is missed early on, causing enterprises that were never actually successful to begin with to suffer an inevitable voltage drop at scale. In other words, eventually the truth will come out, as it did for D.A.R.E. when its critics produced overwhelming empirical evidence that the program didn’t work. I have witnessed this firsthand in my own work in the business world.