Before you scale up an idea, consider this history.
While Peltzman’s paper was controversial at the time—unsurprisingly, it was politicized by pro- and anti-regulation advocates—much research in the intervening years has borne out similar conclusions in other domains. It turns out people have a tendency to engage in riskier behaviors when measures are imposed to keep them safer. Give a biker a safety helmet and he rides more recklessly—and, even worse, cars around him drive more haphazardly. And a 2009 study directly following the line of research pioneered by Peltzman found that NASCAR drivers who used a new head and neck restraint system experienced fewer serious injuries but saw a rise in accidents and car damage. In short, safety measures have the potential to undermine their own purpose.
This phenomenon, which came to be known as the Peltzman effect, is often used as a lens for studying risk compensation, the theory that we make different choices depending on how secure we feel in any given situation (i.e., we take more risk when we feel protected and less when we perceive that we are vulnerable). This is why, in the wake of the 9/11 attacks and the rise in fear of terrorists gaining access to nuclear weapons, Stanford political scientist Scott Sagan argued that increasing the security forces guarding nuclear facilities might actually make them less secure. The Peltzman effect also reaches into insurance markets, where people who have coverage engage in riskier behavior than those without it, a phenomenon known as moral hazard. Clearly, this pattern of human behavior has potentially huge implications when taken to scale. Nothing makes spillovers more likely and visible than scaling an endeavor to a wide swath of people.