Twenty years ago this month, Virginia Postrel released her seminal techno-optimist classic, The Future and Its Enemies, in which she articulated two competing visions of the future: dynamism and stasis. The dynamist perspective, she argued, is characterized by a willingness to embrace the reality of an ever-changing world, welcoming the infinite possibilities offered by an ever-unfolding future. In contrast, stasists prefer to safeguard the known benefits of the present, uneasy in the rocky waters of change. They find solace in a world regulated and engineered for maximal stability, willing to forsake the mere possibility of a better future for the contentment of the status quo. According to Postrel:
Stasists and dynamists are thus divided not just by simple, short-term policy issues but by fundamental disagreements about the way the world works. They clash over the nature of progress and over its desirability: Does it require a plan to reach a specified goal? Or is it an unbounded process of exploration and discovery? Does the quest for improvement express destructive, nihilistic discontent, or the highest human qualities? Does progress depend on puritanical repression or playful spirit?
These questions — and the answers offered by competing ideological visions — are just as relevant today as they were when Postrel first posed them two decades ago. In fact, they may be more important than ever before as societies around the world confront one of the most significant and rapid periods of technological advancement in history. As profound as this progress has been, however, government policy remains a considerable barrier to greater advances in life-changing technologies. Why is that?
In a 2015 interview with Tyler Cowen, Peter Thiel was asked about technology’s role in failing to alleviate the Great Stagnation — the slowdown in economic growth in recent years due, in part, to the slowing pace of innovation in the non-digital economy. He pointed out that over the past few decades we had only seen “stagnation in the world of atoms, not bits” because “we lived in a world in which bits were unregulated and atoms were regulated.”
Thiel has a point. The rapid proliferation of the Internet and other digital communications technologies was in large part a result of their unregulated nature. The Niskanen Center has written at length about the underlying policy decisions made during the 1990s — most notably the 1997 Framework for Global Electronic Commerce — that provided the foundation for these innovations to achieve widespread deployment and adoption at a speed unrivaled in the history of human civilization. Unfortunately, those same policy dispositions have yet to be extended to the industries that operate in the “world of atoms.”
To remedy this problem, today the Niskanen Center is releasing its Policymaker’s Guide to Emerging Technologies. This white paper offers a dynamist-friendly governance framework and specific policy recommendations detailing how policymakers can end the regulatory disparities between these two worlds — not by putting the brakes on digital innovation, but by accelerating technological progress and adoption in analog industries ripe for disruption.
Part I outlines the foundational tenets of “soft law” and its impact on the governance of emerging technologies. This section frames the underlying rationale for many of the recommendations offered in the remainder of the Policymaker’s Guide. Marc Andreessen once wrote that software was eating the world; now, soft law is eating the world of technological governance. On net, we argue that’s probably a good thing.
In Part II, the discussion concentrates on issues that have their roots in the pre-digital era, but which purportedly present new challenges in the wake of an increasingly interconnected world. In particular, this section looks at the role that antitrust, privacy, and copyright play in current debates surrounding the digital economy. We argue that many of the (often hypothetical) concerns offered as justifications for aggressively expanding the reach, scope, and applicability of existing rules and approaches — from ditching the consumer welfare standard in competition analysis to cries for a set of standardized federal rules governing privacy — are unfounded and potentially destructive to the engine of economic growth and social betterment.
Part III then narrows the focus further, diving deeper into the issues associated with seven emerging technologies: genetic modification, the Internet of Things, autonomous vehicles, commercial drones, supersonic flight, commercial space, and climate engineering. It then offers specific recommendations to address some of the common concerns associated with their development and adoption. Those recommendations generally revolve around the need to embrace rules that are flexible, adaptive, and narrowly tailored to address actual — not merely hypothetical — harms to individuals.
Finally, Part IV examines the unique characteristics of an emerging technology with wide-ranging implications for numerous industries, both within and beyond the technology sector: artificial intelligence (AI). As a “nexus technology” — one whose development and improvement will have an outsize impact on the development of other related technologies — AI deserves expanded consideration, with a specific focus on those areas most likely to have near-term, high-impact effects. To that end, we focus on recommendations for the use and application of AI in online digital advertising and medical device technologies, and conclude with a specific governance framework — “algorithmic accountability” — that can help address observable harms resulting from the misapplication or misuse of AI.
As we note in the conclusion:
This analysis can serve as a signpost along the path to a brighter future. Like any sign, however, it can only guide policymakers towards the better path; it cannot ease the burdens of the journey. Policymakers should be ever skeptical of proposals that would purport to provide easy solutions to these complex problems. It is notoriously difficult to chart a clear and easy course to a better future without expecting to experience some difficulties, and not every decision that seems wise at any given moment is necessarily paving the way to something better — that’s why rules that allow for maximum flexibility and adaptability are far more ideal than those that propose a One True Golden Path.
Read the full white paper here.
* This article was originally published on the Niskanen website, and is republished here with permission.