Artificial intelligence (AI) appears poised to transform the economy across sectors ranging from healthcare and finance to retail and education. What some have called the “Fourth Industrial Revolution” is driven by three key trends: greater availability of data, increases in computing power, and improvements in algorithm design. First, increasingly large amounts of data have fueled computers’ ability to learn, such as by training a language model on all of Wikipedia. Second, greater computational capacity (often termed “compute”) has enabled researchers to build models that were unimaginable merely 10 years ago, sometimes spanning billions of parameters (an exponential increase in scale over previous models). Third, basic innovations in algorithms are helping scientists drive AI forward, such as the reinforcement learning techniques that enabled a computer to defeat the world champion in the board game Go.
Historically, partnerships among government, universities, and industry have anchored the U.S. innovation ecosystem. The federal government has played a critical role in subsidizing basic research, enabling universities to undertake high-risk work that can take decades to commercialize. This approach catalyzed radar technology, the internet, and GPS devices. As the economists Ben Jones and Larry Summers put it, “[e]ven under very conservative assumptions, it is difficult to find an average return below $4 per $1 spent” on innovation, and the social returns might be closer to $20 for every dollar spent. Industry, in turn, scales and commercializes applications.