Gauging the Information Revolution
Economists have paid little attention to the revolution in computing that has taken place since World War II. The few studies that exist suggest a revolution far greater in magnitude than the industrial revolution, but such claims rest on estimates implying that the cost-effectiveness of computers has improved much faster than it did in earlier revolutions in energy production or transportation. This Note argues that these claims are unwarranted because they unwittingly compare the incommensurate. The author suggests that the right way to assess long-term trends in computer performance is to count "switches": the number of active on-off devices contained in the main memory or the central processing unit. Building on an extensive review of computer history and an intensive study of the economics of chip production, he makes the case that technological progress in computers reduces simply to a decline in the price of a switch. At the same time, this measure of progress is an engineer's measure, alien to the consumer: obtaining more switches at a lower price is highly relevant to the designer of a new computer, but an increment in switches does not translate linearly into an increment in consumer satisfaction. Understanding this distinction is key to a proper assessment of the information-processing revolution.
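The switch-counting measure described above amounts to a simple ratio. The sketch below shows how dollars per switch would be computed; every figure in it is invented purely for illustration and does not come from the Note or from any historical machine.

```python
# Illustration of the "price of a switch" metric: cost-effectiveness as
# dollars per active on-off device. All numbers below are hypothetical.

def price_per_switch(system_price_dollars, switch_count):
    """Return the price of one switch: total system price divided by
    the number of active on-off devices in memory and the CPU."""
    return system_price_dollars / switch_count

# Hypothetical (price, switch count) pairs for machines a decade apart,
# chosen only to show the shape of the calculation.
machines = {
    1955: (1_000_000, 10_000),     # invented vacuum-tube-era figures
    1965: (500_000, 100_000),      # invented transistor-era figures
    1975: (200_000, 10_000_000),   # invented integrated-circuit-era figures
}

for year in sorted(machines):
    price, switches = machines[year]
    print(f"{year}: ${price_per_switch(price, switches):.2f} per switch")
```

On these made-up inputs the ratio falls by orders of magnitude per decade, which is the kind of decline the measure is designed to capture; the point of the metric is that it tracks the engineering cost of switches, not consumer satisfaction.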