Articles

Breaking Moore's Law? Does Not Compute

Bloomberg View, April 22, 2015

Can a relatively minor change in a single company’s pricing strategies distort our picture of what’s happening in the economy?

Yes, if the company is Intel Corp.

Semiconductors are what economists call a “general purpose technology,” like electric motors. Their effects spread through the economy, reorganizing industries and boosting productivity. The better and cheaper chips become, the greater the gains rippling through every enterprise that uses computers, from the special-effects houses producing Hollywood magic to the corner dry cleaners keeping track of your clothes.

Moore’s Law, which marked its 50th anniversary on Sunday, posits that computing power increases exponentially, with the number of components on a chip doubling every 18 months to two years. It’s not a law of nature, of course, but a kind of self-fulfilling prophecy, driving innovative efforts and customer expectations. Each generation of chips is far more powerful than the previous, but not more expensive. So the price of actual computing power keeps plummeting.

At least that’s how it seemed to be working until about 2008. According to the producer price index compiled by the Bureau of Labor Statistics, the price of the semiconductors used in personal computers fell 48 percent a year from 2000 to 2004, 29 percent a year from 2004 to 2008, and a measly 8 percent a year from 2008 to 2013.

The sudden slowdown presents a puzzle. It suggests that the semiconductor business isn’t as innovative as it used to be. Yet engineering measures of the chips’ technical capabilities have shown no letup in the rate of improvement. Neither have tests of how the semiconductors perform on various computing tasks.

If the chips are still getting more and more powerful, in keeping with Moore’s Law, why isn’t the price of computing power still falling fast? “That didn’t make sense, given that it seemed that the sector was still technologically dynamic,” said economist Stephen D. Oliner of the American Enterprise Institute in an interview.

In a new National Bureau of Economic Research paper, Oliner, David M. Byrne of the Federal Reserve Board and Daniel E. Sichel of Wellesley College look carefully at the price data and argue that the official statistics are misleading.

To track prices, the Bureau of Labor Statistics uses what’s known as a “matched model” technique, looking at the listed prices of the same product over time. Every new good enters the price index as a separate item, so any quality improvements it brings for the same price don’t show up as lower prices, as they ideally should.

That didn’t use to matter, because whenever Intel introduced a new generation of chips, the company cut the list prices of the old chips. It’s these official list prices that the BLS uses to calculate trends.

“When a new, higher-quality model is introduced, the price of the old model falls so that price per unit of quality is at least roughly equalized across models,” the economists explain in their paper. “In this case, the gap between model prices in periods when two models are in the market would represent the value of improved quality, and a matched-model index would correctly adjust for this quality change.”

But after 2006, when Intel gained dominance over rival Advanced Micro Devices Inc., it stopped cutting list prices of old chips as it introduced new ones. That doesn’t mean that the prices it actually charges haven’t gone down. Most likely, Intel is simply negotiating discounts on a case-by-case basis while charging full prices to customers willing to pay them -- price discriminating, in other words.

But, said Oliner, “We don’t see what the transaction prices really are, and neither does BLS, and we don’t know how many units are actually being sold of these high-priced old chips at the list price -- if anybody’s actually even paying that price.” As a result, the BLS’s matched-model approach no longer captures the decline in the price of computing power as new chips are introduced.
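
A stylized sketch can make the mechanics concrete. The code below uses made-up chip names, prices, and performance figures (not data from the paper) to chain a simple matched-model index through a product launch under each of the two pricing regimes:

```python
# Stylized illustration (hypothetical chip names, prices, and performance):
# a matched-model index chains price changes only for models that appear
# in consecutive periods, so it responds very differently to the two
# Intel pricing regimes described above.

def matched_model_index(prices_by_period):
    """Chain an index from period-to-period price relatives of the models
    present in both adjacent periods (a simple matched-model index)."""
    index = [100.0]
    for prev, curr in zip(prices_by_period, prices_by_period[1:]):
        common = set(prev) & set(curr)          # models priced in both periods
        relative = 1.0
        for model in common:                    # geometric mean of price relatives
            relative *= curr[model] / prev[model]
        relative **= 1.0 / len(common)
        index.append(index[-1] * relative)
    return index

# Pre-2006 regime: chip B (twice the performance, same $300 list price) launches,
# and the old chip A's list price is cut in half, roughly equalizing price per
# unit of quality across the two models.
old_regime = [
    {"chip_A": 300},
    {"chip_A": 150, "chip_B": 300},
]

# Post-2006 regime: the same launch, but chip A's list price is left unchanged.
new_regime = [
    {"chip_A": 300},
    {"chip_A": 300, "chip_B": 300},
]

print(matched_model_index(old_regime))  # [100.0, 50.0]  -> the decline is captured
print(matched_model_index(new_regime))  # [100.0, 100.0] -> no measured decline,
                                        #    even though performance per dollar doubled
```

In the first case the cut to the old chip’s list price carries the quality improvement into the index; in the second, nothing the index observes ever moves, so the gain disappears from the statistics.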

In their paper, the economists lay out a two-pronged approach to improving the price measurement. First, they adjust the prices of chips to reflect improved performance. In effect, they measure not the price of “a chip” but the price of what that chip can do, which is what buyers actually care about.
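
The intuition can be shown in miniature. The sketch below uses invented list prices and a single benchmark score as a stand-in for performance; the paper’s hedonic regressions control for many chip characteristics at once, so this is only the bare-bones version of the idea:

```python
# Track the price of what a chip can do (price per unit of benchmark
# performance) rather than the list price of "a chip." All figures are
# hypothetical, chosen only to illustrate the adjustment.

chips = [
    # (year, list_price_usd, benchmark_score)
    (2008, 300, 1_000),
    (2013, 300, 5_000),   # same list price, five times the measured performance
]

(y0, p0, s0), (y1, p1, s1) = chips
ratio = (p1 / s1) / (p0 / s0)                  # change in price per unit of performance
annual_decline = 1 - ratio ** (1 / (y1 - y0))  # annualized over the five years

# A matched-model index sees a flat $300 list price and records no change;
# the quality-adjusted price in this example fell about 28 percent a year.
print(f"{annual_decline:.0%} per year")        # -> 28% per year
```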

From 2000 to 2008, their results using this “hedonic” technique closely track the official PPI. But after 2008, they diverge, with the hedonic measurement showing sharp price drops while the PPI’s numbers flatten. Instead of that lousy 8 percent annual price decline from 2008 to 2013, this approach shows a 22 percent annual drop -- somewhat slower progress than earlier but nonetheless substantial.

Second, the economists look at what happens when only introductory prices, adjusted for performance, are compared across time. That approach shows a much sharper decline: 43 percent annually from 2008 to 2013.
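
Those annual rates compound into very different cumulative declines over the five years. A quick calculation, using only the figures quoted above:

```python
# Cumulative 2008-2013 price declines implied by the annual rates quoted above
# (simple compounding over five years, nothing more).
for label, annual_rate in [("official PPI",        0.08),
                           ("hedonic index",       0.22),
                           ("introductory prices", 0.43)]:
    cumulative = 1 - (1 - annual_rate) ** 5
    print(f"{label}: {cumulative:.0%} total decline")

# official PPI: 34% total decline
# hedonic index: 71% total decline
# introductory prices: 94% total decline
```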

This suggests, Sichel said in an interview, “that technological progress is much more rapid than you’d infer from the official statistics and that maybe there isn't such a puzzle.” Chips are still getting better, faster, and cheaper. Intel just changed the way it advertises their prices, and that change distorted the official numbers.

The next question -- and the one that matters for headline numbers like gross domestic product and the consumer price index -- is what’s happening to the prices of the computers that use the chips. Here, too, changing business practices and the actions of a single company seem to have affected official statistics.

About 90 percent of new PC sales are now imports, compared with less than 40 percent in 2008, before Dell Inc., which accounted for most domestic production, began closing its U.S. plants. Price data on imports are less accurate; many are transfer prices recorded on the books of a single firm and subject to tax considerations and other non-market factors. More importantly, it’s much harder for import indexes to account for quality changes. Unlike domestic data collected from manufacturers, import data often don’t include good information about product characteristics.

In a recent Federal Reserve Note, Byrne and Eugenio Pinto argue that the dramatic shift toward overseas production has created a false impression that computer prices have flattened rather than continuing their steep decline. As a result, they suggest, government statistics have understated the quantity of computing power businesses are investing in.

That research is still preliminary, however. Byrne, Oliner, and Sichel have recently bought data on actual retail prices of computers, including desktops, laptops, and tablets, going back to 2007. They have enough information to make good quality adjustments for most of the market and hope to have some results this summer.

When it comes to measuring quality gains, semiconductors and personal computers are relatively simple cases. They’re physical objects with quantifiable performance characteristics. They’re well understood, have been around for more than 30 years, and are purchased directly by their users. If the BLS is badly underestimating technological progress for these relatively simple examples, how likely is it that government statistics accurately depict the prices of Internet services, genetically targeted cancer treatments, improved hotel room décor, or softer, more washable fabrics?

When our economic statistics invisibly stop tracking quality improvements, we’re likely to be fooled about important trends, overestimating inflation and underestimating real growth in GDP and wages. That can not only distort macroeconomic policy but also lead to illusions of stagnation.