July 24, 2016

Here's the other reason why PC sales are in decline

Were you wondering why PC sales have been declining for 14 straight quarters, or why smartphone sales are also declining? The slow death of Moore's Law may be part of the answer.

After more than 50 years of miniaturization, the transistor could stop shrinking in just five years. That is the prediction of the 2015 International Technology Roadmap for Semiconductors, which was officially released earlier this month.
After 2021, the report forecasts, it will no longer be economically desirable for companies to continue to shrink the dimensions of transistors in microprocessors. Instead, chip manufacturers will turn to other means of boosting density, namely turning the transistor from a horizontal to a vertical geometry and building multiple layers of circuitry, one on top of another.
[...]
These roadmapping shifts may seem like trivial administrative changes. But “this is a major disruption, or earthquake, in the industry,” says analyst Dan Hutcheson, of the firm VLSI Research. U.S. semiconductor companies had reason to cooperate and identify common needs in the early 1990s, at the outset of the roadmapping effort that eventually led to the ITRS’s creation in 1998. Suppliers had a hard time identifying what the semiconductor companies needed, he says, and it made sense for chip companies to collectively set priorities to make the most of limited R&D funding.
But the difficulty and expense associated with maintaining the leading edge of Moore’s Law has since resulted in significant consolidation. By Hutcheson’s count, 19 companies were developing and manufacturing logic chips with leading-edge transistors in 2001. Today, there are just four: Intel, TSMC, Samsung, and GlobalFoundries.
For those who've been paying attention, this isn't anything new.
The impending end of Moore's Law has been reported a couple of times just this year, such as this piece back in February by Ars Technica:
Intel’s former chief architect Bob Colwell delivered the keynote address at the Hot Chips conference on Sunday, in a speech I personally wish I’d been able to attend. Colwell, who served as a senior designer and project leader at Intel from 1990 to 2000, was critical to the development of the Pentium Pro, Pentium II, P3, and P4 processors before departing the company. [...]
Today, Colwell heads up DARPA’s Microsystems Technology Office, where he works on developing new cutting-edge technologies across a variety of fields. In his talk at Hot Chips, he faced up to a blunt truth that engineers acknowledge but marketing people will dodge at every opportunity: Moore’s law is headed for a cliff. According to Colwell, the maximum extension of the law, in which transistor densities continue doubling every 18-24 months, will be hit in 2020 or 2022, around 7nm or 5nm.
or this piece from July, in Business Insider:
For half a century, Moore's Law has guided the advance of technology. Formulated in 1965 by Intel cofounder Gordon Moore, the law stipulates that the number of transistors on an integrated circuit will double every two years — effectively doubling computing power.
But, as The Financial Times reports, Intel CEO Brian Krzanich has now warned that it may be coming to an end.
For the last eight years, Intel has followed a "tick tock" cycle of product releases. On the "tick," the process technology shrinks; on the "tock," architecture is improved. But the company has now been forced to scrap this model, and has introduced a third step to its current 14nm line. "Our cadence today is closer to 2.5 years than two," Krzanich explained on the company's earnings call on Wednesday.
Is this a temporary hiccup, or a sign of a greater seismic shift? The CEO said that "we're not sure." But he said that the company will "always strive to get back to two years" — suggesting that this could well be the end to Moore's Law as we know it.
Circuit makers are quick to point out that Moore's Law doesn't actually say anything about layering circuitry on top of itself, which lets them keep increasing the number of transistors on a chip even after areal transistor density (the number of transistors per unit area) can't be pushed any further. Basically, they want to pretend that Moore's Law can continue to hold sway long after areal transistor density hits its ceiling -- a ceiling imposed by the exponentially increasing cost of dealing with the physical limitations of the chips themselves, especially when weighed against the relatively modest gains in computational power that can be realized in the process.

Increasing the number of layers (or cores) on each processor can help, but doubling the number of cores doesn't actually double performance unless you're doing something computation-heavy that can also be distributed evenly across all of the processor's available cores. You can see this for yourself while a game is running: pull up a per-core CPU monitor (Task Manager, or a performance overlay if your game or launcher provides one), and you'll see that one or two cores carry most of the load while the rest sit relatively idle.
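
If you'd rather see numbers than squint at an overlay, here's a minimal sketch of the same check in Python, using the third-party psutil library (my choice of tool, not something any of the articles above mention); run it while a game is busy and the lopsided per-core split shows up right away:

    import psutil

    # Sample per-core utilization ten times, one second apart. Each call to
    # cpu_percent(percpu=True) returns one utilization percentage per logical
    # core, so a mostly single-threaded game shows up as one or two busy cores.
    for _ in range(10):
        loads = psutil.cpu_percent(interval=1.0, percpu=True)
        print("  ".join(f"core {i}: {load:5.1f}%" for i, load in enumerate(loads)))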

For example, I have a 6-core AMD CPU, and most of those cores are idle most of the time; upgrading to an 8-core CPU would cost $200 but only net me about 20% more performance on average, which already puts it past the point of diminishing returns for the average consumer. And my PC is already over 3 years old; it was mid-range when I bought it, it's still mid-range now, and I don't expect to need (or even want) an upgrade anytime in the next few years.
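
For what it's worth, that rough 20% figure is consistent with a quick Amdahl's-law estimate, if you assume -- and this is my illustrative assumption, not a measurement of any particular game -- that about 90% of the workload can actually be spread across the cores:

    # Back-of-the-envelope Amdahl's-law check of the "6 cores -> 8 cores buys
    # maybe 20%" claim. The 0.9 parallel fraction is an illustrative guess.
    def amdahl_speedup(parallel_fraction, cores):
        """Speedup over one core when only part of the work parallelizes."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    p = 0.9
    six, eight = amdahl_speedup(p, 6), amdahl_speedup(p, 8)   # ~4.0x and ~4.7x
    print(f"6 cores: {six:.2f}x, 8 cores: {eight:.2f}x, "
          f"gain: {(eight / six - 1) * 100:.0f}%")            # roughly an 18% gain

Drop the parallel fraction even a little and the gain from those two extra cores shrinks further, which is exactly why the upgrade isn't worth $200 to me.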

This is part of why I think that console gaming, as it's existed up to now, is a slowly dying thing. PC gaming used to be expensive, with brand-new $1000 gaming PCs being behind the technology curve in a year, and basically junk only a few years after that. PC gamers had lots of power, but the cost of that power was a never-ending cycle of expensive hardware upgrading and replacement. Now, though, that's just not necessary; most games are being made to run on all platforms, including current-gen consoles, which are basically just budget gaming PCs, and raw processing power just doesn't matter as much unless you're streaming, playing competitively, or otherwise making your living playing video games.

A $500 gaming PC today is good enough to run most of the games you want to play, and a $1000 one will last you for years. This is the new normal: PCs whose performance stays good enough for years, and which need replacement a lot less often than they used to. Console makers, all of whom are basically making media-centre PCs anyway, are now adopting this same PC-like upgrade cycle, making incremental improvements to existing platforms rather than designing new consoles from scratch.

Smartphones are seeing this same plateau effect: people are slower to replace their phones because the new ones just aren't that much better than the ones they already have. These are all mature markets; continued explosive growth really isn't something anyone should be planning on, at least until the nascent field of quantum computing reaches the point of producing consumer-ready devices that totally change what a personal computer is capable of doing. That time is still years off, though, and maybe decades; in the meantime, get ready for the breakneck pace of technological change to slow slightly, giving everyone a chance to catch their breath and catch up with the state of the computing art.