I've been saying for years that Moore's Law will end in the next two decades. The typical response I get to this assertion is, “Oh, they'll just move on to the next thing...quantum computing, multi-core processors, optical, whatever.” This is basically an assertion of faith against an argument of physics - a typically human response. Which is why I'm pretty sure the computing industry is going to get totally blindsided when suddenly things stop getting exponentially smaller/faster/cheaper. What's Microsoft's business model when you have to buy a computer that's five times as expensive as the one you own now in order to support the features they're working on?
The key, of course, is to figure out a way to profit by announcing something that solves the problems while everyone else reels around in the smoke. My current best guess is “compiler technology,” but that's another post.
It seems that Intel may already be experiencing an early little speed bump on the way to the giant brick wall.
The warning came first from a group of hobbyists that tests the speeds of computer chips. This year, the group discovered that Intel's newest microprocessor was running slower and hotter than its predecessor. What they had stumbled upon was a major threat to Intel's longstanding approach to dominating the semiconductor industry - relentlessly raising the clock speed of its chips. Then two weeks ago, Intel, the world's largest chip maker, publicly acknowledged that it had hit a "thermal wall" on its microprocessor line.