Tuesday, May 18, 2004

Moore of the Beginning of the End


I've been saying for years that Moore's Law will end in the next two decades. The typical response I get to this assertion is, “Oh, they'll just move on to the next thing...quantum computing, multi-core processors, optical, whatever.” This is basically an assertion of faith against an argument of physics - a typically human response. Which is why I'm pretty sure the computing industry is going to get totally blindsided when things suddenly stop getting exponentially smaller/faster/cheaper. What's Microsoft's business model when you have to buy a computer that's five times as expensive as the one you own now in order to support the features they're working on?


The key, of course, is to figure out a way to profit by announcing something that solves the problems while everyone else reels around in the smoke. My current best guess is “compiler technology,” but that's another post.


It seems that Intel may already be experiencing an early little speed bump on the way to the giant brick wall.



The warning came first from a group of hobbyists that tests the speeds of computer chips. This year, the group discovered that Intel's newest microprocessor was running slower and hotter than its predecessor. What they had stumbled upon was a major threat to Intel's longstanding approach to dominating the semiconductor industry - relentlessly raising the clock speed of its chips. Then two weeks ago, Intel, the world's largest chip maker, publicly acknowledged that it had hit a "thermal wall" on its microprocessor line.
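
For what it's worth, the back-of-the-envelope arithmetic behind the "couple of decades" claim looks something like this. It's a rough sketch only, assuming a 90 nm process in 2004, a 0.7x linear shrink every couple of years, and an atomic-scale floor of roughly half a nanometer - round numbers for illustration, not anyone's actual roadmap:

    // Back-of-the-envelope: how long can feature sizes keep shrinking?
    // Illustrative assumptions only: 90 nm in 2004, a 0.7x shrink every
    // two years, and ~0.5 nm (roughly the silicon lattice spacing) as a
    // hard floor where "smaller" stops meaning anything.
    public class MooreLimit {
        public static void main(String[] args) {
            double featureNm = 90.0;  // 2004-era process node
            int year = 2004;
            while (featureNm > 0.5) { // stop at the atomic-scale floor
                featureNm *= 0.7;     // one process generation
                year += 2;
            }
            System.out.println("Atomic-scale features reached around " + year);
            // Prints 2034 - a few decades at the absolute most, and practical
            // limits (leakage, lithography, heat) bite well before the
            // single-atom floor, as the thermal wall story suggests.
        }
    }

Even granting perfect scaling right down to atomic dimensions, the exponential runs out on a human timescale.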


4 comments:

  1. A few things could happen while everyone is fumbling in the dark, and I think it would resemble the past of computing, when limitations were standard and exponential growth was a pipe dream: the programmers who knew their architectures and knew how to squeeze every ounce out of a piece of hardware would matter again, versus just anyone writing a major app. Granted, that landscape has changed quite a bit, but I recall even a professor of mine in the early days of Java saying "It's slow now, but in a few years' time, the processor and memory increases will make VM overhead obsolete."

    So, to agree with you, Craig: compilers will need to get smarter to help people who can't write x86 assembly cut those corners on the architecture; you'll see more diversity in cooling solutions; or maybe we'll see something akin to PCI-Express letting us use our video cards as math coprocessors.

    It's going to be an interesting next couple of years.

  2. I agree that Moore's Law can't continue forever, but I think it will still continue for quite some time. Just look at two things that Intel already has at its disposal. First, there's the low-power Centrino core, re-architected to deliver high performance on significantly less power, which gives them more headroom on the heat problem. As they've canceled their desktop lines and are using the Centrino/Dothan core for future processors, it's apparent that Intel is going this direction to address the thermal wall.

    There's also the approach of dual cores. Another easy win is to put 2 or 4 or 8 processors inside a single chip. Operating systems can leverage this immediately, as can multithreaded software, so some of the complexity will get pushed to developers to write multi-threaded apps where something single-threaded might have worked fine in the past.

    Also, keep in mind that Intel is aggressively pursuing optical computing. Now, you're not going to have a Pentium that's powered by a flashlight any time soon, but you're already seeing optical circuitry being blended with electronics in ultra-high-speed networking, so this technology may mature in time to keep Moore's Law going. We may successfully change lanes before we hit the electrons-in-silicon wall.

  3. There's no question that the exponential nature of computing-power growth will continue for some time. I'm just saying that beyond that, we have no idea. And even if the multiple-core approach works, it only works for a while - once you have more cores than you have threads, you just idle some of them (there's a quick sketch of this below the comments).

    On the optical front, how does that help avoid the fundamental limits imposed by physical laws? Let's say that it gets you another 10x in scaling/speed before quantum effects stop you in your tracks. That's only, what? Five years? Different wall, but still brick.

    My observation is that optimism of the "we've still got a lot of room left" variety is correct, but that we shouldn't let it blind us to the fact that it's still a good idea to plan past the end of Moore's law...unless you think computing power can become infinite, in which case you should be asking a whole *different* set of questions. :)

  4. Wow Craig, your arguments sound a bit like this quote:

    "Everything that can be invented has been invented."
    -Charles H. Duell, Commissioner, U.S. Office of Patents, 1899.

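
To put the cores-versus-threads point from my reply above into numbers, here's a minimal sketch. It assumes a hypothetical app whose useful parallelism is capped at four runnable threads, and it ignores memory bandwidth and the serial fraction that Amdahl's Law would charge you - the class name and thread count are made up for the example:

    // Illustrative only: with a fixed number of runnable threads, the win
    // from extra cores flattens out once cores > threads.
    public class CoreScaling {
        static int usefulParallelism(int cores, int runnableThreads) {
            // Each core runs at most one thread at a time, so useful
            // parallelism is capped by whichever is smaller.
            return Math.min(cores, runnableThreads);
        }

        public static void main(String[] args) {
            int runnableThreads = 4; // hypothetical app with 4 parallel threads
            for (int cores = 1; cores <= 16; cores *= 2) {
                System.out.println(cores + " cores -> roughly "
                        + usefulParallelism(cores, runnableThreads) + "x");
            }
            // 1 -> 1x, 2 -> 2x, 4 -> 4x, 8 -> 4x, 16 -> 4x: past four cores,
            // the extras just idle unless the software finds more threads.
        }
    }

That's the sense in which multi-core buys time rather than repealing the limit: the burden shifts to software to keep finding more parallelism.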