Update: Diminishing Returns

Tuesday Mar 6th 2001 by Tyler Sperry

Does Moore's Law apply to processors and processing power? Does the term "Intel developer" make sense? Find these answers and more in today's Developer.Com Update.

Diminishing Returns

Contrary to popular belief, change is not a constant. What happens is that we humans become habituated to a certain rate of change, come to think of it as a constant, and then get thrown off stride when the next shift occurs. One doesn't need to invoke differential calculus to appreciate the folly of becoming complacent while traveling at high speed.

Perhaps the most painful examples of habituation are to be found in the stock market. Having become used to an atypically strong market for the better part of a decade, politicians and investors watched the slide of the Nasdaq over the last three quarters of 2000 with what can only be described as denial. As if to ward off the financial demons, they kept repeating a venerable speculator's prayer, "It's only a correction."

Here in Developerville, we have our own challenges of habituation. Perhaps the most popular is the expectation that silicon will continue to increase in speed and we software developers will surf the wave of progress. On occasions when we're feeling particularly good, we'll refer to Moore's Law to buttress this optimism.

The problem with Moore's Law is that people often assume it has implications it doesn't have. Moore's actual observation was that the number of transistors on an integrated circuit doubles on a remarkably regular schedule, as feature sizes continue to shrink. As the Intel article puts it, "In 26 years the number of transistors on a chip has increased more than 3,200 times, from 2,300 on the 4004 in 1971 to 7.5 million on the Pentium II processor." People (including those writing Intel's articles) often assume that doubling the number of transistors will double the "computing power" of a CPU. Sorry, but it's not that easy.
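As a quick sanity check on the figures in the Intel quote -- a roughly 3,200-fold increase over 26 years -- a few lines of arithmetic recover the doubling period usually attached to Moore's Law (the numbers come straight from the quote; the script itself is just illustrative):

```python
import math

# Figures from the Intel quote: 2,300 transistors on the 4004 (1971)
# versus 7.5 million on the Pentium II, 26 years later.
start, end, years = 2_300, 7_500_000, 26

doublings = math.log2(end / start)   # how many times the count doubled
period = years / doublings           # years per doubling

print(f"{doublings:.1f} doublings, one roughly every {period:.1f} years")
```

That works out to a doubling every couple of years -- close to the 18-to-24-month rule of thumb usually quoted.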

So what's the problem? Once you achieve a certain level of complexity, adding more transistors doesn't help much. We saw that a few years ago, when CPU designers ran into the performance wall with superscalar processors. The idea was that programs could be broken up into pieces that could be executed in parallel, and CPUs with multiple logic and math units could exploit that fact. Unfortunately, there's only so much work that can be done in parallel in your typical program. Beyond three execution units, adding transistors is an exercise in rapidly diminishing returns.
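The column doesn't name it, but the argument above is essentially Amdahl's Law: if only a fraction of a program's work can proceed in parallel, the serial remainder caps the overall speedup no matter how many execution units you add. A minimal sketch (the 60% parallel fraction is a made-up figure for illustration, not a measurement):

```python
def speedup(parallel_fraction, units):
    """Amdahl's Law: overall speedup when only part of the work scales."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / units)

# Assume (hypothetically) that 60% of a typical program can run in parallel.
p = 0.6
for n in (1, 2, 3, 4, 8):
    print(f"{n} execution units -> {speedup(p, n):.2f}x")

# The gains flatten fast: the ceiling is 1 / (1 - p) = 2.5x here, so the
# step from 3 to 4 units buys far less than the step from 1 to 2.
```

Each added unit attacks only the shrinking parallel slice, which is why CPU designers saw rapidly diminishing returns past a handful of execution units.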

The Last Platform for True Believers?

Which brings us, in a typically circuitous fashion, to this week's Intel Developer Forum Conference 2001.

The chip announcement was that the new Itanium processors would be shipping in the coming months, with full production a trifle behind expectations. As for the long-awaited McKinley design, we can keep waiting.

But oddly enough, it was the name that struck me most. Not the "Forum Conference" redundancy, but the "Intel Developer" aspect. Do you think of yourself as an Intel developer? While there are some holdouts for platform loyalty in the Microsoft camp, it seems pretty clear that today's developers are used to working on multiple platforms. Forget the days of the Wintel hegemony; today's developers face a variety of platforms, from servers to handhelds and cell phones. Today's environment rewards those who aren't locked into a single platform for their expertise -- at least for the front and middle layers of today's systems. The "hot" trends of the last few years have been XML and Java, technologies that are notably platform-independent. Even Microsoft's .NET promotions have promised support on multiple platforms.

If there's one candidate for development platform loyalty today, it's Linux. Like Windows developers, the penguin crowd still seems to identify with its OS. But with the increased acceptance of Linux, the religious fervor seems to have abated somewhat. With Linux prominently featured in IBM's mainstream radio and TV ads (and, I'm told, Intel CPU demos), it's only a matter of time until your average Linux developer becomes, if not platform-neutral, at least cross-platform savvy. Market research indicates Linux usage has already become mainstream in numbers, if not attitude. For example, International Data Corporation's latest estimates put Linux at 27% of last year's server OS market -- second only to Microsoft's Windows.

Tyler Sperry

Tyler Sperry is a freelance writer working with
