A Multicore Processing Primer
Issue: 13.6 (November/December 2015)
Author: Markus Winter
Author Bio: Markus is a Molecular Biologist who taught himself REALbasic programming in 2003 to let the computer deal with some exceedingly tedious lab tasks. Some call it lazy, he thinks it smart. He still thinks of himself as an advanced beginner at best.
Article Description: No description available
Article Length (in bytes): 21,252
Starting Page Number: 3
Article Number: 13605
Related Web Link(s):
Excerpt of article text...
Not so long ago—funds permitting—a CPU's frequency rating (measured first in MHz and later in GHz) seemed to be all that mattered when buying a new computer. Quite wrongly, as it turned out. In their attempts to lure customers with ever-higher clock speeds, the designers often made chips that did less and less per clock cycle. In some cases a 1.2 GHz CPU could easily beat the (metaphorical) pants off a different CPU running at 2.4 GHz.
To explain this, a simple analogy will be handy. If computers were bikes, then the frequency measured in MHz/GHz would be equivalent to how fast the wheel turns. It is obvious to everyone that a large wheel can cover more ground per turn than a small wheel. Just look at Figure 1—the old large bike is most certainly faster than the newer tricycle (even though the tricycle has an extra wheel).
But there is another problem. My brother once bought a laptop with a 2.4 GHz "Desktop Class" Pentium 4 processor. But whenever it got hot in summer the laptop would shut down after a few minutes. Again, the bike analogy comes in handy: just as the rider on the small tricycle will sweat more trying to keep up with the big bike, so a CPU running at 2.4 GHz produces more heat than a slower-clocked CPU. This meant that the CPU was often forced to slow down to prevent overheating. In effect, most CPUs never ran at their maximum speed. Sometimes because they didn't need to—like when the computer was idle (thereby conserving energy, which is a good thing)—but sometimes because they couldn't (because of overheating). That throttling technology even got its own name: SpeedStep. And when even SpeedStep couldn't cope, a computer would shut down.
But understanding all that required technical knowledge that was—and still is—beyond the capabilities of most people. (I'm not being smug about it, as I'm well aware that car maintenance or repairing the dishwasher is well beyond my capabilities!)
Nowadays it is less about MHz and GHz and all about cores. The current mantra is "the more cores the better!" You'd be hard-pressed to find a single-core computer to buy today. Actually, you'd be even worse off trying to sell a single-core computer today. Everyone has heard of cores, and the number of cores has even become a selling point for smartphones (some are advertised as having eight cores).
With cores, the analogy isn't bikes, but shovels. Eight small shovels might theoretically have the same capacity as two big shovels in moving a heap of sand. But it depends on the circumstances. What if only one shovel at a time can get to the heap of sand? Some tasks lend themselves easily to being split up, like video or image processing (so eight separate cores might have an advantage), but many tasks don't (in which case one powerful core is better).
...End of Excerpt. Please purchase the magazine to read the full article.