2013-10-02

Co.Labs

Your Future Code May Run On Imperfect Circuitry, And That's A Good Thing

Some of the cleverest advances in computer design may happen thanks to chip circuits that are at least partly broken.



Have you ever looked at a high-resolution photo of a silicon chip? It can be a beautiful thing to see, a maze of intricate lines and blocks with a complexity that boggles the mind, all rendered into crystalline perfection. Except that the quest for perfection, according to the director of the Institute for Microengineering, may be one strange limiting factor for future computer design. Instead, Christian Enz would have us embrace something that seems weird in the uber-accurate world of computing: flaws.

Shrinking the dies used to make conventional silicon chips has led to some amazing advances, including the impressively powerful CPUs that power your latest smartphone. But in the quest for ever-smaller chip tech, for example Samsung's plan to create 3-D chips at the 7nm scale versus today's typical 20nm, chip manufacture bumps up against tough limits imposed by physics and raw engineering problems. Errors can creep into the production process, and to keep the tech working, extra hardware is added to the chip design to mitigate the flaws that would otherwise stop a chip from working. That means extra power consumption and, contrary to the whole point of shrinking, more hardware on the chip itself.

So Christian Enz is pushing for a radical alternative approach that would see chip designers, and ultimately coders, embracing flaws and the joys of unreliable circuitry.

Enz's argument is persuasive. Look at smartphone screen design: a flaw in a pixel or two will almost certainly go unnoticed by the phone's owner, because the human eye tolerates errors well. In the case of the iPhone's Retina screen, as we pointed out a while back, the human eye actually can't resolve an individual pixel, so as long as there aren't too many of them, the "dead pixel" problem of old has effectively gone away. That's quite an unexpected spin-off of the innovation Apple and its manufacturing partners achieved. Apply this sort of thinking to chip design in general and you can make chips that are equally "good enough," with flaws the end user can tolerate or never even perceives.

For an example, think of a circuit that simply adds numbers. If the numbers run through the circuit rarely have decimals, you could simplify the circuit itself, leaving out the components that handle the fractional part. The circuit will then produce less exact results, but on the whole it will do well enough: its precise quality metrics will have slipped, yet it will still fit many purposes, as long as it meets minimum standards. The example is deliberately simple, but the idea scales up to the transistor circuitry on much more complex chips.
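To make that adder example a little more concrete, here's a minimal sketch in Python, not from Enz, of a hypothetical fixed-point adder that simply drops the fractional bits before adding, which is roughly what a hardware designer gains by leaving out the logic for those bits. The 8-bit fractional format and the function names are illustrative assumptions.

```python
# A hypothetical illustration of "good enough" addition: a fixed-point adder
# that drops the fractional bits, mirroring a circuit that omits the hardware
# needed to handle decimals. The 8-bit fractional format is an arbitrary
# assumption for this sketch.

FRACTIONAL_BITS = 8
SCALE = 1 << FRACTIONAL_BITS  # 256


def to_fixed(x: float) -> int:
    """Encode a real number as a fixed-point integer."""
    return int(round(x * SCALE))


def exact_add(a: int, b: int) -> int:
    """A full adder: every bit, fractional ones included, is handled."""
    return a + b


def approximate_add(a: int, b: int) -> int:
    """An 'imperfect' adder: zero the fractional bits first, so the hardware
    analogue never needs the adder stages (or carry chain) for those bits."""
    mask = ~(SCALE - 1)
    return (a & mask) + (b & mask)


if __name__ == "__main__":
    x, y = to_fixed(3.14), to_fixed(2.0)
    print(f"exact:  {exact_add(x, y) / SCALE:.3f}")        # 5.141
    print(f"approx: {approximate_add(x, y) / SCALE:.3f}")  # 5.000
    # The error is bounded (always under 2.0 in this format), which may be
    # perfectly acceptable when the inputs rarely carry decimals at all.
```

The point isn't the arithmetic itself: the dropped bits stand in for hardware you no longer have to build, power, or verify.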

In some ways, Enz's approach echoes what was used to create the recent carbon nanotube computer. Stanford's scientists knew that an array of nanotubes naturally includes some flawed ones, even after you evaporate the worst offenders, so they worked out an algorithm that lets the chip tolerate a few misbehaving transistors. The suggestion is that in near-future chips, particularly those where size and power consumption are critical, as in smartphones, a similar sort of "good enough" design will have real benefits for the end user. You've heard the idea that code is more of an art form than a science? It looks like the chips your next genius code will run on will be similarly a bit more "arty" and a little less obsessive about precision.

[Image: Flickr user yellowcloud]
