Sunday, March 20, 2022

Correcting Our Errors About Errors

"In the days when top-of-the-line computers contained rooms full of vacuum tubes, their designers had to keep the tubes' limitations carefully in mind. They were prone to degrade over time, or suddenly fail altogether.

Partly inspired by this problem, John von Neumann and others launched a new field of investigation, epitomized by von Neumann's 1956 paper "Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Parts." In it, he wrote, "Our present treatment of error is unsatisfactory and ad hoc. It is the author's conviction, voiced over many years, that error should be treated by thermodynamical methods." He added, "The present treatment falls far short of achieving this."

Thermodynamics and statistical mechanics are the powerful methods physics has developed to derive the properties of bodies -- such as their temperature and pressure -- from the basic behavior of their atoms and molecules, using probability. Von Neumann hoped to do something comparable for the complex basic units, analogous to atoms, that process information.

The emerging theory was short-circuited, so to speak, by developments in semiconductor technology and molecular biology. Solid-state transistors, printed circuits and chips are models of reliability when assembled meticulously enough. With their emergence, the focus of engineers turned from coping with error to avoiding it. The most basic processes of biology share that focus: As cells read out the information stored in DNA, they do rigorous proofreading and error correction, to nip potential errors in the bud.

But the old issues are making a comeback as scientists push the boundaries of technology and ask more ambitious questions. We can make transistors smaller and faster -- and relax manufacturing requirements -- if we can compromise on their reliability. And we will only understand the larger-scale, sloppier processes of biology -- such as assembling brains, as opposed to assembling proteins -- if we take on von Neumann's challenge.

A lot of progress in overcoming processing errors has been made since 1956. The internet is designed to work around nodes that malfunction or go offline. (Early research aimed to ensure the survival of communication networks following a nuclear exchange.) Artificial neural nets can do impressive calculations smoothly despite imprecision in their parts, using a sort of probabilistic logic in which each unit takes averages over inputs from many others.

We've also come to understand a lot more about how human brains get wired up and process information: They are made from vast numbers of biological cells that can get miswired, die, or malfunction in different ways, but usually still manage to function impressively well. Blockchains and (so far, mostly notional) topological quantum computers systematically distribute information within a resilient web of possibly weak components. The contribution of failed components can be filled in, similar to how our visual perception "fills in" the retina's famous blind spot.

Von Neumann's concern with unreliable parts fits within his vision of self-replicating machines. To reproduce themselves, such automatons -- like the biological cells and organisms that inspired them -- would need to tap into an unpredictable, unreliable environment for their building material. This is the engineering of the future. Plausibly, it is the path to some of science fiction's wildest visions -- terraforming of planets, giant brains, and more.

You won't get there without making, and working around, lots of mistakes. Ironically, if semiconductor technology hadn't been so good, we might have taken on that issue sooner, and be further along today." [1]
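
The voting and averaging idea running through the piece, from von Neumann's redundant "probabilistic logics" to neural-net units that pool many imprecise inputs, can be made concrete with a toy simulation. The sketch below is purely illustrative; the component model, error rate, and copy count are assumptions, not details from the article or from von Neumann's 1956 paper. It wraps an unreliable binary part in a majority vote over several copies and compares the two error rates.

import random

def noisy_gate(x, error_rate=0.05):
    # A single unreliable component: flips its output bit with probability error_rate.
    # (Illustrative model, not taken from the article.)
    return x if random.random() > error_rate else 1 - x

def majority_of(x, copies=9, error_rate=0.05):
    # Run several unreliable copies of the same computation and take a majority vote.
    votes = sum(noisy_gate(x, error_rate) for _ in range(copies))
    return 1 if votes > copies / 2 else 0

trials = 100_000
single_errors = sum(noisy_gate(1) != 1 for _ in range(trials))
voted_errors = sum(majority_of(1) != 1 for _ in range(trials))
print(f"single component error rate: {single_errors / trials:.4f}")
print(f"9-way majority error rate:   {voted_errors / trials:.6f}")

With a 5% per-component error rate, the nine-way vote fails only when five or more copies err at once, which happens only a handful of times in a hundred thousand trials: reliability is bought with redundancy rather than with better parts.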
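
The quoted point about "filling in" the contribution of failed components is, at bottom, redundancy: spread information across several parts together with some derived check data, and a missing part can be rebuilt from the survivors. The sketch below is a deliberately minimal stand-in using a single XOR parity shard; it is nothing like the actual encodings used in blockchains or topological quantum computing, but it shows one lost component being reconstructed from the rest.

from functools import reduce
from operator import xor

def add_parity(shards):
    # Store the data across several components plus one XOR parity component.
    return shards + [reduce(xor, shards)]

def recover(stored, lost_index):
    # Rebuild a single failed component from the surviving ones.
    survivors = [v for i, v in enumerate(stored) if i != lost_index]
    return reduce(xor, survivors)

data = [0b1011, 0b0110, 0b1110]                    # three data shards (hypothetical values)
stored = add_parity(data)                          # four components in total
assert recover(stored, lost_index=1) == data[1]    # shard 1 "fails" and is reconstructed
print("recovered shard:", bin(recover(stored, lost_index=1)))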

1. Wilczek, Frank. "Wilczek's Universe: Correcting Our Errors About Errors." Wall Street Journal, Eastern edition, New York, N.Y., 19 Mar 2022: C.4.

 
