
Monday, March 30, 2026

Network Theories

 

“It is in the nature of the mind to wander. For instance, Emily Dickinson does not appear in Tom Griffiths's "The Laws of Thought," yet in the course of reading the book one might be reminded of a line from her poetry: "Hope is the thing with feathers."

 

The word "bird" appears some 20 times in these pages; "canary" also appears frequently, along with one "ostrich" and a few "feathers."

 

In the late 1980s, Mr. Griffiths notes, the psychologist David Rumelhart used neural-network models to study, among other things, an owl's sound-localization system. The computational model of a neural network mimicked the owl's auditory map -- the flow of auditory information -- which helped Rumelhart derive a theory about how the owl's system developed and worked.

 

Come to think of it, thoughts are perhaps not unlike a murmuration of starlings, flitting and swooping and morphing with seemingly serendipitous activation. As Mr. Griffiths, a professor of psychology and computer science at Princeton University, describes: "A mental state is a point in space, summarizing the activation of all the concepts inside our head. A thought traces a curve in activation space."

 

But "The Laws of Thought" is not a stream-of-thoughts tale. It's a rigorous and captivating account of how cognition can be modeled via three mathematical frameworks: logic, artificial neural networks ("mathematical systems that emulate the operation of the brain") and probability theory. It's a "quest for a mathematical theory of the mind," as the subtitle puts it -- opening with Aristotle and ending with artificial intelligence.

 

Aristotle was among the first to contemplate the question of how to think -- more specifically, how to make a good argument -- with his systematic structure of syllogisms, or arguments set up in the form of two premises and a conclusion. In the 17th century, Gottfried Wilhelm Leibniz devised an unsuccessful scheme to turn Aristotle's system of logic into arithmetic. More successfully, in the 19th century, George Boole -- mathematician, logician, theoretical psychologist -- "fundamentally changed our understanding of logic," Mr. Griffiths tells us, by "showing how reason could be captured by a formal system." Boole translated logic into algebra -- an algebra of logic, now called Boolean algebra -- transforming logic from a philosophical, rule-based system into a mathematical, symbolic one.
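Boole's translation can be sketched concretely: over the values {0, 1}, the logical connectives become ordinary arithmetic. A minimal Python illustration, using the standard textbook encoding rather than anything quoted from the book:

```python
# Boole's idea in miniature: logic as arithmetic over {0, 1}.
AND = lambda x, y: x * y          # conjunction behaves like multiplication
OR  = lambda x, y: x + y - x * y  # disjunction, corrected for double-counting
NOT = lambda x: 1 - x             # negation as complement

AND(1, 1)  # truth AND truth yields truth, i.e. 1
```

Once the connectives are arithmetic, laws of logic such as x AND x = x become algebraic identities that can be checked by calculation.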

 

The notion of the brain as a logic machine has origins in the early 1940s with the work of Warren McCulloch and Walter Pitts. They are credited with proposing the first mathematical model of a neural network to explain how the brain's biological connectivity produces complexity -- thus leading to a computational theory of the mind.
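The McCulloch-Pitts model -- a neuron as a simple threshold unit -- can be sketched in a few lines. The weights and thresholds below are the standard textbook choices for realizing logic gates, an illustration rather than a detail from the review:

```python
def mp_neuron(inputs, weights, threshold):
    # A McCulloch-Pitts unit fires (outputs 1) exactly when the
    # weighted sum of its binary inputs reaches the threshold.
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Logic gates as single threshold units:
AND = lambda a, b: mp_neuron((a, b), (1, 1), 2)
OR  = lambda a, b: mp_neuron((a, b), (1, 1), 1)
NOT = lambda a: mp_neuron((a,), (-1,), 0)
```

Because these units compose, networks of them can compute any Boolean function -- the observation that linked the brain's connectivity to computation.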

 

"Logic and neural networks are two threads tracing their way back through time, each corresponding to a different way of thinking about the mind," Mr. Griffiths writes. "Probability theory is a third thread that is intertwined with both of them." It's the branch of mathematics that analyzes randomness, uncertainty and the likelihood of certain outcomes.

 

A fundamental and powerful concept in probability is Bayes's theorem, developed by the 18th-century statistician Thomas Bayes. It's a formula that expresses uncertainty in terms of probability, rationally updating prior beliefs and reducing uncertainty based on observed evidence. For some, the Bayesian framework is essentially a working definition of rationality or common sense: As one learns more, one's views change -- whether about the weather or the deadliness of a disease like Covid-19.
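The update the review describes can be made concrete with a minimal sketch of Bayes's theorem. The disease-test numbers below (1% prevalence, 90% sensitivity, 5% false-positive rate) are assumptions chosen for illustration, not figures from the book:

```python
def bayes_posterior(prior, likelihood, likelihood_given_not):
    """Posterior P(H|E) via Bayes's theorem:
    P(H|E) = P(E|H) * P(H) / P(E)."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: a 1% prior, a test that detects 90% of true
# cases but also flags 5% of healthy people.
posterior = bayes_posterior(prior=0.01, likelihood=0.9,
                            likelihood_given_not=0.05)
# A positive test raises belief from 1% to roughly 15% -- updated, not certain.
```

The surprise in such examples -- that a positive result still leaves the hypothesis unlikely -- is exactly the kind of rational belief revision the Bayesian framework formalizes.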

 

Cognitive scientists have investigated whether the brain itself is Bayesian -- a probabilistic inference engine, perpetually recalibrating, merging incoming sensory data with past experience. "Many thinkers," Mr. Griffiths notes, have "had an interest in probability, whether they tried to understand the mind in terms of rules and symbols or networks, spaces, and features."

 

Probability was integral to David Marr's understanding of brain function. A pioneer in computational neuroscience, Marr was best known for his work on vision, treating it as a multilevel system and not merely a system of neurons. He, too, used an avian analogy: "Trying to understand perception by studying only neurons is like trying to understand bird flight by studying only feathers," Marr said. "It just cannot be done. In order to understand bird flight, we have to understand aerodynamics; only then do the structure of feathers and the different shapes of birds' wings make sense."

 

And so, Mr. Griffiths asks, "what is the equivalent of aerodynamics for understanding the mind?" He observes that, from an engineering perspective, it might be futile to try making machines that are fully human -- after all, our various abilities have been refined through a lengthy, data-rich evolution. "It's certainly easier to create systems that are more specialized," he notes. "Jets don't catch worms, build nests, find mates, lay eggs, or care for their young, but they do a great job of flying from one place to another."

 

"Machines," Mr. Griffiths continues, "don't operate under the same constraints as humans -- our finite lives, finite brains, and limited capacity to communicate have shaped the nature of human intelligence. We might thus expect that human minds will continue to be a little bit special, even as we continue to develop smarter machines."

 

One early smart machine was the Mark I Perceptron, an "artificial brain," invented in 1958 by the psychologist Frank Rosenblatt, that could learn to classify simple patterns, such as geometric shapes and handwritten letters. When asked by a reporter from the New Yorker magazine what the Perceptron wasn't capable of, "Dr. Rosenblatt threw up his hands. 'Love,' he said. 'Hope. Despair. Human nature, in short.'"
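A perceptron of the kind Rosenblatt built can be sketched in a few lines of Python. This toy version uses his error-correction rule to learn logical OR; the task, weights, and learning rate are assumptions for illustration, not details from the review:

```python
def train_perceptron(samples, epochs=10, lr=1.0):
    # Rosenblatt's rule: nudge the weights toward each misclassified example.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred              # zero when the prediction is right
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# A linearly separable toy task: logical OR on binary inputs.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
```

Like the Mark I, such a machine classifies simple patterns it was never explicitly programmed for -- and, as with the Mark I, its single layer can only separate what a straight line can.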

 

---

 

Ms. Roberts is the author of "Genius at Play" and "King of Infinite Space." She is the executive editor of the American Mathematical Society's Notices journal.” [1]

 

1. Roberts, Siobhan. "Network Theories." Wall Street Journal, Eastern edition, New York, N.Y., 30 Mar. 2026: A15.
