"Since the release of ChatGPT to the public two years ago, we have been awash in extreme claims about the potential benefits and threats of large language models and generative A.I. Boosters and critics alike believe the technology’s emergence is an inflection point in human history.
Proponents claim that artificial intelligence will eliminate acts of mental drudgery, freeing us to become our best selves. Detractors worry not just that it will eliminate well-paying knowledge-sector jobs and increase inequality but also that it will effectively steal the human soul. Once computers can produce songs and shows, paintings and poems indistinguishable from the work of living hands, the last remnants of human exceptionalism will be snuffed out.
Recently, this fever has shown signs of breaking. Even the technology’s champions acknowledge that it might have been overhyped a bit. Perhaps the emergence of machine learning will be merely a major technological transformation and not a world-historical event. Now might also be the time to retire our worst fears about the technology.
I’m talking not about the quite reasonable anxiety surrounding the potential social and economic disruption of a powerful new technology but about the fundamental worry that a digital machine might one day exhibit — or exceed — the kind of creative power we once believed unique to our species. Of course, the technology is still relatively young, and it might make good on many of its promises. But the obsolescence of human culture will almost certainly not come to pass.
The root of this worry is not an overestimation of technology but a radical underestimation of humanity.
Our narrowing sense of human capability did not begin with the rise of artificial intelligence. Like a surprising number of recent cultural developments, it originated in the academy a half-century ago. Along with such related concepts as truth and beauty, the ideal of human creativity was among the bourgeois idols that the postmodern critical theorists of the 1960s and ’70s sought to deconstruct.
In a famous 1967 essay, “The Death of the Author,” the poststructuralist critic Roland Barthes argued that a text is “a multidimensional space in which a variety of writings, none of them original, blend and clash.” Writers, he insisted, “can only imitate.” Human culture is the product of large, impersonal forces — particularly material economic forces — not heroic individual action. The myth of artistic originality represents “the epitome and culmination of capitalist ideology.”
A decade later, the sociologist Pierre Bourdieu published the wildly influential study “Distinction,” in which he treated aesthetic judgment as an expression of “cultural capital” and aesthetic distinctions like that between high and low culture as forms of social control that exist to perpetuate class hierarchies and serve the material interests of capital.
In their efforts to tear down the myth of human creativity, these thinkers received support from a surprising place. For more than a generation, the primary intellectual alternative to the various strains of Marx-inflected critical theory has been a highly rationalized, neo-Darwinist scientific materialism with roots in the gene-centered view of evolution that emerged at the same time that Barthes and his peers were kicking off the critical turn. The two movements are generally hostile to each other, but they are in strange agreement on the matter of human culture.
In an effort to assimilate seemingly inexplicable human achievements into the evolutionary framework of natural selection acting on blind variation, the evolutionary theorist (and popularizer of the gene-centered theory) Richard Dawkins developed the concept of memes — self-replicating units of cultural meaning — and suggested that the mind plays passive host to memes in the same way the body does genes.
The upshot is a view remarkably similar to that of the critical theorists: We are all in thrall to impersonal, systemic forces. Culture dictates human action far more than individual humans dictate cultural production. To understand great works of art as human achievements is just as backward as understanding the beauty and variety in nature as the work of divine hands.
On the matter of human psychology, the neo-Darwinist subdiscipline of cognitive science tells us that our brains are algorithms for the processing of information. Through brute trial and error, we have learned which algorithmic outputs will produce material rewards. Subjective, qualitative experience, which inevitably retains a hint of the spiritual, has been removed from the picture, as has the idea that individual humans might be capable of acting in ways not determined by millenniums of genetic history.
The combined influence of these views of human creativity has been enormous. As many commentators have noted, our culture has largely given up on originality. Hollywood can’t quit repackaging comic book universes and tired old TV shows. Popular musicians cycle through existing styles — taking turns at country, synth-infused 1980s pop, dance hall — rather than develop distinctive sounds. Literature has become dominated by autofictional mining of personal experience, revisionist retellings of classic works and various literary genre exercises.
The meme — in the narrower sense of the term adopted by the internet — has become our signature form of cultural production. Memes are artifacts whose origins are generally obscure at best, ones that exist almost exclusively for the sake of being transformed into new texts in the form of tweets and takes, which are further repackaged and reposted. A scene from an auteurist marriage drama, a political post, a celebrity eye roll, a new Starbucks menu offering are all essentially fungible, grist for the mill.
For many people, this cultural leveling has felt liberating, just as the critical theorists hoped. But it has also brought a persistent feeling of antihumanist despair. What has been entirely lost in all of this is any hope that some combination of inspiration and human will could bring something truly new into the world, that certain works of individual insight and beauty might transcend the admittedly very real influences of the political and economic contexts in which they are created.
This spirit is exemplified by the dread over artificial intelligence. We have credulously swallowed an idea of culture as an empty power game and ourselves as walking algorithms — an idea that runs contrary to our deepest experiences. Now we are terrified that some other algorithm might prove more powerful at the game than we are. One way to step back from the brink might be to allow ourselves now and then to recognize and appreciate the truly transformative power of human genius.
Although it’s generally associated with the Romantic era, the notion of genius is nearly as old as culture itself. From the beginning, it has indicated something other than intelligence, even intelligence of an extreme sort. Socrates claimed throughout his life to be visited by a spirit — “daimonion” in Greek, “genius” in Latin. The spirit did not grant him any substantive knowledge; it only guided his actions, particularly warning him against certain behavior. His career as a public gadfly began when an oracle praised him for knowing more than any other citizen of Athens. Believing himself to know nothing, he started wandering the city, asking prominent Athenians some basic questions: What is truth? What is knowledge? What is justice? From their answers, he concluded that he was ahead of the pack only because he recognized his own ignorance.
In the Christian era, the person touched by genius gave way to the mystic-saint who had achieved ecstatic moments of unity with God. While a small number of people dedicated their lives to mystical practice, an immediate encounter with the divine was recognized as a possibility for any person at any time. Reality had deep truths that could not be arrived at by way of the intellect, and these truths could make themselves manifest in surprising ways.
With the rise of the Enlightenment, tutelary spirits and divine visitations went out of favor, but secular culture could not quite do away with the allure of genius. People now spoke of certain human beings as being geniuses, not as having geniuses, but the term still indicated something other than simple intelligence. The German philosopher Immanuel Kant claimed all truly great art — the kind that could transform us rather than simply entertain us — was the product of genius. Whereas conventional artistic creation involved imitation and the following of existing procedures, geniuses made their own rules. They acted not by science but by inspiration, with all of the spiritual trappings that term implies.
Even in an era of extraordinary technological breakthroughs, genius was most likely to be identified with artists and poets, in part because there was still a belief that art could deliver profound human truths unavailable to us through other means. For the Romantics, one good indicator of genius was a marked lack of aptitude for mathematics, which they considered simply a technical skill grounded in following rules. Something of this old thrust still applied to our elevation of the 20th century’s great scientific geniuses. Figures like Einstein, Gödel, von Neumann and Oppenheimer were thought to possess intuitive powers that seemed only tangentially related to their quantitative abilities.
There are many reasons our culture has largely given up on geniuses, some of them very good. We’re living in a thoroughly fraudulent era whose signature figures, from Donald Trump to Sam Bankman-Fried, have made the claim of genius central to their frauds. We are also increasingly sensitive to the harm caused when genuine creative talent is excused for abusive behavior. On all fronts, we have become rightly impatient with those who make up their own rules.
But our suspicion of genius runs much deeper than the social and political turmoil of the past decade. In fact, it’s as old as the concept of genius itself. While Socrates inspired a nearly religious devotion in his followers, many Athenians found him frankly ridiculous. Still others found him dangerous, and this faction managed to sentence him to death, a verdict he accepted with equanimity. (He didn’t mind leaving this life, he reported, because his genius had nothing to say against it.)
Many of the holy figures of medieval Christianity resembled Socrates not just in their humility and simplicity but also in the threat they posed to the surrounding society whose norms they rejected. Often enough they faced death at that society’s hands as well. Even Kant noted the perpetual challenge of distinguishing the genius from the charlatan: “Nonsense, too, can be original,” he acknowledged.
What seems to have changed more recently is not our understanding of the risk of the pseudo-genius but our suspicion of the very possibility of genius of the genuine sort, and this has everything to do with the larger cultural developments of the past 50 years. If the critical turn “means anything,” the American theorist Fredric Jameson wrote in a classic study of postmodernism, it signals the end of “quaint romantic values such as that of the genius.” From the vantage point of cognitive science, meanwhile, the classic notion of genius makes no sense. Clearly some people have more processing power than others, but the idea of some cognitive quality other than sheer intelligence is incoherent.
Our culture seems now to reserve the designation of genius almost exclusively for men who have put quantitative skill to work in accumulating enormous wealth. Bill Gates, Steve Jobs, Mark Zuckerberg and Elon Musk have all been awarded the status at one time or another. In the process we have rendered the term useless. When the clearest sign of genius is a net worth in 10 figures, we have come a long way from the ascetic outsider who suffers for the truth. Under such conditions, it’s hardly surprising that the term is treated as a cynical bit of marketing we’d be better off without.
Yet we might have given up more than we can afford to lose, as the great A.I. panic demonstrates. Ironically, our single greatest fear about A.I. is that it will stop following the rules we have given it and take all its training in unpredictable directions. In other words, we are worried that a probabilistic language-prediction model will somehow show itself to be not just highly intelligent but also possessed of real genius.
Luckily, we have little reason to think that a computer program is capable of such a thing. On the other hand, we have ample reason to think that human beings are. Believing again in genius means believing in the possibility that something truly new might come along to change things for the better. It means trusting that the best explanation will not always be the most cynical one, that certain human achievements require — and reward — a level of attention incompatible with rushing to write the first reply. It means recognizing that great works of art exist to be encountered and experienced, not just recycled. Granted, it also means making oneself vulnerable to the pseudo-genius, the charlatan, the grifter. But belief of any kind entails this sort of risk, and it seems to me a risk worth taking, especially when the alternative is a stultifying knowingness.
If we really are better off without the Romantic idea that certain people are exceptions to the general rule of humanity, that they will favor us with their insight if only we don’t force them to abide by the constraints that apply to the rest of us, perhaps we could instead return to the old Socratic-mystic idea that genius might visit any of us at any time. There is a voice waiting to whisper in our ears.
Everything about our culture at the moment seems designed to eliminate the space for careful listening, but the first step in restoring that space might be acknowledging that the voice is out there and allowing that it might have something important to say.
Christopher Beha is a memoirist and novelist." [1]
1. Beha, Christopher. “A.I. Isn’t Genius. We Are.” Guest Essay. The New York Times (Online), New York Times Company, Dec. 26, 2024.