Asking if computers will be more intelligent than humans distracts us from grasping the underlying ethical problem with the humans who create and use them. (Shutterstock)
In the age of the Anthropocene, humanity appears poised to destroy itself.
Each day brings a reminder of another threat to our peace and security. War, political instability and climate change send migrants and refugees across national borders. Cybercriminals hack networks of public and private institutions. Terrorists use trucks and planes as weapons.
And hanging grimly above us all, like the sword of Damocles, lurks the threat of total nuclear annihilation.
At the root of these threats is a problem that is as old as humanity itself.
In the domain of survival and reproduction, human intelligence stands out for one specific reason. We are the only species on earth for whom intelligence is also an ethical liability. As the anthropological critic Eric Gans has argued, we are the only species for whom the problem of our violence is also our greatest existential threat.
Insights from western literature and myth point to the ethical problem at the core of human intelligence. How we understand the role of humans' symbolic communication, including language, in establishing ethical relations has profound consequences for our society.
An ethical liability
For most of human history, controlling human conflict has been the task of religion. For example, among hunting and foraging societies, carefully prescribed rituals must be followed when meat is distributed after a successful hunt.
Animals are difficult to track and kill. Meat is rare and highly valued. Consequently, violence is more likely to break out during distribution. Religion provides an ethical guide to the peaceful distribution of meat.
The ethical problem of human violence has also been explored by literature.
For example, my work on Shakespeare examines his plays as a systematic attempt to understand the origin of human conflict. Shakespeare’s plays depict in exquisite detail humanity’s penchant for self-destruction.
Before Shakespeare, Homer’s epic poem the Iliad treated similar themes. Homer’s focus was not simply the war between Greeks and Trojans but, more precisely, Achilles’s resentment of his king, Agamemnon, who has used his authority to appropriate Achilles’s war captive, Briseis.
Achilles is by far the better fighter, but if the Greeks are to win the war, Achilles must learn to defer his resentment of his superior.
Monster as metaphor
In the scientific and technological revolutions of the modern era, this lesson receives a peculiar twist in science fiction, beginning with Mary Shelley’s Frankenstein.
In Mary Shelley’s novel, the protagonist Victor Frankenstein succeeds in creating a being that is capable of thinking for itself. But Victor’s creature very quickly becomes Victor’s hated rival, which is why Victor refers to his creation as a hideous monster. Victor has what his rival wants, namely, a wife and, therefore, the prospect of children. Victor’s monster is a metaphor for the violence humans inflict on one another.
Of course, all animals compete for scarce resources. In this Darwinian competition, violence between rivals is inevitable. Other social animals, like chimpanzees, have well-developed pecking orders that allow conflict over disputed objects to be defused or constrained. The beta animal may challenge the alpha in a fight. If it wins, it takes the alpha position.
But these challenges for dominance are never represented symbolically as existential threats to the social order.
Only humans represent their capacity for violence symbolically in religion, myth and literature because humans are the only animals for whom the greatest danger is themselves.
Establishing mutual attention: an ethical task
The dominant view today is that human intelligence is measured by how fast an individual brain can process information. This picture of the human brain as an “information processor” is itself a product of the belief that the most important thing about speech is to communicate facts about the world.
But what this picture misses is a more fundamental task of language: establishing mutual attention.
Michael Tomasello, a professor of psychology and neuroscience who specializes in social learning, notes that at around nine months of age, children engage in what he calls joint attentional scenes.
The child's mother may point to some flowers and say, "Pretty flowers!" What is significant is not merely that the mother has uttered words, but that the child is being invited to engage in joint attention with the mother. The flowers are being made present to the child as an object of shared, collective and aesthetic attention.
An ethical social order
These insights demonstrate that establishing a human sense of the world depends on our relationships with other people. An ethical social order depends on ethical relationships.
In the age of social media, the rapid rise of extreme ideologies and conspiracy theories has underscored the ineffectiveness of focusing on empirical truth alone to combat extremism. Many people remain enthralled by charged and incendiary speech or ideologies.
This fact ought to remind us that before we can communicate a concept, we must establish a scene of joint attention.
The view that language is mostly about communicating concepts has consequences beyond encouraging us to underestimate the threat posed by polarizing, divisive or hate speech. This view also encourages us to see people as discrete storehouses of information, who are valuable to us for our own use, instead of in their own right.
Forgetting our ethical responsibilities
Increasingly, our conversations are mediated by the ubiquitous digital screen. This is convenient, of course, but convenience comes with a cost.
The cost could be that we forget our ethical responsibility to others.
When technologists assert that computers may soon be smarter than humans and that artificial intelligence represents an existential threat to humanity, they distract us from grasping the underlying ethical problem, which lies not in the computer but with the humans who create and use it.
Richard van Oort, Professor of English, University of Victoria
This article is republished from The Conversation under a Creative Commons license. Read the original article.