Is AGI a threat to Humans?

September 07, 2024

The unparalleled advancements in every field of science and the modern philosophies of reality are merging into a new phenomenon: transhumanism. It is a movement that advocates the use of technology to enhance the human condition and expand human capabilities. We now believe it is possible for man to transcend the boundaries set by physical impediments, to venture beyond biological limits by integrating technology with the human body. Through this process of integration, humans may eventually evolve into something fundamentally different, which we may call “posthumans”: beings in which biology and technology exist in a symbiotic relationship and cognition is powered by Artificial General Intelligence (AGI). Though these are impressive ideas, they carry deep philosophical implications. Ethical and existential concerns arise from tampering with human nature, and one of them, consciousness, is the concern we will delve into in this article.

AGI, or Artificial General Intelligence, is a type of AI that is said to be capable of sentience and cognition, of learning faster than humans, and of adapting to new conditions; in other words, of possessing abilities similar or even superior to those of humans. The claim that AGI will be conscious and will replace humans is a bold one, because the mystery of consciousness remains unsolved. No one truly understands the nature of awareness, let alone how to replicate it in a computer.

What is consciousness? It is the ability to be aware of one’s own existence and surroundings; it is self-awareness, the ability to reflect on the self and its thoughts. It is to experience feelings and sensations, to understand meaning and the concept of meaning itself. One of the most unique aspects of the human being is the ability to reason. The Greek Stoic philosopher Epictetus, a slave who influenced the likes of Marcus Aurelius, once said something quite remarkable about the human intellect: “In general, you will find no art or faculty that can analyze itself, therefore none that can approve or disapprove of itself… So what can? The faculty that analyzes itself as well as the others, namely, the faculty of reason. Reason is unique among the faculties assigned to us in being able to evaluate itself—what it is, what it is capable of, how valuable it is—in addition to passing judgment on others.” This is indispensable to our inquiry, because if AGI does not pass the test of reason, it cannot be considered conscious in any form.

So, the question is: is AGI capable of evaluating itself, of being critical of itself and passing judgment on itself? Can it be self-critical and assess its own reflections and decisions? Does it understand the difference between right and wrong as humans do intuitively? Can it even have intuition, the kind a mother has about her child? We do not fully understand these human faculties ourselves; how, then, can we endow computers with them?

Look at it this way: AGI is essentially a form of computation, which at its most fundamental level is defined by zeros and ones—binary digits that correspond to mechanical switches or transistors turning on and off. But is the one aware of the zero, and vice versa? Does the switch know it is being turned on and off? If something is bereft of a quality, how can it give rise to that quality? Ones and zeroes lack awareness; how, then, can we expect these unconscious entities to somehow give rise to consciousness?
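To make the point concrete, here is a minimal sketch in Python; the function names and the half-adder construction are illustrative choices of mine, not anything specific to AGI. All digital computation bottoms out in gate operations like these, and nothing in the process is aware of anything.

```python
# A minimal illustrative sketch (hypothetical, not from any AGI system): every
# digital computation reduces to gate operations like these. Each function
# merely maps input bits to output bits; no step of the process is "aware".

def nand(a: int, b: int) -> int:
    """The universal gate: every other logic gate can be built from NAND."""
    return 0 if (a and b) else 1

def xor(a: int, b: int) -> int:
    # Exclusive-or composed purely from NAND gates.
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits, returning (sum, carry) -- the seed of all arithmetic."""
    carry = nand(nand(a, b), 1)  # NOT(NAND(a, b)) is simply a AND b
    return xor(a, b), carry

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> sum={s}, carry={c}")
```

Stack enough of these switches together and you get arithmetic, then a processor, then a neural network; at no layer does awareness enter the description.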

There is also the problem of syntax and semantics. Syntax is the set of rules and principles that governs the structure of sentences in a language, while semantics concerns what those structures mean. Computers only understand syntax; they have no way of comprehending semantics. The phrase 'I love you' can be written in different symbols in different languages; to a computer, each version is just another string of symbols. However, saying 'I love you' to your son is not the same as saying 'I love you' to your wife. Loving your wife has different depths and dimensions than loving your son. Only a conscious being is capable of grasping these differences of meaning. And what about sensations, feelings, and experiences? When we taste honey, we know it is sweet. The information is simply 'sweet', yet the degree of sweetness varies from one person to another. I might need three sugar cubes for my tea, while you only need one. Tasting honey before tasting salt can diminish the perceived saltiness to some extent. Our tongues also play a role in sensation: perhaps my taste buds are not as sensitive as yours, so the degree to which we experience the taste of the same chocolate may differ. Sugarcane does not have the same sweetness as a strawberry, yet both are sweet. A person living in a desert does not experience 40 degrees Celsius the same way as a person living in Antarctica.
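The syntax/semantics gap can be made concrete with another small sketch, assuming ordinary Python and an arbitrary choice of languages on my part; to the machine, each phrase below is nothing but a sequence of code points and bytes.

```python
# A small illustrative sketch: the same sentiment in three languages is, to
# the machine, just three different byte sequences. The comparison at the end
# operates on symbols alone, never on meaning.

phrases = {
    "English": "I love you",
    "French": "Je t'aime",
    "Japanese": "愛してる",
}

for language, phrase in phrases.items():
    encoded = phrase.encode("utf-8")          # the phrase as raw bytes
    print(f"{language:>8}: {phrase!r} -> {len(encoded)} bytes: {encoded.hex(' ')}")

# Equality here is purely symbolic: the same bytes compare equal whether the
# words are addressed to a spouse, a child, or nobody at all.
print("I love you" == "I love you")           # True, yet nothing "feels" love
```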

And what about perspective? The Greek amphitheaters were semi-circular. A person might sit in different places on different days for the same play—one day in the front row in the middle, another day far away on the right, and another day on the left. The characters in the play had designated positions in the amphitheater, and these positions never changed. Will the person sitting in a different place each day have the same perspective on the play and its characters, even though it is the same play? Before the advent of the amphitheater, Greek sculptures appeared mundane. After experiencing different perspectives of the same character in the amphitheater, however, the Greeks began to create more lifelike sculptures. The art changed fundamentally because of this experience of varying perspectives. Will AGI ever be able to understand these differences in perspective? Our life experiences, emotions, education, religion, and self-awareness all influence our perspective. A philosopher or a mystic may be content with the little wealth he has, while the same amount might feel unbearably meager to a greedy person. When our loved ones die, we experience grief, but to a computer it is all just information; no emotion is ever evoked. The grief of losing one’s child is not the same as the grief of losing a friend, although grief is experienced in both cases.

Intelligence and awareness are not the same. A monkey cannot play chess, but it is aware of the snake that lurks behind the branches. Computers, on the other hand, can play chess and solve complex mathematical problems, but they do not know the fear of the snake. According to Sir Roger Penrose—a renowned British mathematician, physicist, and philosopher of science—consciousness has nothing to do with computation. The cerebrum in our brain contains only about a quarter of the neurons found in the cerebellum. If the brain were simply a biological computer and neuron count equated to computing power, the cerebellum would clearly be the more powerful of the two. Yet it is the cerebrum that is chiefly responsible for our conscious experiences, reasoning, and sensory perception.
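For readers curious about what "playing a game" amounts to inside a machine, here is a minimal sketch, using the simple game of Nim rather than chess for brevity; the names and the game are my illustrative choices. The exhaustive search below plays perfectly, yet it is nothing more than recursion over positions, with no fear and nothing at stake.

```python
# Illustrative only: perfect play as pure search. In single-pile Nim a player
# may take 1-3 stones per turn, and whoever takes the last stone wins.

from functools import lru_cache

@lru_cache(maxsize=None)
def best_move(stones: int) -> int:
    """Return a winning number of stones to take (1-3), or 0 if no move wins."""
    for take in (1, 2, 3):
        if take == stones:
            return take                      # taking the rest wins outright
        if take < stones and best_move(stones - take) == 0:
            return take                      # leave the opponent a losing position
    return 0                                 # every move loses against perfect play

if __name__ == "__main__":
    for pile in range(1, 11):
        move = best_move(pile)
        print(f"pile={pile:2d}: " + (f"take {move}" if move else "no winning move"))
```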

Consciousness has nothing to do with material reality, physical objects, or abstract entities like numbers. It does not exist within space-time reality as we understand it. Some scientists suggest that consciousness is a spectrum, varying from one entity to another. For example, research suggests that trees communicate with one another and may possess some form of awareness. While they may not have the faculty of reason as humans do, they may experience some form of consciousness. It appears that consciousness is fundamental and gives rise to matter, rather than matter giving rise to it.

The claim that the symbiotic relationship between humans and AGI will give rise to a different species, that man is merely the caterpillar for a technological butterfly, is deeply flawed. It is fear-mongering at its best and has no philosophical basis. As long as the mystery of consciousness remains unsolved, and it is highly unlikely to ever be solved, there is no way computers will have the same conscious experience as humans, let alone replace them entirely.
