Can we measure consciousness, determine its source, or define its function?
If consciousness existed on a spectrum, what would that look like?
How has consciousness evolved?
In considering artificial consciousness and how it might be achieved, we must ask, “What exactly do we mean by consciousness?” I touched on this question at the beginning of this series, but now let’s go deeper.
We can likely agree that most of us have only a tenuous grasp on what consciousness means. We might borrow from Descartes’s “I think, therefore I am” and proclaim that we are conscious insofar as we’re aware of our own existence, feelings, and desires. But it’s very difficult to be certain that anyone else’s conscious experience is as rich as ours. And it’s just as difficult to determine that they’re not simply emulating our behaviors without feeling things the way we do.
So how could we hope to test whether an artificial intelligence has achieved a level of consciousness if we can’t even measure consciousness in other humans or animals? For this, we need to define consciousness in terms of objective indicators we can look for in other beings—even artificial ones.
Measuring Consciousness
When we feel happy, we are energetic, we smile, and our eyes are said to sparkle. When we are sad, we frown, we tear up and cry, and we appear lethargic. The activity in our brains also changes depending on our state of consciousness.
Using brain scanners, we can look for different patterns of neural activity during different states of reported consciousness in humans. From these patterns we can generate a set of indicators that help us infer how others are feeling even if we can’t feel it ourselves. That is, we can objectively define a set of indicators for each state of consciousness in humans, and when we see others displaying similar indicators, we can infer what state of consciousness they are experiencing.
This process works for humans because we know how at least one human experiences consciousness (ourselves). But when it comes to generating indicators of consciousness for animals or AIs, the situation becomes more complicated.
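To make the indicator idea concrete, here is a deliberately toy sketch in Python. The indicator names, profiles, and matching rule are all invented for illustration; this is not a real neuroscientific classifier, just the shape of the inference: observe indicators, match them against profiles defined from known human experience, and infer a state.

```python
# Toy sketch of indicator-based inference of a conscious state.
# The indicators, profiles, and matching rule are invented for
# illustration; this is not a real neuroscientific model.

INDICATOR_PROFILES = {
    # Profiles we can only define because we know, from our own
    # experience, what these states feel like and look like.
    "happy": {"smiling": True, "tearful": False, "posture_energy": "high"},
    "sad":   {"smiling": False, "tearful": True, "posture_energy": "low"},
}

def infer_state(observations: dict) -> str:
    """Return the state whose indicator profile best matches what we observe."""
    def match_score(profile: dict) -> int:
        return sum(observations.get(key) == value for key, value in profile.items())
    return max(INDICATOR_PROFILES, key=lambda state: match_score(INDICATOR_PROFILES[state]))

# Someone who is smiling, not tearful, and energetic matches "happy" best.
print(infer_state({"smiling": True, "tearful": False, "posture_energy": "high"}))
```

The catch, as the rest of this piece argues, is that the profiles are calibrated on human experience, which is exactly what we lack for animals and AIs.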
Sources of Consciousness
Consciousness is likely a product of evolution. Neither other animals nor AIs have been subjected to the same evolutionary pressures as humans. Therefore, even if they can experience consciousness, it might be a different flavor of consciousness than we’re used to. And whatever indicators we use to define different states of consciousness in humans may not be applicable to animals or AIs.
Many neuroscientists believe that consciousness results from information processing in the brain. To try to bridge the gap between humans and other information-processing systems, and to make our lives easier, we can define consciousness as a “measure of information processing where a specific goal is maximized.”
All humans and animals are constantly processing information about the environment. And in some fundamental way, we are all trying to maximize our chances of survival. One subtle consequence of this definition is sure to irk some philosophers: it requires us to think of consciousness as a spectrum on which every system capable of processing information, whether human, animal, or AI, possesses some nonzero level of consciousness.
Functions of Consciousness
What does it mean for consciousness to exist as a spectrum?
We can imagine the low end of the spectrum populated by very simple systems that make straightforward calculations. Consider the earthworm. Is the earthworm conscious?
It can detect light and burrow deep into the earth to avoid it. It needs to find food to survive and a mate to reproduce. It has an objective function—that is, a mechanism that tracks how successful the organism is relative to the needs it must fulfill. In this case, the earthworm’s objective function is directly connected to its physiological survival needs. According to our definition, the earthworm achieves a nonzero level of consciousness.
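To make the notion of an objective function concrete, here is a deliberately toy sketch in Python. The needs, weights, and candidate behaviors are invented for illustration and are not a model of real earthworm biology; the point is only the mechanism: score how well a state satisfies the organism’s needs, and prefer the behavior that maximizes that score.

```python
# Toy illustration of an objective function for an earthworm-like agent.
# Purely hypothetical: the needs, weights, and scoring are made up for
# illustration, not a model of real earthworm biology.

def objective(state):
    """Score how well the agent's current state meets its survival needs.

    Higher is better. Each term rewards satisfying one need:
    staying out of the light, staying fed, and finding a mate.
    """
    return (
        1.0 * (1.0 - state["light_exposure"])  # avoid light (0 = dark, 1 = bright)
        + 1.0 * state["energy"]                # keep energy reserves up
        + 0.5 * state["mate_proximity"]        # be near a potential mate
    )

def choose_action(state, actions):
    """Pick the action whose predicted outcome maximizes the objective."""
    return max(actions, key=lambda act: objective(act(state)))

# Two candidate behaviors, each predicting the state that would result.
def burrow(state):
    return {**state, "light_exposure": 0.0, "energy": state["energy"] - 0.1}

def surface_to_feed(state):
    return {**state, "light_exposure": 0.8, "energy": min(1.0, state["energy"] + 0.3)}

state = {"light_exposure": 0.9, "energy": 0.2, "mate_proximity": 0.1}
best = choose_action(state, [burrow, surface_to_feed])
print(best.__name__)  # which behavior best serves the objective right now
```

Under the working definition above, even this trivial maximizer registers a nonzero level of consciousness; what changes along the spectrum is the sophistication of the information processing behind it.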
This might seem like a shocking proposition because we are used to thinking of consciousness as being complex and profoundly human. In fact, scientists and scholars long considered consciousness to be uniquely human, with other animals merely possessing instincts to survive. But the scholarly community has been changing its stance on consciousness in both animals and humans. The scientific literature now includes discussions about what level of consciousness different species might possess.
This bodes well for artificial consciousness. If humans were still widely accepted as the only conscious systems, there would be no point in discussing conscious AI. To discuss conscious AI at all, we first need to be open to the possibility that there are different levels and different kinds of consciousness.
Gray Matter Gray Area
Let’s consider humans and apes from an evolutionary standpoint to make sense of this idea of different levels of consciousness.
Putting a human and a chimp side by side, we see that they look quite different. But that’s because millions of years separate us from our common ancestor.
Now imagine if, instead of seeing evolution as a series of snapshots, we could see the entire film. Imagine if we could see the common ancestor of humans and chimps millions of years ago, and then every intervening generation to present-day humans and chimps. No one could pick an exact point where the species split that would satisfy all biologists, archaeologists, anthropologists, and experts in related fields.
Individual scholars and whole schools of thought would each point to some measurement or finding to justify a different point on the scale as the moment humans emerged. At best, we could define a range in the spectrum after which we all agree the species became human.
It’s like a color gradient that runs from white to black. You can discern white and black at each end, but there is no specific point where white becomes black—just shades of gray in between.
Genealogy of Thought
How would we see consciousness appear in that evolutionary film?
I am not sure we could find a specific generation where consciousness simply arose out of nothing. What we are likely to see is a spectrum: over the generations, different behaviors and levels of sophistication begin to emerge, and those same behaviors become simpler the farther back in time we roll the film.
We are the recipients of the aggregate of those qualities, passed down through the intervening generations. But each generation along the way also possessed some degree of sophistication and likely exhibited some level of conscious-like behavior. That’s why it’s useful to think of consciousness as a spectrum.
Similarly, if we had access to each intervening generation and wanted to measure degrees and types of consciousness over time, we would probably need to adjust our measuring tools—our indicators—as both the species and its consciousness changed through the generations.
Up Next
In this piece we defined consciousness as a measure of information processing that maximizes a specific goal. This definition recast the concept of consciousness as a spectrum of different levels of sophistication. The spectrum view of consciousness helps us consider how relatively simple systems like our current AI algorithms could evolve over time into more sophisticated systems, capable of a more recognizable kind of consciousness.
In the next and final piece in the series, we discuss the actuator of consciousness. That is, we examine the bit that makes human-level consciousness so much richer than an earthworm’s. This approach allows us to define a path for how sophisticated artificial intelligence could develop similar actuators and attain a deeper, richer level of consciousness.
Kenneth Wenger is the author of Is the Algorithm Plotting Against Us? He is senior director of research and innovation at CoreAVI and chief technology officer at Squint. His work focuses on the intersection of artificial intelligence and determinism, enabling neural networks to execute in safety-critical systems.