Saturday, June 06, 2020

An engineering look at p-zombies

"A p-zombie would be indistinguishable from a normal human being but lack conscious experience, qualia, or sentience. For example, if a philosophical zombie were poked with a sharp object it would not inwardly feel any pain, yet it would outwardly behave exactly as if it did feel pain. The thought experiment sometimes takes the form of imagining a zombie world, indistinguishable from our world, but lacking first person experiences in any of the beings of that world."
This is the kind of thing philosophers get up to, but why would anyone think this critique of materialism (or physicalism) could make sense? Here is the modal argument, due to David Chalmers:
"According to Chalmers one can coherently conceive of an entire zombie world, a world physically indistinguishable from this world but entirely lacking conscious experience. The counterpart of every conscious being in our world would be a p-zombie. Since such a world is conceivable, Chalmers claims, it is metaphysically possible, which is all the argument requires. Chalmers states: "Zombies are probably not naturally possible: they probably cannot exist in our world, with its laws of nature." The outline structure of Chalmers' version of the zombie argument is as follows;

1. According to physicalism, all that exists in our world (including consciousness) is physical.

2. Thus, if physicalism is true, a metaphysically possible world in which all physical facts are the same as those of the actual world must contain everything that exists in our actual world. In particular, conscious experience must exist in such a possible world.

3. In fact we can conceive of a world physically indistinguishable from our world but in which there is no consciousness (a zombie world). From this (so Chalmers argues) it follows that such a world is metaphysically possible.

4. Therefore, physicalism is false. (The conclusion follows from 2. and 3. by modus tollens.)"
This sounds way too much like the Ontological Argument.

Here's an argument. Consciousness is an evolved faculty, like language. It requires brain resources, so it must be doing a job, mediating what it is that humans do. If you subtract consciousness while leaving a human active in the world, you will surely get impaired behaviour - as when consciousness is removed by sleep, anaesthesia or injury.

We could model a p-zombie as a Rodney Brooks subsumption architecture: a low-level runtime system (like the hindbrain) controlling basic bodily functions - heart rate, breathing, reflexes, digestion - plus a top-level planner or theorem-prover managing the agent's progress through the world to meet its survival and social goals. This is a very conventional architecture in AI, one which no one has plausibly claimed to be conscious.
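To make that concrete, here is a minimal Python sketch of such a two-layer agent. Everything in it - the class names, the percept vocabulary, the goal list - is invented for illustration; it is a toy in the spirit of the architecture, not anything Brooks actually built.

# A toy two-layer agent: reflexive runtime plus deliberative planner.
# All names and signals here are hypothetical.

class LowLevelRuntime:
    """Hindbrain analogue: fast, stimulus-driven control loops."""
    def __init__(self):
        self.heart_rate = 70          # beats per minute
        self.breathing_rate = 14      # breaths per minute

    def tick(self, percepts):
        if "threat" in percepts:
            self.heart_rate = min(180, self.heart_rate + 30)  # arousal
        if "sharp_pain_signal" in percepts:
            return "withdraw_limb"    # reflex: no deliberation involved
        return None

class TopLevelPlanner:
    """Deliberative layer: pursues survival and social goals."""
    def __init__(self, goals):
        self.goals = goals            # e.g. ["find_food", "maintain_allies"]

    def plan(self, world_model):
        # Stand-in for search or theorem-proving over a world model.
        for goal in self.goals:
            if not world_model.get(goal + "_satisfied", False):
                return "pursue:" + goal
        return "idle"

class PZombie:
    """Composite agent: the reflex layer pre-empts the planner."""
    def __init__(self):
        self.runtime = LowLevelRuntime()
        self.planner = TopLevelPlanner(["find_food", "maintain_allies"])

    def act(self, percepts, world_model):
        reflex = self.runtime.tick(percepts)
        if reflex is not None:
            return reflex
        return self.planner.plan(world_model)

Nothing in this loop has, or needs, an inner life: percepts go in, actions come out.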

This p-zombie would certainly do the basics. It could answer questions, claim that it 'felt' pain if it were damaged, and in general execute any social behaviour which we could classify (in the deep learning sense) or theorise (in the GOFAI sense).
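And the 'felt pain' claim can be equally hollow. As a hypothetical illustration (the signal names and canned utterances are invented), the verbal pain report could be nothing more than a lookup: classify the damage signal, retrieve the socially appropriate output.

# Pain reports without pain: a pure stimulus-to-utterance mapping.
# Signal names and utterances are hypothetical.

PAIN_REPORTS = {
    "sharp_pain_signal": "Ouch! That really hurt.",
    "burn_signal": "That's burning - stop!",
}

def verbal_response(percepts):
    # No qualia anywhere: classification followed by retrieval.
    for signal, utterance in PAIN_REPORTS.items():
        if signal in percepts:
            return utterance
    return None

print(verbal_response({"sharp_pain_signal"}))   # -> Ouch! That really hurt.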

So what's missing? What's necessarily missing?

Maybe that's the core mystery, the hard problem in a nutshell.

---

Here's a final thought. In a Darwinian world of predators and prey, humans and other animals are notoriously prone to see agency everywhere - what Dennett analyses in terms of second-order intentional systems:
"A first-order intentional system is one whose behavior is predictable by attributing (simple) beliefs and desires to it. A second-order intentional system is predictable only if it is [itself] attributed beliefs about beliefs, or beliefs about desires, or desires about beliefs, and so forth." -- Daniel Dennett, Intentional Systems Theory. 1971.
Aside from some experimental systems utilising clunky modal logic (a blind alley in my view), we don't build AI systems today which understand that their environments are populated by entities which have agency.

In fact I don't think we know how to do that.

And without such a requirement, perhaps it's no surprise that our AI systems lack the capability to model themselves as having agency. And so to a lack of self-consciousness - and p-zombiehood.
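If we did know how, the shape of the requirement might look something like this hypothetical sketch: the agent's world model contains an entry for itself, typed exactly like its entries for other agents. Nothing here solves the problem; it only shows where the self-model would have to sit.

# A world model in which the modelling agent appears as one of the
# modelled agents. Entirely hypothetical.

class IntentionalModel:
    """Any agent, self included, as a bundle of beliefs and desires."""
    def __init__(self, beliefs=None, desires=None):
        self.beliefs = beliefs or {}
        self.desires = desires or {}

class ReflectiveAgent:
    def __init__(self):
        self.world_model = {
            "alice": IntentionalModel(desires={"win_game": True}),
        }
        # The crucial move: the agent models itself with the same
        # machinery it uses for others.
        self.world_model["self"] = IntentionalModel(
            beliefs={"alice_desires_win": True},
            desires={"win_game": True},
        )

    def introspect(self):
        me = self.world_model["self"]
        return f"I want {list(me.desires)} and believe {list(me.beliefs)}"

print(ReflectiveAgent().introspect())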

If autistic people lack a theory of mind (which I don't believe they do, at least in that straightforwardly simplistic form) then we could say that the p-zombie would present as autistic, in the guise of a smoothly high-functioning case of Asperger's syndrome.
