Consciousness Mindf*ck
I have arrived at a nice paradox that I feel is worth sharing. If you have seen it discussed somewhere else, let me know where I can read more about it.
Let me start with the assumption that consciousness is a product* of some computation, be it a biological neural network, the GPU operations of an AGI, etc.
So there exists a universal Turing Machine that can perform that computation**.
The universal TM has some 20 internal states, a read-write head, an infinite tape, and, in essence, nothing more. Still, by the aforementioned assumption, it should be conscious.
Note that the TM works in discrete steps, that only a simple and small part of it is active at any given moment (let's call that part the "engine" for convenience), and that the content of at most a single tape cell matters at once.
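To make the "engine" point concrete, here is a minimal sketch (not any particular universal TM, and the two-rule unary-incrementer program is purely hypothetical): note that each step of the loop reads and writes only the current state and the single cell under the head, no matter how much content sits elsewhere on the tape.

```python
from collections import defaultdict

# Transition table: (state, symbol) -> (new_state, symbol_to_write, head_move).
# This tiny program appends one "1" to a unary number, then halts.
RULES = {
    ("scan", "1"): ("scan", "1", +1),   # skip over existing 1s
    ("scan", "_"): ("halt", "1", 0),    # write one more 1 on the first blank
}

def run(tape_str, state="scan", head=0):
    # The tape is unbounded in both directions; "_" is the blank symbol.
    tape = defaultdict(lambda: "_", enumerate(tape_str))
    while state != "halt":
        # The "engine": only `state` and the one cell under `head` matter here.
        state, tape[head], move = RULES[(state, tape[head])]
        head += move
    # Read back the non-blank portion of the tape.
    cells = [tape[i] for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip("_")

print(run("111"))  # → 1111
```

The rest of the tape is inert data between steps, which is exactly what makes the two possibilities below feel paradoxical.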
This gives us two quite obvious possibilities:
- The consciousness of the TM is non-local, as it somehow depends on the static but important content of the remote parts of the tape, or
- The engine is conscious.
[*] I'm deliberately skipping the analysis of what "product" means - unless you think this is important here.
[**] In theory, we might even replace brain neurons one by one with their mechanical counterparts, and finally reach the TM at the technical level. In the case of AI this is even simpler and can be done straightforwardly. But I think it does not matter here.
EDIT: some extra notes
- I do not want to make any dualist claims here. Rather, I am analyzing the consequences of functionalism and trying to push them to their limits.
- A physical TM is not a pure state-transition machine, so this is not about "how can a state transition machine be conscious". It has the "engine", with a clock and a state-change mechanism implemented somehow in the physical substrate. I suppose we cannot say much about the properties of a purely abstract TM, for the following reason:
- We start with a regular, adult human brain. By convention I assume it is conscious.
- We do what we did to create the "counting stick machine": gradually replace neurons with, e.g., optoelectronic counterparts one by one, then replace those with, say, a GPU, and so on.
- Two obvious things can happen. Either:
- At some conversion stage the thing becomes unconscious (but when? and why?). BTW, since it would have the same apparent functionality, we would have created a philosophical zombie.
- The physical TM is conscious.
- The rest of the analysis goes as before, so let's look for the minimal conscious TM, tamper with the remote parts of the tape, etc. etc.