The Chinese Room Argument vs. the Brain Simulator Reply

John Searle, in “Minds, Brains, and Programs,” defends the Chinese Room Argument against the concept of strong Artificial Intelligence. Strong AI claims that an appropriately programmed digital computer with the right inputs and outputs, one that satisfies the Turing test, would necessarily have a mind. Searle argues that this cannot be true because syntax is not sufficient for semantics. Since a programmed machine enacts simulations rather than duplications, it may function without understanding. He uses the metaphor of the Chinese Room to illustrate his point: suppose a man who knows no Chinese is put in a room and given Chinese symbols along with a set of rules that enable him to respond with the correct Chinese characters. Although it may appear that the man knows Chinese, he actually has no understanding of the language.

The most convincing counter-argument to Searle’s defense of the Chinese Room is the Brain Simulator Reply. This argument poses the possibility that a machine could correctly simulate the exact brain functioning of a native Chinese speaker’s mind; that is, the machine could simulate the actual neuron activity that would be happening in the brain. This reply seems most convincing because, in this case, the machine and the brain would theoretically be operating in exactly the same manner. It seems natural to assume that if something runs the same way, the same results (understanding) will occur.

However, Searle’s argument still succeeds despite this objection. Although the machine would be operating in the same manner a brain does in terms of its neuron structure, the objection does not address the point that simulation is not the same as duplication, and there is no reason to believe it would produce exactly the same results. This simulation seems to be nothing more than a different program of formal symbol manipulations, which by themselves do not have any intentionality. As Searle argues, computation is not by itself constitutive of thinking, and this attempted counterexample is not much more than an alternative form of computation.

One Response to “The Chinese Room Argument vs. the Brain Simulator Reply”

  1. I don’t think Searle’s argument stands up to the Brain Simulator reply at all. In fact, I think it crumbles in the face of this reply unless he holds that a mind can exist independently of any physical existence, and is NOT necessarily an emergence resulting directly from something physical (which, to me, lies in the realm of speculation). Assuming that the brain is necessary for the “mind,” an interesting question is raised.

    At what point does a simulation become ‘the real thing’?

    To quote the Stanford Encyclopedia of Philosophy: “Are artificial hearts simulations of hearts? Or are they functional duplicates of hearts, hearts made from different materials? Walking is a biological phenomenon performed using limbs. Do those with artificial limbs walk? Or do they simulate walking?”
