
Searle and His Dilapidated Chinese Room

It is the objective of this essay to demonstrate that Searle's Chinese Room argument is fallacious on the grounds that it commits the fallacy of composition. Because it is fallacious on this account, the argument fails to adequately discount the Turing Test as an indicator of artificial intelligence. We shall substantiate our claim as follows: 1) discussing the Turing Test and its role in Searle's argument; 2) addressing the Chinese Room argument directly; 3) demonstrating that Searle's move from the homunculus to the wider system is unjustified and fallacious; and finally 4) expounding on this view in a quasi-dialogical form, incorporating Searle's retorts as we proceed through our claim. It shall become quite apparent that his responses to the Systems Reply introduce no new evidence for his case.

In the mid-twentieth century, Alan Turing's insights fueled the classic debate over whether machines could think. He introduced a scientific test for determining the success or failure of a thinking machine, or, more specifically, a thinking computer. The notion is fairly straightforward: if a computer can perform in such a way that an expert interrogator cannot distinguish it from a human, then the computer can be said to think. Since then, designing a system that passes this test has been an aim of artificial intelligence research.

Thirty years after Turing first formulated this test, John Searle put forth a thought experiment involving a system that, he claimed, would pass the Turing Test. He claimed further, however, that any observer would clearly see that the system could not think. We shall now attempt to expose the subtle logical fallacy in Searle's thought experiment, the Chinese Room argument.

The Chinese Room argument has prima facie merit and strength. Searle proposes the following situation. Place an English-speaking man ignorant of Chinese in a room with only a rulebook, written in English, and an input/output slot for communicating with the surrounding world. Now imagine that this man is asked questions written in Chinese and passed through the slot. The man is to follow the instructions in the book and then output a response for the Chinese interrogators. We assume that the rulebook has codified all the rules needed to converse fluently in Chinese by mere symbol manipulation. The man follows the rules perfectly and supplies flawless Chinese answers to the questions, yet the exchange of "squiggles and squoggles" means nothing to him. The interrogators outside the room, however, believe that the man inside the room comprehends Chinese. It should be noted that the man symbolizes the computer and the rulebook represents the computer program (a toy sketch of such rule-following appears after the premise list below). According to Searle, although the man in the room would pass the Turing Test, it cannot be said that he comprehends the dialogue in the way the interrogators believe. Searle thus concludes that "just manipulating the symbols is not by itself enough to guarantee cognition, perception, understanding, thinking, and so forth." He states his position with the following premises and conclusion:

1. Computer programs are formal (syntactic).

2. Human minds have mental contents (semantics).

3. Syntax by itself is neither constitutive of nor sufficient for minds.

4. Therefore, programs are neither constitutive of nor sufficient for minds.
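To make the purely syntactic character of the room vivid, here is a minimal sketch of our own (it is an illustration, not Searle's, and the names RULEBOOK and chinese_room and the sample phrases are hypothetical choices): the "rulebook" is nothing more than a lookup table pairing input symbol strings with output symbol strings, so the procedure can be followed flawlessly without any grasp of what the symbols mean.

# A toy illustration (our own, hypothetical): the rulebook as a bare
# lookup table. The operator matches symbol shapes, never meanings.

RULEBOOK = {
    "你好吗？": "我很好，谢谢。",    # "How are you?" -> "I am fine, thank you."
    "你会说中文吗？": "当然会。",    # "Can you speak Chinese?" -> "Of course."
}

def chinese_room(question: str) -> str:
    """Return whatever response the rulebook dictates for this symbol string."""
    # No translation, parsing, or comprehension occurs here; the reply is
    # produced by shape-matching alone ("squiggles and squoggles").
    return RULEBOOK.get(question, "请再说一遍。")  # "Please say that again."

print(chinese_room("你好吗？"))  # prints a fluent-looking Chinese reply

Whatever one makes of Searle's conclusion, the sketch shows only what premise 1 asserts: the program's operation is formal through and through.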

Our principal objection to Searle's argument concerns the inference he draws from the premise that the man does not comprehend the conversation (Searle refers to this objection as the Systems Reply). We hold that he cannot validly move from that premise to the claim that the system as a whole understands nothing. We concede that there is no way to demonstrate that the whole system understands. However, there is also no way to prove that the whole system does not understand, which is what Searle would have us believe. To elaborate, if a part fails to display quality 'x', we cannot logically infer that the greater system also fails to display quality 'x'; to do so would be to fall prey to the fallacy of composition. Consider an analogy with a pilot flying an airplane. A pilot cannot, by the laws of nature, fly on his own, but it does not follow that when he is in a plane the airplane is also unable to fly. In all probability, Searle would hesitate to contend that the plane cannot fly, yet he would have us accept that same faulty logic when it is applied to his Chinese Room argument. The pilot is, after all, the symbol manipulator of the cockpit controls, just as the man is the symbol manipulator of the room. Again, this example does not concern the room's ability to think; we hold that there is no logical way to support that claim. The point is that we cannot assume the room does not understand merely because the man inside the room does not understand.
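In schematic form (our own rendering, not drawn from Searle or Copeland), the inference the argument needs is an instance of the invalid pattern

\[ \mathrm{PartOf}(a, S) \;\wedge\; \neg U(a) \;\;\therefore\;\; \neg U(S) \]

where \(a\) is the man, \(S\) is the whole system of man, rulebook, and room, and \(U\) is the property of understanding Chinese. The premises rule out understanding in the part; they say nothing either way about the whole.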

When he first introduced his Chinese Room argument, Searle anticipated the Systems Reply and attempted to dismiss it with a counter-argument. We hold, however, that his counter-argument fails. Searle adjusts the scenario so as to internalize the system within the man, so that the man need not even be situated in a room. Instead of the rulebook serving as a separate entity, he suggests that we imagine the man following the rules entirely from memory. Ultimately, he holds that the man still lacks any comprehension of the Chinese exchange with the interrogators. Searle here attempts to evade the objection by making the system exist within the man, but he misses the point. It remains that the man is not the system, and thus his lack of comprehension cannot be projected onto the system. Searle's statement that "there is nothing in the 'system' that is not in me, and since I don't understand Chinese, neither does the system" is unsubstantiated. Jack Copeland observes in his Artificial Intelligence that "Searle himself makes no mention of why he believes [the projection onto the system to logically work]." For reasons similar to those stated above, Searle cannot dismiss the Systems Reply merely by asserting otherwise. In this adjusted scenario, the man has not become the system; rather, the man now encompasses the system instead of being encompassed by it, so the modification accomplishes little. Searle still cannot logically draw conclusions about the system based on the man.

Searle's basic argument and even his modified,

...
