Minds and Computers : An Introduction to the Philosophy of Artificial Intelligence


appearance of understanding is evidenced by the system – sufficiently
well to convince a human – there is something crucial lacking in the
operations of the system. They don't, in and of themselves, mean
anything.
In other words, the syntactic operations of the Chinese room,
although they pass a Turing test, lack semantics.
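The point can be made vivid with a toy sketch (my own illustration, not from the text): a purely syntactic system maps input symbol strings to output symbol strings by rule, and the rules make no reference whatsoever to what the symbols mean. The particular strings and replies below are invented for the example.

```python
# A toy "Chinese room" rule book: purely syntactic string-to-string rules.
# The operator applying these rules need not know that the keys and values
# are greetings and answers - the mapping works on uninterpreted shapes.
RULES = {
    "你好吗": "我很好",    # "how are you?" -> "I'm fine" (meaning unknown to the operator)
    "你是谁": "我是人",    # "who are you?" -> "I am a person"
}

def room(symbols: str) -> str:
    """Return whatever output string the rule book dictates for the input."""
    # Unmatched input gets a stock fallback reply ("please say that again").
    return RULES.get(symbols, "请再说一遍")

print(room("你好吗"))
```

Nothing in the lookup connects the symbols to the world; that is precisely the sense in which the operations have syntax but no semantics.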
17.2 SYNTAX AND SEMANTICS
On the strength of the Chinese Room thought experiment, we might
be tempted to mount this argument against computationalism:
P1 Having semantics is a necessary condition for having a mind.
P2 The syntactic operations of formal systems are not sufficient
for having semantics.
_____________________________________________________
∴ The operations of formal systems are not sufficient for having
a mind.
_____________________________________________________
∴ Computationalism is false.
Premise 1 is not in dispute – it is clear that mental states have semantic
content. Premise 2, however, is certainly arguable.
There are two ways we might interpret the second premise, and,
consequently, the interim conclusion that follows from it. The weaker
interpretation is the claim that is licensed by the thought experiment;
however, a much stronger interpretation is appealed to in deriving the
claimed falsity of computationalism.
The weak interpretation of the second premise is that there is a
formal system such that its operations are not sufficient for having
semantics. The stronger interpretation is that there is no formal
system whose operations are sufficient for having semantics.
The Chinese room thought experiment does not show that there
can be no formal system whose operations are sufficient for generating
semantics. Consequently, the argument above fails to show the
falsity of computationalism. What the thought experiment does
show, however, is something a little stronger than the weak interpretation
of the second premise I've given above.
The weak interpretation of premise 2 is essentially trivial. It’s a
given that there are plenty of formal systems whose operations don’t
meet conditions for having a mind. What is interesting about the
Chinese room thought experiment, however, is that it shows that
there is a formal system whose operations alone are sufficient for
passing a Turing test, yet, intuitively, the system lacks understanding
entirely.
We might, then, interpret the Chinese room thought experiment as
an indictment on the efficacy of the Turing test. After all, if something
can pass the test despite a complete lack of understanding, it
doesn’t seem the test is at all a reliable indicator of the presence of a
mind. Before we draw this conclusion, however, we should reflect on
the system described in the thought experiment in light of what we
know about formal systems and natural language processing.
The thought experiment describes a system which, while logically
possible, is not physically possible. To implement this system, we
would need to draw up the generation tree for all possible Chinese
conversations that can be had in the course of several hours.
Given that a generation tree for possible conversation states would
be considerably more complex than the generation tree for possible
chess states, it should be clear that constructing a complete generation
tree for even the first twenty possible conversational exchanges is
simply not computationally tenable. It is safe to say that, regardless
of future advances in practical computational power, no computer
will ever be able to pass a Turing test by following the method which
the Chinese room implements.
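A back-of-envelope calculation illustrates the intractability. The branching factors below are assumptions for the sake of the sketch: roughly 35 legal moves per chess position is a commonly cited average, and the conversational figure is a deliberately conservative invented value for the number of distinct sensible replies per turn.

```python
# Why a complete generation tree for conversation is not physically
# constructible: the leaf count grows exponentially in the number of
# exchanges. Branching factors here are illustrative assumptions.

CHESS_BRANCHING = 35      # oft-cited average legal moves per chess position
REPLY_BRANCHING = 10_000  # assumed sensible replies per conversational turn
EXCHANGES = 20            # the "first twenty" exchanges mentioned above

def tree_size(branching: int, depth: int) -> int:
    """Leaf count of a complete generation tree with uniform branching."""
    return branching ** depth

chess_leaves = tree_size(CHESS_BRANCHING, EXCHANGES)
talk_leaves = tree_size(REPLY_BRANCHING, EXCHANGES)

# Report the order of magnitude of each tree.
print(f"chess, 20 plies:        ~10^{len(str(chess_leaves)) - 1} positions")
print(f"conversation, 20 turns: ~10^{len(str(talk_leaves)) - 1} replies")
```

Even under these conservative assumptions the conversational tree has on the order of 10^80 leaves, a figure comparable to estimates of the number of atoms in the observable universe, so no physically possible store could hold the rule books the room requires.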
Ordinarily, arguing against the physical possibility of a thought
experimental situation obtaining does no philosophical work since,
generally, we are using thought experiments to test claims of logical
relations.
For instance, physicalism holds that a complete physical description
is sufficient as a complete description of the mind. This sufficiency is a
logical claim. Consequently, while it is physically impossible that there
could be a scientist such as Mary, the thought experiment described in
Chapter 5 still speaks against this sufficiency – to the extent that your
intuitions are primed by the thought experiment – since the situation
described is logically possible.
If the claim concerning the Turing test were similarly a logical claim,
then the Chinese room thought experiment would indeed speak
against it. Recall, however, that the claim is not that passing a Turing
test is sufficient for having a mind, but rather that, were something to
pass a Turing test, we should be prepared to attribute mentality to it.
The Turing test is an empirical test. Consequently, when considering
possible counter-examples to the efficacy of the test as a reliable
indicator of the presence of a mind, we should restrict our consideration
to empirically possible systems.
So the Chinese room thought experiment fails to straightforwardly
show the falsity of computationalism and offers no indictment on the
efficacy of the Turing test. It does, however, still show something
rather important.
The thought experiment does, I think, show that no amount of
syntactic operation in isolation from the external world is sufficient
for generating semantics. I could be in the Chinese room performing
this procedure for years – given enough books – and it seems, intuitively,
that there is no way to begin to understand the meaning of the
symbols I am processing. The reason is that my operations lack an
appropriate connection to the external world.
To understand Chinese just is to understand how elements of the
language – written or spoken – relate to things outside the system of
language. Languages are systems which encode and communicate
meanings. These meanings, however, are not generated by mecha-
nisms and inputs entirely internal to the linguistic facility. Certainly
linguistic mechanisms are implicated in the conferral of meaning on
linguistic entities but necessarily implicated is an appropriate connec-
tion to the external world.
The lesson to draw from the Chinese room thought experiment is
that embodied experience is necessary for the development of
semantics. In order for our mental states to have meaning, we must
have antecedent experience in the world, mediated by our sensory
apparatus. In other words, semantics do not develop in isolation but,
rather, this development is conditional on experience in relation to the
empirical world.

This necessity of embodied experience for the development of
semantics does not, in and of itself, speak against computationalism.
It merely shapes the explanatory burden on the computationalist,
requiring them to provide a computational account of the meaning-
conferring mechanisms. This will involve, inter alia, an account of
the computational conversion of sense data to various kinds of
