Minds and Computers : An Introduction to the Philosophy of Artificial Intelligence




particular version of computationalism according to which the mind
is analogous to a computer operating system, governing the oper-
ations of applications and controlling the hardware in which it is
instantiated (the central nervous system). We have also witnessed the
failure of several first-blush objections to computationalism.
The final thing to do in this chapter before we move on to examine
various artificial intelligence applications is to spend some time
reflecting on the conditions under which we are prepared to attribute
mentality.
10.6 ATTRIBUTING MENTALITY
How do we determine whether or not something has a mental life?
When we are not in the grip of philosophical scepticism, most of us
are quite convinced that those around us have mental lives – at the
very least we habitually act as if they do. Given that we have privi-
leged access to only our own mental states, what is it about other
people that leads us to believe that they have minds?
Well, for starters other people are physically very similar to our-
selves and appear to be the same kind of being. As such, we expect
them to share various properties with us, such as being capable
of similar locomotion and interaction with the environment. This
extends to an expectation that other people also experience an
inner mental life. Clearly, however, such physical similarity is not
sufficient for an attribution of mentality such as we experience, as
these conditions might be met by non-human animals and
very young children who, we judge, lack the kind of rich mental life
we enjoy.
Nor indeed is such similarity with respect to physical capacities
necessary for attributions of mentality, as we can imagine someone
who clearly has a mind, in the strong sense in which we hold ourselves
to have minds, yet lacks these physical capacities.
So what conditions do other people meet such that we attribute
mentality to them? The key here – unsurprisingly given the professed
focus of the coming chapters – lies in our use of language.
The competent and sophisticated use of language is a hallmark of
the mental. By ‘competent’ and ‘sophisticated’ here I mean that the
speaker is able to comprehend and produce novel utterances, use
various linguistic devices to achieve various effects, and discourse on
various topics involving various degrees of abstraction.
It is by virtue of evidencing this kind of linguistic capacity – which
I might point out implicates a rational capacity – that we are prepared
to attribute mentality to those things which behave in this way. Even
if this capacity is realised in highly non-standard ways we seem to be
prepared to take this as sufficient for an attribution of mentality. No
one is going to deny, for instance, that Stephen Hawking has a mind,
and a sterling one at that.
We need to be careful here – I have made a sufficiency claim but not
a necessity claim. There are certain cases of aphasia (language deficit)
such that sufferers will fail any standard test of linguistic capacity yet
are still quite capable in other respects to the extent that we would cer-
tainly attribute to them a robust mentality. We will revisit this point
when we discuss human language in Chapter 16.
This sufficiency of the linguistic capacity for mentality provides us
with a rough and ready test for the presence of a mind. If the test
subject can satisfy certain conditions we stipulate, which are designed
to probe the subject’s capacity to use language, then we might think
that should be sufficient for assuming, at least as a working hypothe-
sis, that the subject indeed has a mind.
In real life, this assessment is an ongoing enterprise. We tend to
assume by default that others have minds and we revise our estima-
tion of their mental capacities in accordance with their evidenced lin-
guistic capacity. A little reflection on our social interactions serves to
demonstrate this so I shan’t argue any further for it. Certainly it is the
case that it is through language that we are able to investigate the
minds of others (here I intend ‘investigate’ in lay terms). The simplest
way to find out about the beliefs, desires and so forth of others is to
talk to them (although this is of course fallible).
It is precisely this intuition that Alan Turing seized upon in his
seminal article Computing Machinery and Intelligence, wherein
Turing determines to investigate the question of the conditions under
which we would be prepared to attribute mentality to an artefact.
Turing posited a test, which is standardly known as the Turing test,
the successful passing of which he proposed as sufficient for holding
the subject to have a mind.
The Turing test is often glossed in a weaker form than Turing
intended, so let’s employ some care in setting it up. The test is an
extension of an old parlour game known as the imitation game. In the
imitation game, a man and a woman are placed in separate rooms
with an interrogator in a third. The interrogator employs an interme-
diary and, by passing notes (which the intermediary reads out to the
interrogator), interrogates the man and the woman for a specified
length of time. The interrogator is allowed to question the man and
the woman on any topic she sees fit. The point of the game is for the
interrogator to try to determine which room contains the man and
which the woman, and for the man and the woman to attempt to fool
the interrogator.
Turing proposed that an adaptation of this game could serve as a
barometer of mentality for artefacts (computers). In all fairness to
computing devices, we should not expect them to satisfy conditions
of physical similarity, or to be able to perform certain physical tasks
in order to be said to have a mind. After all we have seen that this is
neither necessary nor sufficient. We should expect, however, that if an
artefact can satisfy us of its linguistic competence, then this would be
a very good indication that the artefact has a mind.
The Turing test adapts the imitation game – in ways that are prob-
ably already obvious – to provide a fair means of assessing the linguis-
tic capacity of an artefact. It is conducted as follows. We put a
computer in one room, a human in another, and a human interrogator
in a third. The interrogator is able to communicate with the computer
and with the human via a keyboard and monitor. The interrogator has
an allotted amount of time to question each participant on any topic
she sees fit, during which she attempts to discern which of her inter-
locutors is the human, while both the human and the computer attempt
to convince her that they are the human. If, at the end of the allotted
time, the interrogator is unable to discern the machine from the human,
we should say of the machine that it has a mind.
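The protocol just described is mechanical enough to sketch as a short program. The sketch below is merely illustrative: the `interrogator`, `human` and `machine` objects, their `ask` and `guess` methods, and the question budget are all hypothetical stand-ins for the participants Turing describes, not part of any real system.

```python
import random


def turing_test(interrogator, human, machine, num_questions=10):
    """Run a text-only Turing test.

    The interrogator questions two unlabelled interlocutors, 'A' and 'B',
    and must finally guess which label hides the machine.
    """
    # Randomly assign the hidden identities to the labels A and B,
    # so the interrogator cannot rely on ordering.
    labels = {'A': human, 'B': machine}
    if random.random() < 0.5:
        labels = {'A': machine, 'B': human}

    transcripts = {'A': [], 'B': []}
    for _ in range(num_questions):
        for label, respondent in labels.items():
            # The interrogator may question each interlocutor on any
            # topic she sees fit, informed by the conversation so far.
            question = interrogator.ask(label, transcripts[label])
            answer = respondent(question)
            transcripts[label].append((question, answer))

    # At the end of the allotted questioning, the interrogator names
    # the label she believes hides the machine.
    guess = interrogator.guess(transcripts)
    # The machine passes the test just in case the interrogator
    # cannot correctly discern it from the human.
    return labels[guess] is not machine
```

The point of the sketch is structural: everything the interrogator ever sees is the text in `transcripts`, so physical appearance and physical capacities are excluded from the assessment by construction, exactly as the discussion above requires.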
