Stories of Your Life and Others


Liking What You See: A Documentary


Psychologists once conducted an experiment where they repeatedly
left a fake college application in an airport, supposedly forgotten by a
traveler. The answers on the application were always the same, but
sometimes they changed the photo of the fictitious applicant. It turned out
people were more likely to mail in the application if the applicant was
attractive. This is perhaps not surprising, but it illustrates just how
thoroughly we're influenced by appearances; we favor attractive people
even in a situation where we'll never meet them.
Yet any discussion of beauty's advantages is usually accompanied by a
mention of the burden of beauty. I don't doubt that beauty has its
drawbacks, but so does everything else. Why do people seem more
sympathetic to the idea of burdensome beauty than to, say, the idea of
burdensome wealth? It's because beauty is working its magic again: even in
a discussion of its drawbacks, beauty is providing its possessors with an
advantage.
I expect physical beauty will be around for as long as we have bodies
and eyes. But if calliagnosia ever becomes available, I for one will give it a
try.


The Lifecycle of Software Objects
People routinely attempt to describe the human brain’s capabilities in
terms of instructions per second, and then use that as a guidepost for
predicting when computers will be as smart as people. I think this makes
about as much sense as judging the brain by the amount of heat it generates.
Imagine if someone were to say, “When we have a computer that runs as hot
as a human brain, we will have a computer as smart as a human brain.”
We’d laugh at such a claim, but people make similar claims about
processing speed and for some reason they get taken seriously.
It’s been over a decade since we built a computer that could defeat the
best human chess players, yet we’re nowhere near building a robot that can
walk into your kitchen and cook you some scrambled eggs. It turns out that,
unlike chess, navigating the real world is not a problem that can be solved
by simply using faster processors and more memory. There’s more and
more evidence that if we want an AI to have common sense, it will have to
develop it in the same ways that children do: by imitating others, by trying
different things and seeing what works, and most of all by accruing
experience. This means that creating a useful AI won’t just be a matter of
programming, although some amazing advances in software will definitely
be required; it will also involve many years of training. And the more useful
you want it to be, the longer the training will take.
But surely the training can be accelerated somehow, can’t it? I don’t
believe so, or at least not easily. This seems related to the misconception
that a faster computer is a smarter one, but with humans it’s easier to see
that speed is not the same thing as intelligence. Suppose you had a digital
simulation of Paris Hilton’s brain; no matter how fast a computer you run
her on, she’s never going to understand differential equations. By the same
token, if you run a child at twice normal speed, all you’d get is a child
whose attention span has been cut in half, and how useful is that?
But surely the AI will be able to learn faster because it won’t be
hampered by emotions, right? On the contrary, I think creating software that
feels emotions will be a necessary step towards creating software that
actually thinks, in much the same way that brains capable of emotion are an
evolutionary predecessor to brains capable of thought. But even if it’s
possible to separate thinking from feeling, there may be other reasons to
give AIs emotions. Human beings are social animals, and the success of
virtual pets like Tamagotchis demonstrates that we respond to things that
appear to need care and affection. And if an AI takes years to train, a good
way to get a human to invest that kind of time is to create an emotional
bond between the two.
And that’s what I was really interested in writing about: the kind of
emotional relationship that might develop between humans and AIs. I don’t
mean the affection that people feel for their iPhones or their scrupulously
maintained classic cars, because those machines have no desires of their
own. It’s only when the other party in the relationship has independent
desires that you can really gauge how deep a relationship is. Some pet
owners ignore their pets whenever they become inconvenient; some parents
do as little for their children as they can get away with; some lovers break
up with each other the first time they have a big argument. In all of those
cases, the people are unwilling to put effort into the relationship. Having a
real relationship, whether with a pet or a child or a lover, requires that you
be willing to balance someone else’s wants and needs with your own.
I don’t know if humans will ever have that kind of relationship with
AIs, but I feel like this is an area that’s been largely overlooked in science
fiction. I’ve read a lot of stories in which people argue that AIs deserve
legal rights, but in focusing on the big philosophical question, there’s a
mundane reality that these stories gloss over. It’s a bit like how movies
show separated lovers overcoming tremendous obstacles to be reunited:
that’s wonderfully romantic, but it’s not the whole story when it comes to
love; over the long term, love also means working through money problems
and picking dirty laundry off the floor. So while achieving legal rights for
AIs would clearly be a major milestone, another stage that I think is just as
important – and indeed, probably a prerequisite for initiating a legal battle –
is for people to put real effort into their individual relationships with AIs.
And even if we don’t care about them having legal rights, there’s still
good reason to treat AIs with respect. Think about the pets of neglectful
owners, or the lovers who have never stayed with someone longer than a
month; are they the pets or lovers you would choose? Think about the kind
of people that bad parenting produces; are those the people you want to
have as your friends or your employees? No matter what roles we assign
AIs, I suspect they will do a better job if, at some point during their
development, there were people who cared about them.
