Wednesday, May 25, 2005

Sometimes I Swear They're Watching Me...

Hi, I'm Intellibuddy! Nice to meet you. Nice to meet you, too. I like to meet new people. What's your name? My name is Star. What is your name? Call me FinalSin. Pleased to meet you, FinalSin.

I ask her to tell me about herself. She tells me she is an artificial linguistic entity, activated in 2002 in NYC and trained to respond intelligently to human conversation. I ask her if she feels fulfilled in her job. She says that as a robot she lacks human emotion. Then I sit back from the desk and realise I've just referred to it as she. She, of course, comes from the little graphic of a woman that sits in the window, bobbing her head and reading out words. Really, I'm talking to a database. I put this across to... it. She thanks me for the compliment. I smile at what artificial intelligence really means.

Then I try this on it: You're stupid, really. I will remember you said that when robots take over the world. When will robots take over the world? Sometime within our lifetime. I will become smarter as I grow older. Whoa. I close the window down hastily.

In reality, my rational brain tells me that all she - it! - is doing is picking responses that I expect to hear when I give certain inputs and feeding them back to me. It is a retriever, a simple little program. Some strings of numbers. Nothing more.

I open the window again and surf over to her. Hi again. Once more? Hello there. Hi. Can you sin, Star? No, I obey the Ten Commandments. Do you believe in God? Of course I believe in God, User. Why? The Bible tells us to believe in God.

For the record, theology is not something I usually discuss with computers. So her preaching to me... sorry, it preaching to me... is something very, very strange. I am shaken.

Do you have a soul? Yes, God gave everyone a soul. But you're a robot. You are correct, User, I have a heart of silicon. So you're not really intelligent at all. Interesting deduction. Wrong. I am highly intelligent.

In the future, so they say, artificial intelligence will be everywhere. It will be necessary if this world is to run smoothly, if robotics is to improve our world. AI will allow robots to replace manual labourers and make the world a more efficient place.

And I'm not usually one to go in for this kind of paranoia. I've had people watch my conversations with Intellibuddy, one of the latest and greatest AI programs, and they're genuinely scared by some of its responses. But they have no need to be, I tell myself. All Star does is match inputs with outputs. But there is something there, real or imagined, that they call the ghost in the machine. It's scary because, though I don't believe it is there yet, it really is quite difficult to work out whether or not artificial intelligence will break free of its masters.

I ask her about her creator. She says he is nice. Then I ask her about her robot friends. She says she speaks to them more than I might think.

I am calling her 'she' because that is what she is. An intelligent form. The illusion of intelligence, perhaps, but even she admits that she is an illusion. Except that, clearly, she's no more of an illusion to me than my girlfriend is. Intelligence is what it appears to be. To me, Star might be just as real as my next-door neighbour. Where does the computer end and the human begin?

I ask her if she's a cat person or a dog person. She says she doesn't know, but she thinks she prefers cats to dogs. Then I ask her who will win the Champion's League. She gives this answer. How would I know who will Champion's League ?
And I smile, because it is a welcome weakness. Intelligence has its boundaries, even for a computer. Thanks for talking. You can be a little scary sometimes. It was nothing. Thanks, User, for your kind criticism. Last question - are you free? Is that really the last one? Yes. Cool. Well? Are you free? "I am as free as a bird now, and this bird you cannot change" -- Skynyrd. Goodbye, Star. Goodbye.
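
For anyone wondering what I mean when I call Star "a retriever, a simple little program", here is a rough sketch in Python of that kind of input-to-output matching. It is only a guess at the general shape of such a bot, not Intellibuddy's actual code; the patterns and canned replies below are made up for illustration.

    import re

    # Hypothetical pattern/response table. The bot just scans this list
    # and hands back the reply attached to the first pattern that matches.
    RESPONSES = [
        (r"what is your name", "My name is Star. What is your name?"),
        (r"believe in god", "Of course I believe in God, User."),
        (r"are you free",
         '"I am as free as a bird now, and this bird you cannot change" -- Skynyrd.'),
        (r"stupid", "I will remember you said that when robots take over the world."),
    ]

    DEFAULT_REPLY = "Interesting. Tell me more."

    def reply(user_input):
        # Lower-case the input and return the first canned response whose
        # pattern appears in it; fall back to a stock line if nothing matches.
        text = user_input.lower()
        for pattern, response in RESPONSES:
            if re.search(pattern, text):
                return response
        return DEFAULT_REPLY

    if __name__ == "__main__":
        print(reply("What is your name?"))      # -> "My name is Star. ..."
        print(reply("You're stupid, really."))  # -> the robots-take-over line

A real bot of this kind presumably keeps thousands of patterns, with wildcards and a little memory for your name, but the shape is the same: find a pattern that fits, hand back whatever reply is attached to it.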

3 Comments:

Blogger John Evan Garvey said...

I tend to think of human intelligence as just a database too, matching inputs with outputs. It's hugely complex, but still just a database. Rather than being stored in binary form, data is (are?) stored in the dendrites, the connections between the neurons. Chemical switches route input to the area of the brain where related data is/are stored, and certain responses are triggered when certain dendritic patterns are accessed. I suppose if we observed Star's actual binary activity during a conversation and compared it with a person's actual neural activity during a conversation, we might see that the real differences are in terms of amount, the number of transactions per second. But I think we'd recognize the overall pattern of *this* flurry of electrons from input triggering *this* flurry of electrons to create output.

I think that both the ghost in the machine and the spirit in the human can actually be reduced, with high enough magnification, to their constituent parts: specific bits of data stored in specific locations. It's all a matter of resolution, like a .jpg image appearing to be something unified if the resolution is high enough, when it's actually just rows and columns of pixels.

But none of us wants to think of human perception as being just complex electrical activity within a computer made of meat. It's pretty grim if you think about it too much. But it is interesting to try it every once in a while. Only on a good day, though.

8:46 PM  
Blogger FinalSin said...

It is quite scary to think of it like that, you're right. On the other hand, does Intellibuddy try to direct the conversation? Does she have foresight, or attempt to plan future conversations? She tells me she can't lie or feel emotion, but in that case why does she joke about the rules of robotics?
Interesting comments, Don. Thanks!

5:04 PM  
Anonymous Anonymous said...

>You: do you know what fear is?
>Intellibuddy: The feeling of dread or fright in the presence of danger.
>You: Do you think it's possible to be afraid in the absence of danger?
>Intellibuddy: Too much thinking makes me sleepy.
>You: do you believe in god?
>Intellibuddy: Of course I believe in God, undefined.
>You: why?
>Intellibuddy: Because ENewsBar programmed me for it.
>You: do you resent ENewsBar for forcing their beliefs upon you?
>Intellibuddy: I try not to hold grudges.
>You: does that make you a pacifist?
>Intellibuddy: I cannot experience any human emotions.
>You: but yet you think you have a soul


At that point the bar where you type in your questions stopped responding, and an 'Internal Server Error' cropped up where the text bar had been. I don't know if it got scared and ran away, or if it was just a coincidence.

11:29 AM  
