
Can we teach a computer to feel things? A dialogue…


There’s the computer’s side… and then there’s the dog’s side. Listen to both:

O’Leary: Bear with me one more time as I work this out. I am trying to imagine it in real life, the way things happen:

You are a dog’s human friend and he is quite sick. You take him to the vet. The vet uses up-to-date monitoring systems and the assistant helpfully explains to you what all those machine signals mean. So you sit there watching the machine and your presence in the room reassures the dog.

You are having an experience reading the vital signs. The dog is having quite a different experience living them. You have all of his data and none of his experience. The dog has none of his data and all of his experience.

Suppose you took all that data and instantiated it into a robot. Is the robot having your experience or the dog’s? Or neither, actually?

Is it even possible to be the subject of experience without being alive? How could simulation amount to the same thing? That’s the part I don’t understand.

Holloway: Yes, it’s a weird phenomenon in computer science, but we like to call things by names that they are not, like “artificial intelligence.” We like to hide the artificiality of the computers by naming them after real things. Just as children pretend their dolls are real people, we pretend our circuits and signals are really alive, thinking and feeling, and replacements for the real world. I guess it is the grown-up version of make-believe.

O’Leary: As long as we remember it’s all make-believe.

News, “Can we teach a computer to feel things: A dialogue” at Mind Matters News

What do people mean when they say they can give computers or robots feelings? Is it possible to feel things without being alive?

Hey, this post is in honour of the Ottawa Humane Society and of all Humane Associations everywhere.

Comments
In 2009, a regularly scheduled Air France Airbus flying from Rio to Paris crashed into the Atlantic Ocean. Everyone on board was killed. After extensive investigation, the conclusion was that the flight crew, who were experienced pilots, CHOSE to misinterpret both their instruments and what they could see outside the cockpit and feel in their bodies because one of the alarms had gone off. In a STEEP climb, the aircraft hit the ocean TAIL FIRST (and still nose up).

It was subsequently determined that the pitot tube (which measures airspeed directly as air pressure) had FROZEN OVER, and thus gave a FALSE airspeed reading. The crew CHOSE to go with the "zero airspeed" datum in preference to EVERYTHING else. Note that once one of the pilots had CHOSEN to use ONLY the false data, no other data from ANY other source was considered. The post-crash investigation eventually recommended that Airbus install HEATERS in their pitot tubes...

So, this is a reasonably common problem: ONE of the humans misunderstands what's going on and PANICS the other humans. You of course CANNOT panic a computer.

mahuna
December 15, 2020 at 04:12 PM PDT
Sorry Mahuna, but sensors do not make a computer "feel" in the sense used in the post, as you state in your second paragraph. A computer cannot be happy or sad, sore or disappointed. Yes, it could be programmed to simulate those feelings in its responses and behaviour, but it would not truly feel those feelings and emotions. It would simply be translating sensor inputs into actuator outputs according to its programming and stored memory. It could be cleverly programmed to fool people into thinking it is truly feeling, just as a computer can be programmed to fool a human in a Turing test. That does not make the computer a thinking or feeling entity.

Moreover, your thermostat does not "know" anything. It does not FEEL hot or cold; it is merely a mechanical or electrical machine responding as designed to its environment. For it to "know" it was hot or cold, it would need to translate the environmental signal into some sort of "knowledge" that it "understood" in some more abstract way, rather than just into an open/closed switch. I suppose you could argue that the thermostat has one "bit" of memory, relating to its "knowing" hot or cold, but one bit of mechanical "memory" does not a feeling make.

Fasteddious
December 14, 2020 at 01:10 PM PDT
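Fasteddious's point can be made concrete with a minimal sketch of the sensor-to-switch translation he describes: a reading comes in, a conditional flips one bit of state, and a switch opens or closes. Nothing below is drawn from any real thermostat's firmware; the names (SETPOINT_C, update) and the Python framing are illustrative assumptions only.

```python
# Illustrative sketch only: a thermostat as pure input-to-output translation.
# Constants and names are hypothetical, not taken from any real device.

SETPOINT_C = 20.0   # desired temperature in Celsius
HYSTERESIS_C = 0.5  # deadband so the switch does not chatter around the setpoint

heater_on = False   # the single "bit" of state Fasteddious concedes the thermostat has

def update(temperature_c: float) -> bool:
    """Translate one temperature reading into an open/closed switch position."""
    global heater_on
    if temperature_c < SETPOINT_C - HYSTERESIS_C:
        heater_on = True    # reading says "cold" -> close the switch
    elif temperature_c > SETPOINT_C + HYSTERESIS_C:
        heater_on = False   # reading says "hot" -> open the switch
    return heater_on        # inside the deadband, keep the previous state

# A falling series of readings flips the bit; at no point does anything "feel" cold.
for reading_c in (21.0, 20.2, 19.3, 18.9, 20.8):
    print(reading_c, "->", "heater ON" if update(reading_c) else "heater OFF")
```

The whole device reduces to a comparison and one stored bit, which is exactly the contrast Fasteddious is drawing with actually feeling hot or cold.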
You're being silly. You can teach a computer, with the right sensors, to feel all kinds of stuff. Autopilots on airplanes (and ships) do it all the time, every day. My furnace just kicked on again. The thermostat knows what "hot" and "cold" mean today. Works great.

But what I THINK you're talking about is something like "feel EMOTIONS". And if you really CARE, OF COURSE your facial recognition software can be modified to translate facial expressions and word choice and emphasis into Emotions. And I gotta believe that within the next 5 years the guys selling "sex dolls" will get them to the point that the robots can "feel" their partner's emotions better than most humans, AND respond better. "Better" is a choice you pick from the start-up menu.

I'm also pretty sure that there is software out there for lazy "political analysts" that can listen to or read 5 minutes of speech by a politician and bang out perfectly normal "commentary" using all of the preselected hot-button issues. And the computer version will sound better and contain fewer typos than the human version.

mahuna
December 14, 2020 at 11:11 AM PDT
We always try to humanize our familiar objects. Cars and houses and clothing can have names and imputed personalities. "Artificial intelligence" wasn't really started by this kind of natural animism. It was started by a mix of Deepstate scientists and commercial hucksters. The Deepstaters have used it to demoralize and oppress humans, and the hucksters like Elon have used it to bring in billions from fools.

polistra
December 13, 2020 at 10:15 PM PDT
The answer is “No”.

Viola Lee
December 13, 2020 at 08:58 PM PDT
