Uncommon Descent Serving The Intelligent Design Community

Deep Blue Never Is (Blue, That Is)


In the comment thread to my last post there was a lot of discussion about computers and their relation to intelligence.  Here is my understanding of computers.  They are just very powerful calculators; they do not “think” in any meaningful sense.  By this I mean that computer hardware is nothing but an electro-mechanical device for running computer software, and computer software in turn is nothing but a series of “if then” propositions.  These “if then” propositions may be massively complex, but software never rises above an utterly determined “if then” level.  This is a basic Turing Machine analysis.
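The “if then” picture can be made concrete with a toy sketch (Python, purely illustrative; the states and symbols below are invented for this example, and this is a simple finite-state transducer in the spirit of the Turing Machine analysis, not a full Turing machine). The point is that every (state, input) pair has exactly one predetermined response.

```python
# A toy "if then" table: every (state, symbol) pair maps to exactly
# one (new state, output) pair. Nothing in the run is left open.
TRANSITIONS = {
    ("start",  "a"): ("seen_a", "1"),
    ("start",  "b"): ("start",  "0"),
    ("seen_a", "a"): ("seen_a", "1"),
    ("seen_a", "b"): ("start",  "0"),
}

def run(tape, state="start"):
    """Step through the input; each step is one determined if-then rule."""
    out = []
    for symbol in tape:
        state, emitted = TRANSITIONS[(state, symbol)]  # fully determined step
        out.append(emitted)
    return "".join(out)
```

However long the table grows, the machine never does anything except look up the one response its rules dictate.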

This does not necessarily mean that the output of computer software is predictable.  For example, the “then” in response to a particular “if” might be “access a random number generator and insert the number obtained in place of the variable in formula Y.”  But “unpredictable” is not a synonym for “contingent.”  Even if an element of randomness is introduced into the system, the way in which the computer will employ that random element is determined.
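That distinction can be sketched in a few lines (Python; “formula Y” here is a stand-in invented for illustration, not anything from the post): the number drawn is unpredictable, but the rule for what is done with it is fixed in advance.

```python
import random

def formula_y(v):
    # A stand-in "formula Y" with one variable slot (invented for illustration).
    return 2 * v + 1

def respond(stimulus):
    # The random draw is unpredictable, but the rule that consumes it
    # is a fixed if-then: the program's handling of the value is determined.
    if stimulus == "fill variable":
        r = random.random()
        return formula_y(r)
    return None
```

Two runs may print different numbers, but both runs follow exactly the same determined path through the code.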

Now the $64,000 question is this:  Is the human brain merely an organic computer that in principle operates the same way as my PC?  In other words, does the Turing Machine analysis also describe the human brain?  If the brain is just an organic computer, then even though human behavior may at some level be unpredictable, it is nevertheless determined, and free will does not exist.  If, on the other hand, it is not, that is, if there is a “mind” that is separate from, though connected to, the brain, then free will does exist.

This issue has been debated endlessly, and I refer everyone to The Spiritual Brain for a much more in-depth analysis of this subject.  For my purposes today, I propose to approach the subject via a very simple thought experiment.

First a definition.  “Qualia” are the subjective responses a person has to objective experience.  Qualia are not the experiences themselves but the way we respond to the experiences.  The color “red” is the classic example.  When light of wavelength X comes into my eye, my brain tells me I am seeing the color red.  The quale (singular of “qualia”) is my subjective experience of the “redness” of red.  Maybe the “redness” of red for me is a kind of warmth.  Other qualia might be the tanginess of a sour taste, the sadness of depression, etc.

Now the experiment:  Consider a computer equipped with a light gathering device and a spectrograph.  When light of wavelength X enters the light gathering device, the spectrograph gives a reading that the light is red.  When this happens the computer is programmed to activate a printer that prints a piece of paper with the following statement on it: “I am seeing red.”

I place the computer on my back porch just before sunset, and in a little while the printer is activated and prints a piece of paper that says “I am seeing red.”
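The porch computer can be sketched in a few lines (Python, purely illustrative; the 620-750 nm band is a conventional figure for visible red, not something the post specifies). The whole “perception” reduces to one conditional.

```python
def spectrograph_reading(wavelength_nm):
    # Classify light as "red" by wavelength. The 620-750 nm band is a
    # conventional figure for visible red, used here only for illustration.
    return "red" if 620 <= wavelength_nm <= 750 else "not red"

def porch_computer(wavelength_nm):
    # The computer's entire response to the sunset is one determined
    # if-then rule: spectrograph says "red", therefore print the sentence.
    if spectrograph_reading(wavelength_nm) == "red":
        return "I am seeing red."  # text sent to the printer
    return None
```

Nothing in this program corresponds to the “warmth” described below; it only maps a measurement to a string.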

 Now I go outside and watch the same sunset.  The reds in the sunset I associate with warmth, by which I mean my subjective reaction to the redness of the reds in the sunset is “warmth.”

1.  Did the computer “see” red?  Obviously yes.

2.  Did I “see” red?  Obviously yes.

3.  Did I have a subjective experience of the redness of red, i.e., did I experience a quale?  Obviously yes.

4.  Did the computer have a subjective experience of the redness of red, i.e., did it experience a quale?  Obviously no.

Conclusion:  The computer registered “red” when red light was present.  My brain registered “red” when red light was present.  Therefore, the computer and my brain are alike in this respect.  However, and here’s the important thing, the computer’s experience of the sunset can be reduced to the functions of its light gathering device and hardware/software.  But my experience of the sunset cannot be reduced to the functions of my eye and brain.  Therefore, I conclude I have a mind which cannot be reduced to the electro-chemical reactions that occur in my brain.

Comments
I think the discussion is about whether our feelings are in any way a reliable indicator that man and machine are fundamentally different at some level. I think feelings and emotions are the basis of many such discussions, for example over the existence of the soul. While I believe it can't be known whether machines can be programmed to 'feel' in any meaningful sense of the word, nor is it possible to prove that our feelings are purely illusory, I'll say that it is dangerous for man to assume that they are. For if suffering, feelings and other subjective experiences are assumed to be purely the result of some form of programming and purely illusory, we could one day rationalise all kinds of crimes against humanity under the (possible) pretense that man and machine are fundamentally alike, and that we can treat a human being no differently from a car, computer or briefcase.
WinglesS
January 10, 2008, 06:47 PM PDT
This is a subject about which I know something, having spent countless thousands of hours over the past 19 years programming computers in an attempt to simulate the human reasoning process in games-playing AI (artificial intelligence). I can speak with some authority on this subject, having won both silver and gold medals in two international AI games-playing competitions in two different disciplines. These attempts have been both spectacularly successful and spectacularly unsuccessful. By combining the brute force of a computer (a machine with two CPUs performing a billion integer calculations per second, each accessing two gigabytes of RAM, over a period of nearly two months, nonstop, 24/7) with some highly sophisticated, intelligently designed algorithms, the program was able to solve some problems that no human has been able to solve. (See here.) On the other hand, the programs are completely stupid. By that I mean that they have no capacity to learn and modify their "thinking" process on their own, as do the humans who create them. I have come to the conclusion that life, consciousness, and creative intelligence represent the three most interesting phenomena in the universe, because they are all highly negentropic -- and all attempts to explain them away in materialistic terms result in logical absurdity and self-refutation.
GilDodgen
January 10, 2008, 06:40 PM PDT
aiguy says: "In the end, there is no justification for imagining that whatever the cause of life was, it experienced qualia." I reply: Unless one holds that a cause must always be greater than its effect: a Creator without the experience of qualia could not impart that attribute to His creatures.
Gerry Rzeppa
January 10, 2008, 06:33 PM PDT
"Conclusion: The computer registered “red” when red light was present. My brain registered “red” when red light was present. Therefore, the computer and my brain are alike in this respect..." And alike also in that both were intelligently designed and intelligently pre-programmed to arrive at a specific response to a specific stimulus?
Emkay
January 10, 2008, 06:32 PM PDT
Hi Jason,

"I suspect you would find that it is not possible to be intelligent in any recognizable sense and not have qualia."

I suspect you're wrong. But as the saying goes, if suspicions were theories, we'd all be scientists.

"Certainly an agent will by definition have 'something it is like to be that agent'. But that 'something it is like to be' is an essential part of what Qualia are."

You can define your terms however you'd like; if definitions were theories, we could simply define the answer to any question. But they're not. In the end, there is no justification for imagining that whatever the cause of life was, it experienced qualia. Thus, if you define intelligence as entailing qualia, then there is no justification for imagining that the cause of life was "intelligent" by that definition.
aiguy
January 10, 2008, 06:08 PM PDT
"It is not at all clear what the relationship is between qualia and design abilities."

I suspect you would find that it is not possible to be intelligent in any recognizable sense and not have qualia. Certainly an agent will by definition have "something it is like to be that agent". But that "something it is like to be" is an essential part of what Qualia are.
Jason Rennie
January 10, 2008, 05:20 PM PDT
Mapou,

"This would mean that the brains of these savants are recording their own state, moment by moment! Having studied AI and neural networks for many years, I can assure you that this is completely biologically impossible. The reason is that the size of the brain makes no difference since it must record its own state over and over."

This would be a very interesting result to demonstrate - you should publish it. It does entail that you understand how memory is stored biologically, however, which most of us cognitive scientists agree is not the case. (I would rethink your conception of episodic memory as lossless storage of complete state.)

"Of course, the materialists will always fall back on the old tired but worthless argument that we don't yet know everything that is going on in the brain, therefore we cannot draw a definite conclusion."

Actually, a materialist would argue that we do know that minds reduce to brains. I myself am not a materialist, and I'm honest enough to admit we do not know anything of the sort, either way. Which of course is a perfectly valid argument for holding that a theory like ID, which relies on the metaphysical supposition of dualism, cannot be considered scientific.
aiguy
January 10, 2008, 05:11 PM PDT
BarryA,

"Did the computer have a subjective experience of the redness of red, i.e., did it experience a quale? Obviously no."

You go wrong a couple of different ways here, I think.

1) First, you state the answer as "obviously no", and you could ridicule anyone who disagreed by saying, "Oh, you really think my Dell PC has qualia? Hah! Then you better not turn it off - that would be murder!" and so on. But as obvious as it seems, your answer is an intuition rather than a principled response. So the interesting questions are:

a) Are there any empirically-grounded principles which can answer that question? (NO)

b) Would our intuitions hold up under different circumstances (i.e., if we managed to create a very human-seeming robot)? (NO)

2) It is not at all clear what the relationship is between qualia and design abilities. For all we know scientifically, the "intelligent designer" that IDers suppose is responsible for life might have every mental ability that human beings have (and to a far greater degree, since living things are more complicated than what we can design), but lack qualia!
aiguy
January 10, 2008, 05:09 PM PDT
Barry - It's interesting, I think, how everyday language assumes your position. If, for example, you lost a limb, God forbid, we might describe the event by saying, "Barry lost his arm in an accident" - indicating that we still consider (what's left of) "you" to be the same old Barry. And even in a case of severe head trauma, God forbid, we would probably find ourselves saying things like, "Barry's lost his ability to communicate" or perhaps even "Barry's lost his mind" - but the implication would still be that the Barry we know/knew and love/loved is still around... somewhere.
Gerry Rzeppa
January 10, 2008, 05:07 PM PDT
CN, let's approach your question from the other side. If Deep Blue's programmers put in a subroutine that said, "If red light is sensed, feel warmth," would that give Deep Blue a subjective experience of warmth when it saw the sunset? Obviously not.
BarryA
January 10, 2008, 04:51 PM PDT
Barry, while I agree with you that our subjective awareness of color sensations (and other types of qualia) should be enough to convince anybody that our minds are more than just a bunch of neurons, I'm afraid that this is not enough to convince the materialists and agnostics among us. I think that a potentially more successful avenue of inquiry is to find something that the human mind can do that cannot be explained with neural networks alone. It must be something that can be quantified experimentally, i.e., objectively.

Lately, I am of the opinion that human episodic memory is biologically implausible. This is especially true with regard to autistic savants who can instantly memorise an entire complex musical piece and play it back flawlessly. Some savants can remember every last detail of their lives, even what they were thinking and feeling at any given moment. This would mean that the brains of these savants are recording their own state, moment by moment! Having studied AI and neural networks for many years, I can assure you that this is completely biologically impossible. The reason is that the size of the brain makes no difference since it must record its own state over and over. Of course, the materialists will always fall back on the old tired but worthless argument that we don't yet know everything that is going on in the brain, therefore we cannot draw a definite conclusion. I think we already know enough about how neurons work and how many neurons are in the brain to arrive at a definite conclusion that the brain could not possibly retain all this information.
Mapou
January 10, 2008, 04:44 PM PDT
I'm not a materialist, but maybe I can offer a slightly different perspective. It's clear that neurons do something that is, in some ways, comparable to computation. Whether we want to call that computation is a matter of mere semantics. It is also clear that neurons are very different from silicon logic gates--for example, in that they are capable of spontaneously forming useful connections. This much, I think, is not controversial. What should also be uncontroversial: the brain is responsible for at least some (if not most or all) mental processes. Visual processing and recognition, for example, is clearly a biological brain function.

Now, maybe there are some mental processes (by which I mean to include consciousness, experience of qualia, etc.) that are not the result of brain activity. That's not a hypothesis that we should take lightly. Right now there is no candidate for a mechanism for a dualistic mind/brain connection, no detailed explanation of what the mind is, what it does, how it works, how it comes to be, or really anything other than a list of hard to explain mental phenomena that it may be responsible for. Again, I'm not a materialist. I am not in any way committed to the proposition that "matter and energy are all that exist" or anything like that. But I do expect that if someone is going to claim that something exists (e.g., an immaterial mind), then they had better have some idea of what it is and how it works, and those ideas had better be instructive in some way.
Reed Orak
January 10, 2008, 04:42 PM PDT
Ok, devil's advocate coming: why could you not view the brain as a hugely complex macro-kernel (computer speak) that performs a zillion computations for some simple stimuli, and gives the appearance of qualia (and free will)? I guess a materialist is forced to come to this conclusion? Materialists: please comment. :-)
CN
January 10, 2008, 04:29 PM PDT