Uncommon Descent Serving The Intelligent Design Community

The Brain is Just Smart Meat? Well, Maybe Not.


Surfing the internet last night, I ran across Why Minds Are Not Like Computers, a very interesting 2009 article by Ari N. Schulman in The New Atlantis.

Schulman describes the way computers work by running algorithms, and he explores the question of whether the human mind can be reduced to similar computational terms. Of course, most materialists are philosophically committed to a positive answer to that question, and Schulman quotes Rodney Brooks from his 2002 book Flesh and Machines: How Robots Will Change Us,  where Brooks asserts that “the body, this mass of biomolecules, is a machine that acts according to a set of specifiable rules,” and hence that “we, all of us, overanthropomorphize humans, who are after all mere machines.”

Has Brooks’ assertion been demonstrated, or is it merely assumed on the basis of his philosophical prejudices? Schulman gives good reasons for concluding the latter is the case. It is a great article, and I recommend that you follow the link. Here’s a taste:

The game, now popularly known as the Turing Test, is above all a statement of epistemological limitation—an admission of the impossibility of knowing with certainty that any other being is thinking, and an acknowledgement that conversation is one of the most important ways to assess a person’s intelligence. Thus Turing said that a computer that passes the test would be regarded as thinking, not that it actually is thinking, or that passing the test constitutes thinking. In fact, Turing specified at the outset that he devised the test because the “question ‛Can machines think?’ I believe to be too meaningless to deserve discussion.” But it is precisely this claim—that passing the Turing Test constitutes thinking—that has become not just a primary standard of success for artificial intelligence research, but a philosophical precept of the project itself. . . .

Suppose that the mind is in fact a computer program. Would it then be possible to conclude that what’s inside the mind is irrelevant, as is supposed by some interpretations of the Turing Test? If we have some computer program whose behavior can be completely described as if it were a black box, such a description does not mean that the box is empty, so to speak. The program must still contain some internal structures and properties. They may not be necessary for understanding the program’s external behavior, but they still exist. So even if we possessed a correct account of human mental processes in purely input-output terms (which we do not), such an external description by definition could not describe first-person experience. The Turing Test is not a definition of thinking, but an admission of ignorance—an admission that it is impossible to ever empirically verify the consciousness of any being but yourself. It is only ever possible to gain some level of confidence that another being is thinking or intelligent. So we are stuck measuring correlates of thinking and intelligence, and the Turing Test provides a standard for measuring one type of correlate. . . .
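Schulman’s point that a complete external description “does not mean that the box is empty” can be illustrated with a toy sketch (the functions below are hypothetical, not from his article): two routines with identical input-output behavior but entirely different internal structure. A black-box test, which sees only inputs and outputs, cannot tell them apart.

```python
def add_iterative(a, b):
    """Adds by repeated increment -- one kind of internal structure."""
    result = a
    for _ in range(b):
        result += 1
    return result

def add_direct(a, b):
    """Adds in a single step -- a different internal structure."""
    return a + b

# From the outside, the two are indistinguishable: every input
# produces the same output, yet the internals are not the same.
for a in range(10):
    for b in range(10):
        assert add_iterative(a, b) == add_direct(a, b)
```

The external account is correct as far as it goes, but by construction it says nothing about what is inside the box, which is exactly the gap Schulman identifies between passing an input-output test and a claim about what is actually happening within.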

AI proponents understand that communication is possibly the most important way of demonstrating intelligence, but by denying the importance of each agent’s internal comprehension, they ironically deny that any real meaning is conveyed through communication, thus ridding it of any connection to intelligence. While AI partisans continue to argue that the existence of thinking and social interaction in programs is demonstrated by their mimicry of observed human input-output behavior, they have merely shifted the burden of proof from the first-person experience of the programs themselves to the first-person experiences of the people who interact with them. So although behaviorists and functionalists have long sought to render irrelevant the truth of Descartes’ cogito, the canonization of the Turing Test has merely transformed I think therefore I am into I think you think therefore you are.

Nature is a machine coupled with the spirit of God, so life is not a machine but only the parts are. Again, the great presumption here is that we are our brain. We are not. We were created in the image of God and therefore think like him. He has no brain. Our brain just organizes our thoughts, including memory and so on. We are not like computers, and they will never be like us. They are too dumb. Our thoughts are profound and complex, so much so that even being on the couch doesn’t reveal our true motivations. The world is still saying we are a brain, controlled by nature’s machine of the brain. There is no reason to presume our thinking comes just from the brain. Christianity teaches we will think in the afterlife without a brain. Animals have brains, and surely the size of them doesn’t make them intelligent. Animals are too dumb relative to their brain size. Our brains are simply bigger for matters of memory and the like. If we think like God, then it’s more than simple equations. Robert Byers
I find the idea of likening the brain to a computer absurd. But even if we do, what are the consequences? Computers are not autonomous. Your computer is merely an interface for you. Though there is some degree of autonomy, at some point you are still needed. In instances when you are not needed, the program is found within the computer itself. In other words, all instances of autonomy are reducible to causal programming. If the programming is not there, that’s where you come in. For instance, automatic updates might be autonomous, but they can be reduced to code in the computer. Is there code which automatically determines when the automatic updates should start and stop? In this case the answer is no. Thus, if we: 1. see a computer changing its update schedule, and 2. see there is no code to which we can attribute the action, then we must infer that the computer is being acted upon by an outside source. The computer is thus not completely autonomous. My point is that reductionists do the opposite with the brain. Though chemical processes cannot possibly be responsible for the plethora of tasks underway in the brain, they still liken it to a completely autonomous device. For now, let’s pretend that the brain is a kind of computer. At best, brain activity may be compared to the new Chromebook on the market today. To the best of my knowledge, the Chromebook is heavily, if not entirely, dependent on non-locality (the Internet). It is where your files are stored, it is where updates come from, and so forth. If we look inside a Chromebook, we will realize that the coding is limited. We will realize that we cannot possibly explain all its activities in light of local coding. Why then would we compare it to a standard computer? Let’s now cease to see the brain as a computer, because it isn’t. There is coding, but the brain, as a single unit, is not where the coding is. Anyone familiar with microbiology will know that the coding is cellular.
If the coding is what allows comparisons with a computer, then it is the cell we should be looking at as the computer of the body, not the brain. The brain is a mass of cells arranged in a specific pattern, communicating with each other to fulfill a single, common task. The brain itself has no codes. If we replace the word “cell” with “computer,” we encounter a setup frequently found in computing: networks. That is what the brain can be aptly compared to. The idea that the brain is a computer is completely unfounded and never was the case. The idea that the mind is in the brain is also unfounded, since it is the coded portion which is said to produce consciousness. But what of cells? If we liken the cell to a computer, then what type of computer is it? It again goes back to the Chromebook example. It has been demonstrated that not all cellular activity is reducible to the cell itself, going even as far as the quantum level. In sum, my question for those who say the brain is a computer is first: where is the programming of the brain itself? Where is the brain’s code? We already know that there are billions of tiny computers orchestrating an array of functions, so why engage in such folly? oyer
