Origenes has put up an interesting argument worth pondering. It’s Friday, so this should be a good way to start the weekend:
ORIGENES: Here I will argue that self-prediction cannot be accommodated by materialism. In daily life we all routinely engage in acts of self-prediction — ‘tomorrow morning at 9 o’clock I will do 2 push-ups’, ‘I will do some Christmas shopping next Friday’ … and so forth. The question is: how does materialism explain that familiar phenomenon? Given that specific behavior (e.g. doing 2 push-ups) results from specific neural states, how is it that we can predict its occurrence?
The fact that one can predict his or her own behavior suggests that we have mental control over the physical, which is obviously unacceptable to the materialist, who claims the opposite to be true. The task for the materialist, therefore, is to naturalize self-prediction. In doing so there seems to be no option available other than to argue for the existence of some (physical) system capable of predicting specific neural states and the behavior that ensues. But here lies a problem. There is only one candidate for the job, the brain, and, as I will argue, the brain cannot do it.
The Argument from Self-prediction
1. If materialism is true, then human behavior is caused by neural events in the brain and environmental input.
2. The brain cannot predict future behavior with any specificity.
3. I can predict my own future behavior with specificity.
4. Materialism is false.
– – – –
Support for 2
In his book ‘The Sensory Order’ (1976), von Hayek argues that predicting a system requires a distinct system with a higher degree of complexity. His argument can be summarized as follows:
… Prediction of a system O requires classification of the system’s states.
If these states can differ in n different aspects, that is, if they can be subsumed under n different predicates, then there are 2^n different types of states that a classificatory system P must be able to distinguish, since each of the n predicates either applies to a given state or does not. As the number of aspects in which states may differ is an indicator of O’s complexity, and as the degree of complexity of a classificatory system P is at least as large as the number of different types of states it must be able to distinguish, P is more complex than O.
[‘The SAGE Handbook of the Philosophy of Social Sciences’, edited by Ian C. Jarvie and Jesus Zamora-Bonilla]
Von Hayek then goes on to conclude that:
No system is more complex than itself. Thus: No system can predict itself or any other system of (roughly) the same degree of complexity (no self-referential prediction).
In other words, the brain cannot predict itself, because predicting the brain would require a system with a higher degree of complexity than the brain itself.
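The combinatorial step in the summary above, that n binary predicates yield 2^n distinguishable state types, can be illustrated with a small sketch (a toy illustration only, assuming each predicate simply applies or fails to apply to a state):

```python
from itertools import product

def state_types(n):
    """Enumerate every combination of truth values for n predicates.

    Each of the n predicates either applies (True) or does not (False),
    so the number of distinct state types is 2**n.
    """
    return list(product([False, True], repeat=n))

# The count doubles with every added aspect, which is the point of
# Hayek's argument: the classifier's workload grows exponentially.
for n in (1, 2, 3, 10, 20):
    assert len(state_types(n)) == 2 ** n
    print(f"n = {n:2d} aspects -> {2 ** n} state types")
```

Even for modest n the number of types the classifying system P must keep apart dwarfs n itself, which is what drives the conclusion that P must be more complex than O.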
– In order to predict specific behavior, the brain cannot run simulations of possible future neuronal interactions, because it is simply too complex. The human brain is perhaps the most complex thing in the universe. The average brain has about 100 billion neurons. Each neuron fires (on average) about 200 times per second. And each neuron connects to about 1,000 other neurons.
– A prediction of specific behavior would also require predicting environmental input, which lies beyond the brain’s control. We, as intelligent agents, can, within limits, ignore a multitude of environmental inputs and stick to the plan — ‘tomorrow morning at 9 o’clock I will do 2 push-ups, no matter what’ —, but the brain cannot do this. The intractable environmental (sensory) input and the neural firing that results from it necessarily influence the state of the brain tomorrow morning at 9 o’clock.
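As a back-of-the-envelope check on the scale figures quoted above (100 billion neurons, roughly 200 firings per second each, roughly 1,000 connections each; all three are rough averages from the post, not measurements):

```python
# Rough averages taken from the figures quoted in the post.
neurons = 100e9          # ~100 billion neurons
firings_per_second = 200 # ~200 firings per neuron per second
connections = 1000       # ~1,000 connections per neuron

# Total firing events across the brain each second: ~2 x 10^13.
spikes_per_second = neurons * firings_per_second

# Total neuron-to-neuron connections: ~10^14.
total_connections = neurons * connections

print(f"~{spikes_per_second:.0e} firing events per second")
print(f"~{total_connections:.0e} connections")
```

On these numbers, any simulation with one step per firing event would need to track on the order of 10^13 events per second across a network of about 10^14 connections, which conveys why the post treats brute-force self-simulation as intractable.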
What do you think?