Uncommon Descent | Serving The Intelligent Design Community

Human Consciousness


(From In the Beginning … ):

For the layman, it is the last step in evolution that is the most difficult to explain. You may be able to convince him that natural selection can explain the appearance of complicated robots, who walk the Earth and write books and build computers, but you will have a harder time convincing him that a mechanical process such as natural selection could cause those robots to become conscious. Human consciousness is in fact the biggest problem of all for Darwinism, but it is hard to say anything “scientific” about consciousness, since we don’t really know what it is, so it is also perhaps the least discussed.

Nevertheless, one way to appreciate the problem it poses for Darwinism or any other mechanical theory of evolution is to ask the question: is it possible that computers will someday experience consciousness? If you believe that a mechanical process such as natural selection could have produced consciousness once, it seems you can’t say it could never happen again, and it might happen faster now, with intelligent designers helping this time. In fact, most Darwinists probably do believe it could and will happen—not because they have a higher opinion of computers than I do: everyone knows that in their most impressive displays of “intelligence,” computers are just doing exactly what they are told to do, nothing more or less. They believe it will happen because they have a lower opinion of humans: they simply dumb down the definition of consciousness, and say that if a computer can pass a “Turing test,” and fool a human at the keyboard in the next room into thinking he is chatting with another human, then the computer has to be considered to be intelligent, or conscious. With the right software, my laptop may already be able to pass a Turing test, and convince me that I am Instant Messaging another human. If I type in “My cat died last week” and the computer responds “I am saddened by the death of your cat,” and if I’m pretty gullible, that might convince me that I’m talking to another human. But if I look at the software, I might find something like this:

if (verb == 'died')
    fprintf(1, 'I am saddened by the death of your %s', noun)
end

I’m pretty sure there is more to human consciousness than this, and even if my laptop answers all my questions intelligently, I will still doubt there is “someone” inside my Intel processor who experiences the same consciousness that I do, and who is really saddened by the death of my cat, though I admit I can’t prove that there isn’t.
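
To see how little machinery such a canned response actually requires, here is a minimal sketch in Python of the same kind of keyword lookup; the rule table and function name are invented purely for illustration and are not from the book:

RULES = [
    ("died", "I am saddened by the death of your {}."),
    ("happy", "I am glad to hear about your {}."),
]

def canned_reply(sentence):
    # Pure pattern matching: the program only does what it was told to do.
    words = sentence.lower().strip(".!?").split()
    for keyword, template in RULES:
        if keyword in words:
            # Naive guess: the noun is the word just before the keyword.
            noun = words[max(words.index(keyword) - 1, 0)]
            return template.format(noun)
    return "Tell me more."

print(canned_reply("My cat died last week"))  # I am saddened by the death of your cat.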

I really don’t know how to argue with people who believe computers could be conscious. About all I can say is: what about typewriters? Typewriters also do exactly what they are told to do, and have produced some magnificent works of literature. Do you believe that typewriters can also be conscious?

And if you don’t believe that intelligent engineers could ever cause machines to attain consciousness, how can you believe that random mutations could accomplish this?

Comments
There are several errors packed in here. First, we’re perfectly capable of knowing the universe without using thought.
Ummm, Barry, “knowing” requires thought.
It does? Interesting. My cats know many things about the universe, but I didn't think you'd be comfortable ascribing "thoughts" to them (as the capacity for thought would certainly imply consciousness, yes?). So I can go either way. Either my cats manage to know the universe without thoughts, and the above is correct; or my cats possess some level of consciousness and use thoughts to know the universe. Your choice.
There is no “choice” as I explained in my last comment to you.
I'm not finding anyone who agrees with you, and several people who disagree, so for the moment I'm going to provisionally assume that choice exists. For example:
Of course, "the activity o mathematics is not just randomly writing down formal proofs for random theorems", because "the choices of axioms, of problems, of research directions, are influenced by a variety of considerations --- practical, artistic, mystical"...
Carlo Cellucci, "Introduction to Filosofia e matematica", in Reuben Hersh's _18 Unconventional Essays on the Nature of Mathematics_. (emphasis added). (I think you'd particularly enjoy the first essay in that collection. Might give you some ammunition.)
The views often expressed on this subject by methodologists may be termed the aestheticism or utilitarianism in constructing axiom systems. .... According to the mentioned views the choice of axioms is guided by a personal predilection or by the objective to make the derivation of new theorems easy. What set of axioms is chosen and what is their intuitive content is of no importance, all that matters is the set of all theorems derivable on their basis. .... [However], it is obvious from the history of mathematics that in many branches axioms are chosen on account of their intuitive content, e.g., on account of their generality.
A. Grzegorczyk, "On the validation of the sets of axioms in mathematical theories", Studia Logica Volume 13, Number 1, 202, 1969. (emphasis added)
Abstract: I discuss criteria for the choice of axioms to be added to ZFC, introducing the criterion of stability. Then I examine a number of popular axioms in light of this criterion and propose some new axioms.
Sy-David Friedman, "Stable Axioms of Set Theory", Trends in Mathematics, 2006, Part 2, 275-283. (emphasis added). You may, of course, propose a rule for your own work that the only valid axioms are those that are Universal. As I've mentioned a few times now, there's no rule preventing you from doing so. If you want to apply this rule to everyone else, though, I think you're going to have to make a mathematical argument before you'll be taken seriously.
He was a professor of philosophy at Oxford by the age of 26.
Really? I may have been led astray by wikipedia then. His entry there has:
Lewis then taught as a fellow of Magdalen College, Oxford, for nearly thirty years, from 1925 to 1954, and later was the first Professor of Medieval and Renaissance English at the University of Cambridge and a fellow of Magdalene College, Cambridge.
A quick google search turns up:
C.S. Lewis received his First Class degree in Classical Moderations from University College, Oxford University, in 1920. In 1922, he received another First Class degree in Literae Humaniores, and yet another in English Language and Literature in 1923.
I believe that's considered a humanities degree. If you have a cite to the contrary let me know and I'll update wikipedia.
I’d like you to explain without using explanation (which presupposes logic) how we are capable of understanding the universe without using thought.
When we observe this behavior in others, we usually classify it as instinct.BarryR
September 11, 2010, 07:56 PM PDT
Comment 153 is addressed to BarryR, not to BarryA.StephenB
September 11, 2010, 04:35 PM PDT
---BarryA: "You can claim that no other arithmetic except that which you learned in grammar school exists, but since I have firsthand experience with other arithmetics I don’t think I’m going to believe you." Having studied higher math myself in a formal setting where bluffing doesn't work, I am not buying your argument from authority. I remember well enough the law of exponents and many other laws which admit of no exceptions. Indeed, I once studied the my "identities" over a Christmas vacation. I know well enough the relationship between a philosophical syllogism and the laws of geomtery because I have studied philosophy in a formal setting as well. I also remember the pre-calculus laws of sines and cosines. In keeping with that point, I gather that you have never heard of the law of limits with respect to calculus. Indeed, given your misguided notion that the laws of higher math contradict the laws of arithmetic (except for those times that you are claiming there are no laws at all) I am beginning to suspect that you are not even aware of the lower levels of association and accumulation that rule the arithmetic functions that you so disdain. So why on earth would I want to discuss your irrelevant foray into graph theory, which is little more than a distraction and just one more attempt to avoid your horror of the obvious--math has laws, as does logic.StephenB
September 11, 2010, 04:34 PM PDT
markf, C.S. Lewis taught for nearly 30 years, 1925-1954, at Magdalen College, Oxford, before he was a professor of English at Magdalene College, Cambridge.CannuckianYankee
September 11, 2010, 03:50 PM PDT
RE 150 I think one could maintain that Lewis could have been a professor of philosophy and not a professor of philosophy or both simultaneously. Why not? Vividvividbleau
September 11, 2010, 03:46 PM PDT
Sorry that should read: "temporary post in 1924 as tutor covering for E. F. Carrit"markf
September 11, 2010, 02:21 PM PDT
#146 He was a professor of philosophy at Oxford by the age of 26. I don't think so. I believe the only philosophical position he held was a temporary post as tutor covering for 1924 E. F. Carrit. That lasted less than a year. In fact I don't think he became professor of anything until he was made professor of medieval and renaissance literature at Magdalene College, Cambridge, in 1955 (i.e. aged 57)markf
September 11, 2010, 02:15 PM PDT
StephenB and Clive Shame on the both of you!! Here we have BarryR, a real intellectual who has come down from the mount to lecture us the mere mortals and the both of you have the audacity to question him. Don't you realize the sacrifice he is making by dirtying his intellectual hands by even gracing us with his presence? Sniff sniff. I mean really how rude and disrespectful. I can tell by how he writes that he is very very smart and he has published papers to boot. Besides all the publishing, to really really smart audiences, he has told us how smart he is and how dumb you are. Shoot what papers have you published, did you guys even get out of grammar school? Chesterton? hmmph not even worth refuting. Lewis? minor leaguer sniff sniff. Heck you guys are so dumb you don't even know that 2 apples plus 2 apples don't equal four apples!! Come on now and show some respect will ya!! Vividvividbleau
September 11, 2010, 02:15 PM PDT
---BarryR: "Of course it’s undergirded by logical laws! When I use graph theory, I pick and chose which logical laws form that undergirding!" Now you are saying that you want to choose from among the laws. Earlier, you argued that there ARE NO LAWS from which to choose. ---"Yes, it contradicts your grammar school arithmetic. I’m not using that arithmetic." So, now you are arguing that the laws of higher math contradict the laws of arithmetic? ---"[The graphs} Is this allowed? Is one of them (or both) wrong?" Do you really want me to go into another discussion about the difference between a working definition and a law? First you say that there are no laws. Then you say that there are laws, but you get to choose from among them. Then you say that the graph is undergirded by laws. Then you say that they are undergirded by definitions. Just so you will know, definitions and assumptions are changeable; laws are not.StephenB
September 11, 2010, 01:48 PM PDT
Barry,
There are several errors packed in here. First, we’re perfectly capable of knowing the universe without using thought.
Ummm, Barry, "knowing" requires thought.
Second, while logical thought may not be subjective, the choice as to which logic to use and what facts to operate over remains subjective.
There is no "choice" as I explained in my last comment to you.
Third, logical thought is often irrelevant to the universe
Not sure what you mean here, but the external world is an inferred world; remove logic and you remove understanding and making sense of the external world as a whole.
Now some of this is due to Lewis speaking to a general audience, but some of it also comes from his speaking about science and philosophy without being trained in either.
He was a professor of philosophy at Oxford by the age of 26.
So by the time he reaches the laws of thought are also the laws of things I have several counterexamples at hand. I work with some of the largest computers in the world precisely because the laws of thought do not follow the laws of things; things are far more complicated than our thoughts, which is why we build supercomputers to help us think about them.
Ummm, Barry, we program the computers; without our logic there would be no computers to begin with, nor any programming. Did you not gather anything else from the essay? I mean, there's a lot to digest, I understand that, and surely you're adept at understanding it, given that it was for the "general audience" which you consider yourself above, so surely you understand that the quote,
I asked whether /in general/ human thought could be set aside as irrelevant to the real universe and merely subjective. The answer is that at least one kind of thought — logical thought — cannot be subjective and irrelevant to the real universe: for unless thought is valid we have no reason to believe in the real universe.
is valid, and that "First, we’re perfectly capable of knowing the universe without using thought." is not. I'd like you to explain without using explanation (which presupposes logic) how we are capable of understanding the universe without using thought. How are we able to know without thinking? And how would you explain this without logic or thought?Clive Hayden
September 11, 2010, 11:38 AM PDT
StephenB@141
I said I know enough about [graph theory] to know that it doesn’t invalidate the principle that mathematics is undergirded by logical laws,
Of course it's undergirded by logical laws! When I use graph theory, I pick and choose which logical laws form that undergirding! From Bondy and Murty's _Graph Theory_:
A graph G is an ordered pair (V(G), E(G)) consisting of a set V(G) of vertices and a set E(G), disjoint from V(G), of edges, together with an incidence function \phi(G) that associates with each edge of G an unordered pair of (not necessarily distinct) vertices of G.
We now have (part of) the undergirding necessary to start reasoning about graphs. This is not the One True Universal definition of graphs. I can modify it to require the unordered pairs of vertices that define the edges be distinct. I can further restrict this to state that the unordered pair must contain two unique vertices. My question to you is: Is this permissible? May I arbitrarily modify the undergirding of a branch of mathematics? Bondy and Murty don't have a problem with this: on the following page they define simple graphs as having no loops or parallel edges, which is the consequence of my changes to the definition. Diestel gives a slightly different definition in his _Graph Theory_ (I'd quote it, but it's set theoretic and I think spelling out the symbols will only lead to more confusion). So now we have two different scary yellow Springer books (with identical titles) giving slightly different undergirding to the idea of graph theory. Is this allowed? Is one of them (or both) wrong? And most important, how do you know?BarryR
September 11, 2010, 12:40 AM PDT
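The two "undergirdings" contrasted in the comment above can be made concrete. Here is a minimal sketch using Python's networkx library; the edge list is a made-up example, not taken from Bondy and Murty or from Diestel:

import networkx as nx

edges = [(1, 2), (1, 2), (3, 3)]  # one parallel edge and one loop

# Undergirding 1: edges may join not-necessarily-distinct vertices, repeats allowed.
M = nx.MultiGraph()
M.add_edges_from(edges)
print(M.number_of_edges())  # 3: the loop and the parallel edge both survive

# Undergirding 2 (simple graph): no parallel edges, and no loops either.
G = nx.Graph()
G.add_edges_from(edges)                          # the parallel edge collapses automatically
G.remove_edges_from(list(nx.selfloop_edges(G)))  # enforce the "no loops" restriction
print(G.number_of_edges())  # 1: only the single edge between vertices 1 and 2 remains

Neither choice is wrong on its own terms; each fixes a different set of rules, and what you can prove downstream depends on which set you fixed.
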
CH@133
I suppose “There are no rules about how you pick (and modify) your rules.” is not supposed to be a rule either, then.
I think of it as an observation. A rule would be formulated as "There *cannot* be any rules about how you pick (and modify) your rules." I'm happy to argue for that single rule if you like. I don't see where it makes much difference, though.BarryR
September 11, 2010, 12:09 AM PDT
StephenB@141
If they map back to the proposition that 1 + 1 = 147, you are following the wrong map.
How do you know this? Yes, it contradicts your grammar school arithmetic. I'm not using that arithmetic. You can show that it's inconsistent with the arithmetic that I am using --- except I constructed my arithmetic so that wouldn't be the case. You can claim that no other arithmetic except that which you learned in grammar school exists, but since I have firsthand experience with other arithmetics I don't think I'm going to believe you. Or you can show me the One True Universal Correct Arithmetic that you believe exists. But you can't. You've set yourself up a formidable task: demonstrate nearly all mathematics from the 17th C. forward as fundamentally flawed, and do so without any appreciable knowledge of what these mathematics are or how they work. Your best shot so far is observing that these mathematics don't map well to apples. I'll grant you that point. What else do you have?BarryR
September 11, 2010, 12:07 AM PDT
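For readers wondering what an "other arithmetic" might look like in practice, here is a minimal Python sketch of one standard family, addition modulo n; the moduli below are arbitrary choices for illustration, and nothing here is claimed to be the arithmetic the commenter had in mind:

def add_mod(a, b, n):
    # Addition in the integers modulo n: internally consistent, but not grammar-school addition.
    return (a + b) % n

print(1 + 1)              # 2  (ordinary integer arithmetic)
print(add_mod(1, 1, 2))   # 0  (in arithmetic modulo 2, 1 + 1 = 0)
print(add_mod(7, 8, 12))  # 3  (clock arithmetic: 7 o'clock plus 8 hours is 3 o'clock)
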
CH@139
Please read De Futilitate from Lewis
He writes better than I remember (and far better than Chesterton). And nice work in coming up with a cite that's so directly on point. I think this falls apart here:
I asked whether /in general/ human thought could be set aside as irrelevant to the real universe and merely subjective. The answer is that at least one kind of thought --- logical thought --- cannot be subjective and irrelevant to the real universe: for unless thought is valid we have no reason to believe in the real universe.
There are several errors packed in here. First, we're perfectly capable of knowing the universe without using thought. Second, while logical thought may not be subjective, the choice as to which logic to use and what facts to operate over remains subjective. Third, logical thought is often irrelevant to the universe --- G. H. Hardy was quite proud of the uselessness of his mathematics. Now some of this is due to Lewis speaking to a general audience, but some of it also comes from his speaking about science and philosophy without being trained in either. So by the time he reaches
the laws of thought are also the laws of things
I have several counterexamples at hand. I work with some of the largest computers in the world precisely because the laws of thought do not follow the laws of things; things are far more complicated than our thoughts, which is why we build supercomputers to help us think about them. To leap from that flawed conclusion to ruling out materialistic explanations of thinking simply isn't justified even if I believed the conclusion. It's easy to convince yourself you're making sense when you're making arguments like this. It's harder to convince three anonymous reviewers. Unfortunately, I don't think this work was ever subjected to peer review.BarryR
September 10, 2010, 11:59 PM PDT
---BarryR: "This is how grammar school arithmetic is taught, and to the best of my knowledge, that’s all the math you understand. And that’s fine — we can’t all be mathematicians." You are the one who said that 1+1 does not necessarily = 2, so I thought it would be wise to bring you back down to earth with an obvious example that refutes your misguided notion. ---"But I had hoped your education would have gotten you to the point where you could grok at mathematics can be divorced from apples and studied in its own right. Once you’ve made that leap, then you have an abstract structure that can be used to derive other abstract structures, some of which are even useful." All this is fine, but irrelevant. The best way to convince people that you are educated is to make relevant and cogent arguments. Granted, simplicity that has not passed through complexity is worthless. On the other hand, reveling in complexity in order to evade basic issues smacks of timidity. The trick is to find simplicity on the other side of complexity. If there is one thing that I have learned in dealing with academic elitists, it is this: Anyone who truly understands his subject can, when called upon, explain it so that a twelve year old could understand it. Everyone else is bluffing. ---"But there’s no guarantee that they’ll map back onto apples." If they map back to the proposition that 1 + 1 = 147, you are following the wrong map. ---You say you know graph theory. Great. What are the universal laws of graph theory and how do you know they’re universal? I’ll bring home my Scary Yellow Springer Book o’ Graph Theory tonight and we can compare notes." I said I know enough about it to know that it doesn't invalidate the principle that mathematics is undergirded by logical laws, just I know enough about other approaches to mathematics to know the same thing. We don't need a discussion about Springer's Graph theory, nor do we need a discussion about your notions about symbols. None of these points have anything to do with the basic issue, which is this: The flexibility that is permitted for granting working assumptions does not translate into a flexibility for ignoring logical laws or first principles.StephenB
September 10, 2010, 06:26 PM PDT
CH@139 Well, there's a simple way to settle this: show me your universal mathematics. We know it guarantees that 2+2 always equals 4. What else does it do? I'm quite the mercenary when it comes to mathematics. Show me a more powerful system and I'll drop what I'm currently using and adopt it. So, I'm ready and willing to convert. All you have to do is tell me what I'm converting to.BarryR
September 10, 2010, 05:29 PM PDT
BarryR,
As best I can understand what you’ve been telling me, there is a Correct, Universal logic. Which one is it, and how did you arrive at that conclusion? (If you say “Chesterton”, you will be asked some very pointed questions that you don’t have the technical skill to answer.)
This is known as a false dilemma: if it is logical, then there are no "different ones" to be compared, for no logic could be used to compare them. If there were no, as you said, Correct, Universal logic, then there would be no comparison between anything that might be gathered together and called First Order, Second Order and so on. You couldn't even say that they themselves were logical or not, only conventional, to be changed at will, and the entire force of your argument, indeed of all arguments, would be broken into nothing. You would just be talking confusedly. You would just be playing with counters. I think you desperately need Chesterton and C.S. Lewis. Please read De Futilitate by Lewis; you can google it. But you seem to be trying to actually make a point, a logical point, I'm assuming, using reason, I'm assuming, which cannot itself deny logic, or else you have no point to make. All argument presupposes it, even yours.Clive Hayden
September 10, 2010, 03:11 PM PDT
CH@134
Are you implying that you can make new rules of logic by the analogy of axioms? Surely I can do that too, then.
You may be starting to get the hang of this. Yes, you can make new rules of logic, or entirely new logics. Among the ones currently being studied are First Order Logic (aka first-order predicate calculus), Second Order Logic, and Infinitary logic. It's not usually used to study logic, but I'd add lambda calculus as well. Each of these broad categories is divided up into several specializations, each with their own "rules of logic". Each of these logics is perfectly consistent. Some are more expressive than others, some are more powerful than others. As best I can understand what you've been telling me, there is a Correct, Universal logic. Which one is it, and how did you arrive at that conclusion? (If you say "Chesterton", you will be asked some very pointed questions that you don't have the technical skill to answer.)BarryR
September 10, 2010, 02:53 PM PDT
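As a small taste of the lambda calculus mentioned in that list, here is a sketch of Church numerals in Python, a standard way of rebuilding numbers and addition inside that system; it is an illustration only and is not drawn from any commenter's own work:

zero = lambda f: lambda x: x                                   # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))                # apply f one more time
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))    # apply f (m + n) times

def to_int(church):
    # Decode a Church numeral by counting how many times it applies an increment.
    return church(lambda k: k + 1)(0)

two = succ(succ(zero))
print(to_int(add(two)(two)))  # 4: even in this very different system, 2 + 2 = 4 once "+" is defined
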
StephenB@135
If I give you one apple followed by another apple, you will have two apples. You will not have 147 apples.
If we all decided that valid mathematical operations shall be limited to those that can be carried out on apples, you might have a point. This is how grammar school arithmetic is taught, and to the best of my knowledge, that's all the math you understand. And that's fine --- we can't all be mathematicians. But I had hoped your education would have gotten you to the point where you could grok that mathematics can be divorced from apples and studied in its own right. Once you've made that leap, then you have an abstract structure that can be used to derive other abstract structures, some of which are even useful. But there's no guarantee that they'll map back onto apples. You say you know graph theory. Great. What are the universal laws of graph theory and how do you know they're universal? I'll bring home my Scary Yellow Springer Book o' Graph Theory tonight and we can compare notes.BarryR
September 10, 2010, 02:39 PM PDT
---BarryR: "In grammar school this is called “rounding”, and if you can think back that far, you probably had to memorize that 0.5 was to be rounded up, not down, and that this was a completely arbitrary convention." So you really think that flexibility in mathematical expressions is a good example of how mathematics has no inflexible laws. I gather you also believe that if I round off Pi from 3.14159 to 3.14, the relationship between a circle's diameter and its radius is also up for grabs. You should not be sneering at Chesterton. On the contrary, you desperately need to absorb his message.StephenB
September 10, 2010, 02:27 PM PDT
---BarryR: "In a paper I published last year, I had words to the effect of “Define graph G as a directed, acyclic, weighted graph…”." ---"(Oh, wait — you don’t know graph theory. Don’t worry, it’s pretty easy. Have a look here. It won’t hurt. I promise.)" I know enough about graph theory to know that appealling to it will not help your case. ---"There are no universal laws or rules that required I solve my problem using this particular kind of graph." You are confusing operating assumptions and methods, which are flexible, with the laws of logical and the related laws of mathematics, which are not. Mathematical principles are related to logic principles, which explains why you disavow the rules not for one, but for both. Thus, you reject mathematical laws, the law of non-contradiction, and the law of causality all in one sweep. That is no coincidence. On the matter or relative rules, researchers can posit anything they like for the sake of argument, or they can even do if formally in the form of a hypothesis. They are also free to choose their methods, unless, of course, they are ID scientists who are forbidden by your secularist comrades to do so under the tyrannical rule of methodological naturalism*. *(No doubt you corrected your comrades and re-educated them to the point where they now understand that since the scientist is the only one who knows what problem he is trying to solve, only he/she can choose the appropriate method. [Just a little humor there. I have no doubt that, in spite of your protests on behalf of the INAPPROPRIATE freedom to abandon causality, you would refuse ID the APPROPRIATE freedom to choose its own methods.]) We are not free, however, to abandon the law of causality or the law of non-contradiction which informs all intelligently conceived arguments, whether they be philosophical, mathematical, or scientific. ---"There are no rules about how you pick (and modify) your rules. You’re free to believe otherwise, but I don’t think you’re going to be able to do so while having a successful research career." We are not discussing how you pick YOUR operating assumptions or axioms, we are discussing THE transcendental rules that cannot be chosen. They must, for the sake of rationality, be apprehended, understood, and honored. If I give you one apple followed by another apple, you will have two apples. You will not have 147 apples. If an apple exists, it cannot also not exist at the same time, nor can it come into existence without a cause. These are a few, not all, of the priciples of right reason, and no matter how many graphs you draw or how many letters you scramble, the rational conclusions are inescapable.StephenB
September 10, 2010, 01:05 PM PDT
BarryR,
There are no universal laws or rules that required I solve my problem using this particular kind of graph.
So what?
I can also add new rules, in this case mapping the graph onto the Cartesian plane.
A good middle-school geometry class should have been enough to clue people in that axioms are tools and we’re free to pick our tools — the constraints come with how those tools are able to interact.
Are you implying that you can make new rules of logic by the analogy of axioms? Surely I can do that too, then. I'd like to know how big yellow is. And I'd like to compare that to how far London Bridge is from Christmas Day. If logic can be so manipulated, depending on whatever axioms I choose, or choose to invent, this should be no problem, right? I define that colors are certain sizes, then compare that size to the axiom I invent for distance between a place and a date. My answer to this problem is "cell phone." Are you saying I'm wrong or right? Does it depend on what axioms I choose? No, because it's nonsense. But it's only nonsense if you have sense to begin with. You will not know that a line is crooked unless you have some idea of a straight line. But maybe I can choose my axioms which will make choosing axioms impossible and possible at the same time. This shouldn't be a problem, because there is no such thing as logic, right? By the way, I'd still love to see your response to the problem of induction as so eloquently categorized by Chesterton. This time without the ad hominem.Clive Hayden
September 10, 2010, 12:49 PM PDT
BarryR,
There are no rules about how you pick (and modify) your rules. You’re free to believe otherwise, but I don’t think you’re going to be able to do so while having a successful research career.
I suppose "There are no rules about how you pick (and modify) your rules." is not supposed to be a rule either, then. Your entire comment amounts to the trivial conclusion that different rules apply depending on what you want to build, such as building a house made of glass instead of wood would have different methodologies. Of which, the house made of glass, you of all people shouldn't build. That would be my first rule. But choosing how you want to build anything, what sorts of materials to use, how large, how tall, etc., does not equate to suspending laws of reason and logic or mathematics. Your comment, quite frankly, changes the subject from real laws to conventions---which, the latter, can be altered for your own purposes. And I find it notable that you use the word "embarrassing" so often.Clive Hayden
September 10, 2010, 12:12 PM PDT
BarryR, I should like to impress upon you the fact that transcendent 'logical' information, which is completely separate from matter and energy, runs the show as far as the operation of this universe is concerned, and that this fact is empirically verifiable by work that has been accomplished in quantum mechanics. Yet after reading your snobby condescending posts towards others, I think that the only thing that would be sure to impress you would be whenever you looked at yourself in the mirror. Thus I will refrain from wasting my time.bornagain77
September 10, 2010, 11:41 AM PDT
BarryR,
I’ve been resisting a crack about how Dembski’s math only makes sense if you’re a mathematical illiterate, but I don’t know if I can hold out much longer….)
Go ahead, I'd like to ban you for your jeering. And quite frankly, discussing "rounding up" doesn't give me much confidence that you really understand his mathematics. John Lennox, you know who I mean, the Oxford mathematician, endorses Dembski's mathematics. But BarryR, the one who talked about how whether one is inside or outside of school walls determines mathematical truths, and about banks rounding up, doesn't agree with Dembski's mathematics. Pardon me if I'm not impressed. Clive Hayden
September 10, 2010, 10:21 AM PDT
"Borges, Orwell and Shaw"? Somebody please pick me up off the floor.allanius
September 10, 2010, 09:57 AM PDT
StephenB@118
By arguing that logic and mathematics have no rules...
In a paper I published last year, I had words to the effect of "Define graph G as a directed, acyclic, weighted graph...". (Oh, wait --- you don't know graph theory. Don't worry, it's pretty easy. Have a look here. It won't hurt. I promise.) Now that you have an idea what those words mean.... There are no universal laws or rules that required I solve my problem using this particular kind of graph. Once I've identified the kind of graph I'm going to be using, my readers (including my reviewers) will have an expectation of the rules I'm going to follow. For example, acyclic graphs should really be free of cycles. There are a number of useful operations you can perform on acyclic graphs that only work in the absence of cycles (such as determining the critical path). If I want to break one of these rules, that's perfectly fine (although it's best that I do so explicitly). I might say "As the cycles in this graph represent a finite number of program iterations, we may trivially transform this graph to one free of cycles." My reviewer will read that and decide whether she believes it or not, and if she doesn't it's unlikely that the paper will be published. I can also add new rules, in this case mapping the graph onto the Cartesian plane. Once I've gone through establishing exactly what the rules are that I'll be using, I can make an argument that under these rules my conclusion becomes inevitable. I then need to convince my reviewer that this particular set of rules is a reasonable model for the real-world process I'm interested in, that it's novel, is something more than an incremental contribution, etc. If I've done all that, then I can have a reasonable expectation that the paper will be published. Another example: clock cycles in a computer are discrete --- as far as the computer can tell, there's no such thing as half a clock cycle. Let's say this is a universal rule. I published a paper in 2007 where I treated clock cycles as continuous, not discrete. Because I was free to disregard this universal rule, I was able to solve problems several orders of magnitude larger than what had been done using the discrete approach, and the error I introduced to do so was negligible enough not to be worth measuring. There are no rules about how you pick (and modify) your rules. You're free to believe otherwise, but I don't think you're going to be able to do so while having a successful research career.BarryR
September 10, 2010, 09:29 AM PDT
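The "directed, acyclic, weighted graph" and the critical path mentioned in that comment can be sketched in a few lines with Python's networkx library; the little task graph below is hypothetical, and "critical path" is read here as the longest weighted path, its usual meaning in scheduling:

import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("start", "a", 3),
    ("start", "b", 2),
    ("a", "end", 4),
    ("b", "end", 1),
])

# The "acyclic" rule we chose to follow; the longest-path routine relies on it.
assert nx.is_directed_acyclic_graph(G)

print(nx.dag_longest_path(G, weight="weight"))         # ['start', 'a', 'end']
print(nx.dag_longest_path_length(G, weight="weight"))  # 7
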
Hold out? What an interesting position to explain yourself. Here all this time I thought you had been obvious. Rather like a ladle as opposed to a spoon.Upright BiPed
September 10, 2010, 09:07 AM PDT
BarryR,
So yes, you are correct that, if you limit yourself to grammar-school arithmetic, 2+2 can only equal 4. You’ll have to take it on faith that beyond the walls of grammar schools this is not considered a universal law.
So it is a universal law in school, but not one outside of school? What does a building have to do with anything? Either 2+2=4 or it doesn't, no matter where it is found, inside or outside of a building.
Inference is just another tool, like logic, math, and experiment. Inferences are often wrong. Logic is often irrelevant, math is often intractable, and don’t get me started on experiments.
Do you have a total skepticism at bottom with regard to the cogency of these tools? I mean, surely, you're sophisticated enough to see that there is no hope for a total skepticism, that in the last regard, inference using the laws of logic must be absolute.
This is philosophy for people who won’t read philosophy, written by someone who didn’t read philosophy either. As rhetoric, this is good enough to be notable, but compared to contemporaries like Borges, Orwell and Shaw it’s frankly quite shallow.
When do you reckon you'll start making an argument outside of ad hominem? I'm not interested in talking about the man or the style in which he wrote, but rather the argument in The Ethics of Elfland about real laws, such as laws of reason and math, and weird repetitions in nature we call laws for shorthand.
For example:
Thus when Mr. H. G. Wells says (as he did somewhere), “All chairs are quite different”, he utters not merely a misstatement, but a contradiction in terms. If all chairs were quite different, you could not call them “all chairs.”
As wit, this works. As argument, nothing further need be said to dismiss it.
That must be why you didn't dismiss it with argument... When Chesterton says:
Other vague modern people take refuge in material metaphors; in fact, this is the chief mark of vague modern people. Not daring to define their doctrine of what is good, they use physical figures of speech without stint or shame, and, what is worst of all, seem to think these cheap analogies are exquisitely spiritual and superior to the old morality. Thus they think it intellectual to talk about things being "high." It is at least the reverse of intellectual; it is a mere phrase from a steeple or a weathercock. "Tommy was a good boy" is a pure philosophical statement, worthy of Plato or Aquinas.
He is exactly spot on. As wit this works, as argument, this works. Wit is argument. These elementary truths do not cease being truths because they are elementary. I notice a contempt for basic truths in your comments. As if being on a top floor means you don't need the foundation or the first floors. You cannot suspend yourself. If fundamentals go, all else falls. No matter to what degree of complication mathematics may reach, if the concept of numbers and the multiplication table go, all else will be in ruins. You have to have something as a basis or you will not continue in progress. Progress itself means that there is a basis unchanged. If an acorn grows into a beechwood it isn't progress, but mere change. The same is true for mathematics and moral philosophy and logic. If Logic and Inference are wrong, you can only discern this by more or better inference and logic. To scrap them won't get you any closer to any truth, because you won't be going anywhere. Indeed these are both presupposed in even determining that either of them was used in a wrong way by a faulty and tired human brain. But this can never be an indictment against inference and logic itself without being self referentially incoherent.Clive Hayden
September 10, 2010, 09:05 AM PDT
markf@122 Exactly, and you also run into problems when trying to map a potentially infinite number of discrete objects onto a finite amount of silicon. #119 thru #121 The banking example also works. If you go to close out your account and you have $1.00 of principal and $0.002 of interest, how much money does the teller hand you? Depending on how nice the bank is, this will either be: 1.00 + 0.002 = 1.00 or 1.00 + 0.002 = 1.01 OH NOES!!!!! TEH UNIVERSIALITIFULLNESS OF ADDING STUFF HAS BEN VIOLATED!!!!11!!! In grammar school this is called "rounding", and if you can think back that far, you probably had to memorize that 0.5 was to be rounded up, not down, and that this was a completely arbitrary convention. An equivalent way of stating this is to define "+" in such a way as no rounding after the fact is necessary. For the tellers counting out change, the former is easier. For the people who are writing the bank's software, you have to deal with the latter whether you want to or not. (I'm fascinated --- in a horrified kind of way --- that you'd define mathematics as "that which is appropriate to use in a bank" and still not get it right. I've been resisting a crack about how Dembski's math only makes sense if you're a mathematical illiterate, but I don't know if I can hold out much longer....)BarryR
September 10, 2010, 08:58 AM PDT
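The banking example above can be made explicit with Python's decimal module; the amounts are the ones from the comment, and the two rounding modes stand in for the stingy and generous conventions a bank might pick:

from decimal import Decimal, ROUND_DOWN, ROUND_UP, ROUND_HALF_UP

total = Decimal("1.00") + Decimal("0.002")  # exactly 1.002 before any rounding

print(total.quantize(Decimal("0.01"), rounding=ROUND_DOWN))  # 1.00 (truncate, in the bank's favor)
print(total.quantize(Decimal("0.01"), rounding=ROUND_UP))    # 1.01 (round up, in the customer's favor)

# The grammar-school "0.5 rounds up" rule is itself just one selectable convention:
print(Decimal("0.005").quantize(Decimal("0.01"), rounding=ROUND_HALF_UP))  # 0.01
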