Uncommon Descent Serving The Intelligent Design Community

Human Consciousness


(From In the Beginning … ):

For the layman, it is the last step in evolution that is the most difficult to explain. You may be able to convince him that natural selection can explain the appearance of complicated robots, who walk the Earth and write books and build computers, but you will have a harder time convincing him that a mechanical process such as natural selection could cause those robots to become conscious. Human consciousness is in fact the biggest problem of all for Darwinism, but it is hard to say anything “scientific” about consciousness, since we don’t really know what it is, so it is also perhaps the least discussed.

Nevertheless, one way to appreciate the problem it poses for Darwinism or any other mechanical theory of evolution is to ask the question: is it possible that computers will someday experience consciousness? If you believe that a mechanical process such as natural selection could have produced consciousness once, it seems you can’t say it could never happen again, and it might happen faster now, with intelligent designers helping this time. In fact, most Darwinists probably do believe it could and will happen—not because they have a higher opinion of computers than I do: everyone knows that in their most impressive displays of “intelligence,” computers are just doing exactly what they are told to do, nothing more or less. They believe it will happen because they have a lower opinion of humans: they simply dumb down the definition of consciousness, and say that if a computer can pass a “Turing test,” and fool a human at the keyboard in the next room into thinking he is chatting with another human, then the computer has to be considered to be intelligent, or conscious. With the right software, my laptop may already be able to pass a Turing test, and convince me that I am Instant Messaging another human. If I type in “My cat died last week” and the computer responds “I am saddened by the death of your cat,” I’m pretty gullible, that might convince me that I’m talking to another human. But if I look at the software, I might find something like this:

if (verb == 'died')
    fprintf(1, 'I am saddened by the death of your %s', noun)
end

I’m pretty sure there is more to human consciousness than this, and even if my laptop answers all my questions intelligently, I will still doubt there is “someone” inside my Intel processor who experiences the same consciousness that I do, and who is really saddened by the death of my cat, though I admit I can’t prove that there isn’t.
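To make the point concrete, here is a minimal sketch, in Python rather than the MATLAB-style snippet above, of the kind of keyword-matching program being described. The response table and the `reply` function are invented for this illustration; the point is only that such a program maps keywords to canned templates with no understanding behind them.

```python
# A toy "chatbot" in the spirit of the snippet above: it matches a
# keyword against a table of canned templates, with no understanding.
# (Illustrative sketch; the pattern table is invented for this example.)
RESPONSES = {
    "died": "I am saddened by the death of your {noun}.",
    "won":  "Congratulations on your {noun}!",
}

def reply(sentence: str) -> str:
    words = sentence.lower().rstrip(".!?").split()
    for i, word in enumerate(words):
        if word in RESPONSES:
            # Crudely guess the "noun" as the word before the keyword.
            noun = words[i - 1] if i > 0 else "news"
            return RESPONSES[word].format(noun=noun)
    return "Tell me more."

print(reply("My cat died last week"))  # I am saddened by the death of your cat.
```

Anything outside the table falls through to a generic "Tell me more." The program passes a casual glance at the keyboard, yet every response is traceable to a line of the table.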

I really don’t know how to argue with people who believe computers could be conscious. About all I can say is: what about typewriters? Typewriters also do exactly what they are told to do, and have produced some magnificent works of literature. Do you believe that typewriters can also be conscious?

And if you don’t believe that intelligent engineers could ever cause machines to attain consciousness, how can you believe that random mutations could accomplish this?

Comments
The following is taken from an email exchange with Stephen Redburn (Cantab/St. Edmund's). I wrote him asking for clarification on a few Oxbridge differences that continue to baffle me:

Lecturer: Employed by a Faculty rather than a college in Oxbridge. Usually needs a PhD but some of the old types when I was up 20 years ago still regarded it as a beastly German invention and just had MAs. I don't think you'd get a job without a PhD now though. Usually a member of a College, but doesn't have to be, in Cambridge. (They say it is rather lonely to be in Cambridge and not be in a college.) Expected to lecture and publish papers.

Fellow: On the governing board of a College (or Professional Institute like the Institute of Civil Engineers etc. outside academic life). Divided into levels. The top ones are permanent and have full voting rights and attend the management meetings, but I am sure that at St. Ed's they all attended the monthly meeting.

Professor: In the UK it is the head of the department or equivalent. The old UK system had only one professor but commonly now they have several in a large department. If the job was funded by the Royal Family who put money into a pot to pay their salary it is called a Regius professor. Regius Professorships are "Royal" Professorships at the universities of Cambridge, Oxford, Dublin, Glasgow, Aberdeen and Edinburgh and each appointment save those at Dublin is approved by the Crown.
http://gmmalliet.weebly.com/oxbridge.html
Petrushka
September 14, 2010 at 08:31 AM PDT
I find it a bit odd, but the official Oxford Glossary doesn't spend any time distinguishing among Don, Tutor, Reader, Professor, Fellow, Lecturer. His field seems to have been English rather than Philosophy, except for six months when he was a substitute.
Petrushka
September 14, 2010 at 08:27 AM PDT
http://www.ox.ac.uk/about_the_university/introducing_oxford/oxford_glossary/index.html
Petrushka
September 14, 2010 at 08:22 AM PDT
#211 Clive

This is a rather trivial debate but I am not busy at the moment. I am going to come straight out and say I don't believe you. I see you have changed from "tutor" to "fellow" (the two are different; Lewis was a tutor in philosophy at University College for a year, but not a fellow, then became both fellow and tutor in English at Magdalen), but let's go with fellow. Nowadays a fellowship at Oxford certainly doesn't require you to be a professor. I very much doubt the rules were different in the 1920s. In addition, I cannot find any biography of CS Lewis that mentions him being a professor until 1955. Many of them use terms such as "important teaching post". If he had been a professor surely they would have said so? If you can provide a reference to prove your point, please do and I will retire shamefaced from this trivial debate.
markf
September 14, 2010 at 05:10 AM PDT
markf,
Sorry but that is not true (I speak as a Cambridge graduate, which uses the same tutorial system). A tutor is not a formal position. It just means someone who gives tutorials. It can be of any academic rank. Mine were mostly graduates taking their PhD. CS Lewis did not get a professorship until he was 57 and that was not in Philosophy.
Sorry, but that is wrong. Being a Fellow of Magdalen College, Oxford, which Lewis was for 29 years, was a professorship when Lewis had it. I'm not interested in how it's done today at Cambridge.
Clive Hayden
September 14, 2010 at 04:35 AM PDT
Barry,
Yep. It lasted a year. He never went back to it. Never took a graduate degree in the field, never published any original research in the field, never held a permanent position in the field and never was elected to a professorship in the field. Based on the above, you want to label him a professor of philosophy. You’re free to do so, but you’re not free to have other people take you seriously.
You said, initially, that he didn't have training in philosophy; now it's that he wasn't a professor of philosophy long enough to suit you. Odd that you move the goal posts... considering that Peter Kreeft, Ph.D., who is a professor of philosophy at Boston College, Victor Reppert, Ph.D., an American philosopher, Gary Habermas, Ph.D., professor of philosophy at Liberty, Dr. William Dembski, who has two Ph.D.s, and Anthony Matteo, Ph.D., professor of philosophy at Elizabethtown College, just off the top of my head, consider his works important in philosophy. That's peer review. Indeed, all of these men have endorsed certain portions of Lewis's philosophy. But BarryR, who almost got a minor in philosophy, doesn't think Lewis was a philosopher. You're right about one thing: Lewis didn't want to be a professional philosopher, but that doesn't mean he wasn't trained in it. Now we've spent too many comments with me just addressing your ad hominem instead of why Lewis's argument in De Futilitate was wrong. I've yet to see an actual argument.
If you feel I’m being uncivil, then tell me. If I’ve offended you, then tell me. You just might get an apology.
Yes you have been, very jeering and condescending. So... I'm still waiting as to why Chesterton was wrong and why Lewis was wrong. You can start any time.
Clive Hayden
September 14, 2010 at 04:24 AM PDT
Stephenb #194 (cont)

Final comment for this morning... (I must be mad)

Barry has only made the claim, he has not provided any support for that claim.

Glancing through the comments above I see the following "laws" which Barry has shown can be false given a change of axioms:

#103: 2+2=4; a+b+c = c+b+a; it is not possible to divide by zero
#126: a+b gives a unique result
#157: Law of exponents
#166: The law of limits with respect to calculus

No doubt you will disagree with his proof in each case, but you can hardly say he has not provided any support. In some cases he has written extensively and provided references.

On what axioms do the laws of association and accumulation depend? Please be specific, name the axiom or axioms on which they depend and explain the relationship between the two. ... That should read [commutative] and associative laws.

This is a bit confusing: as far as I know, "commutative" and "associative" are properties which functions may or may not have. A law would have to be something like (A) "the addition and multiplication functions on real numbers are commutative and associative". Assuming that is what you meant, then these laws are dependent on the axioms of the arithmetic of the natural numbers. I am not a mathematician so I may be wrong about the detail, but I believe the appropriate axioms are the Peano axioms with the addition of the definition of the addition and multiplication functions. It needs a better mathematician than me to prove exactly that (A) follows from these axioms; perhaps BarryR can oblige.
markf
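markf's suggestion that commutativity is a theorem derived from Peano-style axioms, rather than an axiom itself, can in fact be checked mechanically. The following is a minimal sketch in Lean 4, assuming only Lean's built-in `Nat`, whose addition is defined by the usual Peano-style recursion (the standard library already proves this fact as `Nat.add_comm`):

```lean
-- Commutativity of addition on the natural numbers is not an axiom;
-- it is proved by induction from the recursive (Peano-style)
-- definition of Nat.add.
theorem add_comm' (a b : Nat) : a + b = b + a := by
  induction b with
  | zero => rw [Nat.add_zero, Nat.zero_add]
  | succ n ih => rw [Nat.add_succ, ih, Nat.succ_add]
```

The base case rewrites both `a + 0` and `0 + a` to `a`; the inductive step pushes the successor outside the sum on each side and applies the induction hypothesis.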
September 14, 2010 at 03:35 AM PDT
Onlookers: It is often quite revealing to look back at the original post, when a thread goes ever so determinedly off the rails. For, that is telling us just what it is that some of those who come here to comment are ever so desperate not to discuss; nor are they willing to allow others to discuss in peace. Here is Prof Sewell, as excerpted from this thread's original post: ___________________ >> For the layman, it is the last step in evolution that is the most difficult to explain. You may be able to convince him that natural selection can explain the appearance of complicated robots, who walk the Earth and write books and build computers, but you will have a harder time convincing him that a mechanical process such as natural selection could cause those robots to become conscious. Human consciousness is in fact the biggest problem of all for Darwinism, but it is hard to say anything “scientific” about consciousness, since we don’t really know what it is, so it is also perhaps the least discussed. Nevertheless, one way to appreciate the problem it poses for Darwinism or any other mechanical theory of evolution is to ask the question: is it possible that computers will someday experience consciousness? If you believe that a mechanical process such as natural selection could have produced consciousness once, it seems you can’t say it could never happen again, and it might happen faster now, with intelligent designers helping this time. In fact, most Darwinists probably do believe it could and will happen—not because they have a higher opinion of computers than I do: everyone knows that in their most impressive displays of “intelligence,” computers are just doing exactly what they are told to do, nothing more or less. 
They believe it will happen because they have a lower opinion of humans: they simply dumb down the definition of consciousness, and say that if a computer can pass a “Turing test,” and fool a human at the keyboard in the next room into thinking he is chatting with another human, then the computer has to be considered to be intelligent, or conscious . . . >> ____________________ A very good issue and challenge: our most direct and central experience is of being unified selves, aware, thinking, perceiving, having a sense of conscience, etc. Where does that come from, and can we have a credible evolutionary materialistic account of its origin and character? It should come as no surprise to learn that the evolutionary materialist paradigm has no good answer for this. In wiki's -- testimony against known interest -- summary:
The term [hard problem of consciousness] . . . refers to the difficult problem of explaining why we have qualitative phenomenal experiences. [I.e. we experience consciously aware life from inside; e.g. we experience the rich redness of a nice dress on a lovely lady] It is contrasted with the "easy problems" of explaining the ability to discriminate, integrate information, report mental states, focus attention, etc. Easy problems are easy because all that is required for their solution is to specify a mechanism that can perform the function. That is, their proposed solutions, regardless of how complex or poorly understood they may be, can be entirely consistent with the modern materialistic conception of natural phenomena. Hard problems are distinct from this set because they "persist even when the performance of all the relevant functions is explained."
Translating: we cannot simply reduce it on evo mat assumptions, and so it is "hard." (The idea that just perhaps, the real answer is that -- echoing Plato -- Soul is older than what is hard, heavy, hot or cold, dry or wet, is simply put out of mind.) Ilion, in no 5, is apt:
A huge part of the problem (perhaps the entirety of it?) is that trying to “say exactly what consciousness is” generally involves mechanistic/materialistic eliminative reductionism … which is to say, “explaining” consciousness by explaining it away. Consciousness is itself; it’s not made out of something else.
In other words, we are again meeting the a priori materialism that ever so often presents itself as THE "scientific" view, never mind that the very materialists themselves directly experience what it is to be ensouled creatures, and never mind that when they try to explain mind on such premises, they end up resorting to chance and mechanical necessity as the only relevant causal factors, thus running into any number of absurdities: reducing ground-consequent to physical causal chains and derived accidents of culture and personal life history ever so soon becomes plainly self-referentially absurd. In this context, distractive tangents are unsurprising. (And the attempt to dismiss Lewis' argument by undermining the man instead of addressing the issues on the merits, or to dismiss the self-evident truth that 2 + 2 = 4 by injecting a self-contradictory equivocation in the meaning of "+", are all too aptly illustrative of the problem.) So far, the evidence is on the side of mind being its own self, and the cause of the physical cosmos we, as minded creatures, observe. GEM of TKI
kairosfocus
September 14, 2010 at 03:11 AM PDT
BarryR #198 Likewise pleased to meet you. I am not sure how I could link to you on usenet or Google Groups given the only thing I know about you is your Wordpress ID. My e-mail is: mark(dot)t(dot)frank(at)gmail.com Cheers
markf
September 14, 2010 at 01:50 AM PDT
#203 cont To be fair I would say that CS Lewis was trained in philosophy. I understand his degree was in Greats, which includes the philosophy of the Greek philosophers, and he got a triple first. He was also taken seriously enough to at least have the famous debate with GE Anscombe. However, as far as I know, he never had a philosophical paper published in an academic journal and his name never came up in 3 years of studying philosophy. So, given that there is only time to read a limited number of philosophers in a lifetime, I am unlikely to get beyond Narnia as far as he is concerned.
markf
September 14, 2010 at 01:31 AM PDT
markf@203
Lewis did not get a professorship until he was 57 and that was not in Philosophy.
Ah, that was at Cambridge, right?
In June, Lewis accepted the Chair of Medieval and Renaissance Literature at Cambridge.
I had missed that.
BarryR
September 14, 2010 at 12:59 AM PDT
CF@202
Do you not know that a tutor is a professor at Oxford?
No, I didn't know that. The Girlfriend spent a year over there while she was getting her physics degree. She reports the correct title is "Don". These days they tend to have Ph.D.s, but standards were a bit more lax when Lewis was there. Do you have a cite to the contrary?
Lewis was hired in place of another professor, E.F. Carritt, while he was gone for a year, and guess what, E.F. Carritt was also a “tutor”, which means professor.
I taught several semesters for professors while I was getting my Master's degree. It's a pretty common setup. That didn't make me a professor.
Did you have a chance to look up what is entailed in the Greats (Literae Humaniores) as a degree at Oxford?
I cited it back at 155. Glad to see you're getting caught up. You'll notice no dissertation is required, nor did he do sufficient (any?) original research afterwards to be considered for an English professorship, much less a philosophy position.
“His first job at Oxford, then was teaching philosophy”
Yep. It lasted a year. He never went back to it. Never took a graduate degree in the field, never published any original research in the field, never held a permanent position in the field and never was elected to a professorship in the field. Based on the above, you want to label him a professor of philosophy. You're free to do so, but you're not free to have other people take you seriously. (Now you've got me wondering --- there were formal programs in philosophy at the time, right? If Lewis really did want to be a philosopher, why wasn't he enrolled in one of those instead of one of the Western Classic programs? The "Greats" are fine for everything up to, what?, Augustine? Did he ever take any classes covering philosophy after 1700? I honestly don't know.)
I see these reasons as cop-outs.
I see my posts getting delayed for moderation with no explanation. I see that a post mentioning this and apologizing for the delay got killed. I don't know if this is you or someone else, but this isn't how adults behave. If you feel I'm being uncivil, then tell me. If I've offended you, then tell me. You just might get an apology.
BarryR
September 14, 2010 at 12:23 AM PDT
#202 Do you not know that a tutor is a professor at Oxford?

Sorry, but that is not true (I speak as a Cambridge graduate, which uses the same tutorial system). A tutor is not a formal position. It just means someone who gives tutorials. It can be of any academic rank. Mine were mostly graduates taking their PhD. CS Lewis did not get a professorship until he was 57 and that was not in Philosophy.
markf
September 14, 2010 at 12:11 AM PDT
BarryR,
Lewis was hired as a tutor of English in Magdalen college. Later…
Do you not know that a tutor is a professor at Oxford? Lewis was hired in place of another professor, E.F. Carritt, while he was gone for a year, and guess what, E.F. Carritt was also a "tutor", which means professor. Please read page 15 of C.S. Lewis as Professor. Did you have a chance to look up what is entailed in the Greats (Literae Humaniores) as a degree at Oxford?
There’s a bit of a literary trope here that I don’t think you’re familiar with. Using “X as Y” in the title indicates that the author understands that X is not considered to be a Y, and is making the proposition that it might be educational to see what happens if we made that assumption.
It would've been so painless to have just read the introduction to the book that I linked for you, in which case yours is not an argument. The authors are dead serious in taking Lewis as a philosopher. Please, save me from correcting, I really mean this, and read, at the very least, the intro to the book. It will save us a lot of time. "His first job at Oxford, then was teaching philosophy" and "It was only when he received his position as Fellow of Magdalen College in 1925, his first permanent job [at age 27], that his career direction as a literary scholar was set. But even then, part of the reason he may have gotten the job was Magdalen was looking for someone who could teach both philosophy and English" page 15. I know you had almost enough credits to get a minor in philosophy, which is usually about 26 credits, but the Greats at Oxford is a four-year degree in itself. Essentially Lewis got three degrees at once and got Firsts (the highest grades possible) in all three. I know, I know, you almost got a minor in philosophy, and I'm sure you could translate the Iliad at 14 like Lewis did. Let's put this into perspective: Lewis could actually speak Attic Greek, not just read it and write it. He also knew Medieval Italian, Latin, French, German, Anglo-Saxon and Old Icelandic, as well as being an English Scholar. You might be interested in a book of his called Studies in Words, in which he traces the different usages of words throughout time and throughout different languages and explains their usage in that manner, which gives insight into the ancients' worldview and their philosophy.
Putting that to one side, Lewis isn’t wrong because he’s not a philosopher, he’s wrong because he’s making several mistakes that actual philosophical training would have prevented.
How so?
I’m afraid my answer may contravene the moderation policy, so I’ll have to forbear.
The policy is just being civil, you don't have to be uncivilized to explain to me how Lewis made mistakes a trained philosopher wouldn't have made.
I’m happy to give you an answer via private email or in a non-moderated public forum. I’m a regular at talk.origins, so if you like I can start a thread over there. I can give you the Chesterton critique there as well.
I see these reasons as cop-outs. You're welcome to write to me at my email, but it better not be uncivilized there either. And if it isn't, I see no reason why we cannot continue here.
Clive Hayden
September 13, 2010 at 10:56 PM PDT
#194

You are putting the cart before the horse. We know the universe in part, because the laws of mathematics allow us to provide a quantitative analysis. If we could know the universe and its laws in advance of applying mathematical laws, then the application would be redundant. You are assuming as given the very knowledge that can only be obtained through the laws of mathematics.

It is true that the use of mathematics allows us to provide a quantitative analysis because some axiomatic systems work well, and in that sense it helps us get to understand the universe even better. I meant know in the sense of "se connaître" rather than "savoir". I should rephrase my paragraph as: “It is hardly surprising that the mathematical systems we know best describe aspects of the universe we deal with regularly. That is because we were motivated/inspired to devise mathematical systems that describe the universe we deal with.” The point is that the fact that some mathematics describes some aspects of the universe very closely does not prove that mathematical laws are in some sense necessarily true, except in the trivial sense that they follow from their axioms.
markf
September 13, 2010 at 10:49 PM PDT
StephenB@196 What did you think of section 3.3 of Edmond's paper?
BarryR
September 13, 2010 at 06:28 PM PDT
markf@193
That just isn’t true. Euclidean geometry is only an approximation to any real world we have met – see above.
Most mathematicians believe the real world is an approximation of Euclidean geometry. ;-)
The consequences of axioms never change. The consequences of the rules of chess never change. The choice of axioms is continually changing to meet the demands of the real world or just to satisfy the curiosity of mathematicians. The rules of chess change sometimes to give us a different game.
I can't improve on that.
BarryR
September 13, 2010 at 06:20 PM PDT
markf@186 As to interesting questions:
1) Does Euclidean geometry have any special status relative to other geometries? 2) Are the results discoveries or inventions? 3) Do the laws of mathematics refer to some abstract objects – numbers, circles, triangles etc – which in some sense exist to be discovered? Here I suspect I differ from BarryR. I think he proposes that there are infinitely many systems of such abstract objects including one for each possible geometry. He just doesn’t give any such system any special status. I believe they are just rules for manipulating symbols much as the rules of chess are (unless of course you believe in an abstract world of pawns, kings and bishops etc).
It's a pleasure to make your acquaintance. You might want to drop by talk.origins (via google groups if you don't have a usenet feed) and say hello. My introduction to these questions came via Davis and Hersh's _The Mathematical Experience_. It's not a book about math, it's a book about how math is created / performed / discovered. To say it's the best introduction to serious mathematics for a general audience implies that there exists other such books, and outside of the A Very Short Introduction series I don't know of any. For a quick sampling, check out their definition of the ideal mathematician. I'm just going to address one of your points here:
I think he proposes that there are infinitely many systems of such abstract objects including one for each possible geometry. He just doesn’t give any such system any special status.
That's an excellent summary of the Platonic approach, and you're arguing instead for a more formalist approach. I know intelligent people who hold each view (the professional mathematicians are uniformly Platonists and the Computer Scientists are uniformly Formalists, I tend to swing back and forth). With either approach, though, there's a problem of how to give special status to one over another --- the axiomatic systems that result in 2+2=4 get far more use than those where they don't. I find utility to be a simple enough explanation: we focus on math that maps well onto the world around us and (usually) ignore the math that doesn't. So while I consider all axiomatic systems to be epistemically and ontologically equal, when it comes to relevance, we pick our favorites. G. H. Hardy argued that beauty also plays a role:
There are masses of chess-players in every civilized country—in Russia, almost the whole educated population; and every chess-player can recognize and appreciate a ‘beautiful’ game or problem. Yet a chess problem is simply an exercise in pure mathematics (a game not entirely, since psychology also plays a part), and everyone who calls a problem ‘beautiful’ is applauding mathematical beauty, even if it is a beauty of a comparatively lowly kind. Chess problems are the hymn-tunes of mathematics.
From _A Mathematician's Apology_.

Figuring out that there are no universal axioms isn't that difficult --- I managed to pick this up in 8th grade, but then I had an exceptionally good geometry teacher. Deciding whether or not math reflects a different reality or is just symbol manipulation is ultimately unanswerable, and most of the time I don't think it matters. Figuring out where beauty lies in mathematics, though --- that's a hard problem worth solving.
BarryR
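The point above, that axiomatic systems where 2+2=4 simply get more use than those where it fails, can be made concrete with a small sketch: interpret "+" in the cyclic group Z/3Z instead of the integers and the same symbols give a different result. (The `add_mod` helper below is invented for this illustration.)

```python
# Sketch: the "law" 2 + 2 = 4 depends on which structure interprets "+".
# In ordinary integer arithmetic 2 + 2 = 4; in the integers mod 3,
# the same expression evaluates to 1.
def add_mod(a: int, b: int, n: int) -> int:
    """Addition in the cyclic group Z/nZ."""
    return (a + b) % n

print(2 + 2)             # 4 in ordinary arithmetic
print(add_mod(2, 2, 3))  # 1 in Z/3Z
```

Both systems are internally consistent; we privilege the first only because it maps onto everyday counting.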
September 13, 2010 at 06:17 PM PDT
CH@170
C. S. Lewis as Philosopher by David Baggett, Gary R. Habermas, and Jerry L. Walls
Interesting. Have you read Baggett's other book, Harry Potter and Philosophy: If Aristotle Ran Hogwarts? There's a bit of a literary trope here that I don't think you're familiar with. Using "X as Y" in the title indicates that the author understands that X is not considered to be a Y, and is making the proposition that it might be educational to see what happens if we made that assumption. Other examples:

Paul Radin, Primitive Man as Philosopher
Robert J. Yanal, Hitchcock as philosopher
Anton Pannekoek, Lenin as philosopher

and let's not forget

Caleb Sprague Henry, Satan as moral philosopher

I'd say Baggett is well aware that Lewis is not considered a philosopher.
As I said before, by 26 he was a philosophy professor at Oxford.
I took to heart your suggestion of consulting a biography of Lewis. You are wrong on both counts. Lewis was hired as a tutor of English in Magdalen college. Later...
The number of students reading English at Oxford vastly increased after the war. Whereas during the war Jack [as C. S. Lewis was known to his friends] had easily been able to tutor all Magdalen English students as well as some from other colleges, spending one hour alone with each student every week, he now found that he needed help. It soon became clear to his friends that his work load was excessive. They suggested that a professorship would be less demanding than tutoring. When a Merton professorship of modern English literature became available, Tolkien, who was a Merton Professor of English and, therefore, an elector, rallied to Jack's support, hoping to achieve what had been a prewar ambition, of having Jack and himself installed as the two Merton Professors. He suggested this to his fellow electors, H.W. Garrod, C.H. Wilkinson, and Helen Darbishire and found to his surprise that all three were against Jack. They felt he had not produced a sufficient amount of scholarly work, pointing out that his most successful books were three novels and some popular religious or theological books. They thought his election would lower the status of the professorship and even discredit the English School. Jack's former tutor, F.P. Wilson, was eventually elected to the professorship.
From George Sayer and Lyle W. Dorsett's _Jack: A Life of C. S. Lewis_, p. 328-9. He wasn't a professor of philosophy. He wasn't even a professor of English. Occasionally you'll see him referred to as "Professor" by someone outside the college, but this was only an honorific. You could have looked that up yourself, you know.
How about reading it in the original Greek and taking in all of the metaphysics and philosophy at once?
Thank you for reminding me. The first tutorial I took in the Honors College was to have been a survey of Theater Criticism, but I started off with some unkind words about Aristotle's Poetics and Dr. Bill had the good sense to throw out the syllabus and spend an entire semester on that tiny, tiny book. There were only three of us in a tiny, windowless room in the basement of the TComm building. We'd read a sentence, I'd immediately jump in trying to understand it, Bill would try to rein me in, and then the other student, Toni, would reliably interject something profound after we'd gone back and forth for a while. Sometimes we'd make it through a chapter, sometimes we'd spend the entire class period on a single word (yes, in the original Greek). After 17 years of college education across Theater, English, Political Science, Philosophy, Math, CompSci and Biology, that remains my favorite class. It taught me how to read, and I made two lifelong friends as well (Toni even flew across the country to attend my Ph.D. defense.) (It's funny, though, that I considered it to be a theater class instead of a philosophy class when I was thinking about the philosophy classes I had taken.) That being said, translations are fine if you're reading outside of a university and useful when you're reading inside them.
You’re not trained in philosophy or literature.
My undergraduate focus was on dramaturgy, so I'll cop to the literature end. I think I had enough philosophy classes to have earned a minor, but that would have required a modicum of paperwork and I've never been good at paperwork. I suppose I could ask if you have any training in either, but I don't think there's any need.
Putting that to one side, Lewis isn’t wrong because he’s not a philosopher, he’s wrong because he’s making several mistakes that actual philosophical training would have prevented.
How so?
I'm afraid my answer may contravene the moderation policy, so I'll have to forbear. I'm happy to give you an answer via private email or in a non-moderated public forum. I'm a regular at talk.origins, so if you like I can start a thread over there. I can give you the Chesterton critique there as well.

BarryR
September 13, 2010 at 5:39 PM PDT
---BarryR @179: "Before you waste your time, I certainly believe there are mathematical laws; they're just context-dependent."

[Which is the argument I have been making all along. Plane geometry has its laws; spherical geometry has its laws; arithmetic has its laws, etc. That is why I objected to your use of the word universal.]

---BarryR @86: "Math and logic. There are no 'laws of math and logic'. There are axioms and ways of relating axioms to each other, but you're free to choose your axioms and your relations to suit whatever it is you're trying to accomplish."

[Which is the argument you made at the beginning and which you morphed into the previous paragraph.]

BarryR pre-interaction: "THERE ARE NO MATHEMATICAL LAWS." BarryR post-interaction: "I CERTAINLY BELIEVE THERE ARE MATHEMATICAL LAWS." Obviously, I disagree with your first position and agree with your second position. Now, let's begin to discuss the laws of logic that you claim do not exist.

StephenB
September 13, 2010 at 5:04 PM PDT
@194: Excuse me. That should read [commutative] and associative laws.

StephenB
September 13, 2010 at 4:32 PM PDT
---markf: "It is hardly surprising that the mathematical systems we know best describe aspects of the universe we know very well. That is because we were motivated/inspired to devise mathematical systems that describe the universe we know well."

You are putting the cart before the horse. We know the universe in part because the laws of mathematics allow us to provide a quantitative analysis. If we could know the universe and its laws in advance of applying mathematical laws, then the application would be redundant. You are assuming as given the very knowledge that can only be obtained through the laws of mathematics. Similarly, if we could evaluate evidence in the absence of logical laws, we wouldn't need logic. Reason's rules inform evidence, and mathematical rules inform science.

---"BarryR has shown for several examples of mathematical law that you have suggested that they are dependent on certain axioms."

Barry has only made the claim; he has not provided any support for that claim. On what axioms do the laws of association and accumulation depend? Please be specific: name the axiom or axioms on which they depend and explain the relationship between the two.

StephenB
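One concrete way to see what is at stake in this exchange: whether the commutative and associative laws hold depends on the operation and the objects it acts on. A quick Python check, offered only as an editorial illustration:

```python
# Integer addition obeys both the commutative and associative laws...
assert 2 + 3 == 3 + 2
assert (1 + 2) + 3 == 1 + (2 + 3)

# ...but the same "+" symbol over different objects need not.
# IEEE-754 floating-point addition is commutative yet not associative:
a, b, c = 0.1, 0.2, 0.3
print((a + b) + c == a + (b + c))   # False: rounding differs by grouping

# String "addition" (concatenation) is associative yet not commutative:
print("ab" + "cd" == "cd" + "ab")   # False
```

Neither observation settles the philosophical question of whether such laws are discovered or invented; it only shows that each law is tied to a particular structure.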
September 13, 2010 at 4:14 PM PDT
#190

"In some cases the same law was discovered by two people, neither of which was aware of the other's work. This is only possible if both were discovering a truth that was independent of their capacity to simply conceive a set of axioms for the sake of argument."

Two people might independently discover an inevitable consequence of the rules of chess. However, the rules of chess were clearly conceived by men.

"It is no coincidence that the ordered universe, which is law-like, is synchronized with the laws of mathematics."

It is hardly surprising that the mathematical systems we know best describe aspects of the universe we know very well. That is because we were motivated/inspired to devise mathematical systems that describe the universe we know well. Nevertheless we keep on stumbling across aspects of the universe which are not modelled well by current maths. In some cases we devise new maths - non-Euclidean geometry. In other cases we are currently stumped - fluid dynamics.

"If one establishes a mere rule, as in chess, one has simply set down a certain set of conditions that does not necessarily have any relationship with the real world. There is nothing in the laws of chess that can quantify what happens in ordered nature. On the other hand, the laws of mathematics faithfully reflect what goes on in the world that they measure."

That just isn't true. Euclidean geometry is only an approximation to any real world we have met - see above.

"They never fail and they never change. A rule could change overnight and, for that reason, it would be unreliable as a means of measurement."

The consequences of axioms never change. The consequences of the rules of chess never change. The choice of axioms is continually changing to meet the demands of the real world or just to satisfy the curiosity of mathematicians. The rules of chess sometimes change to give us a different game.
"In keeping with that point, Chess is simply a game like football or basketball, though a bit more cerebral. There is no such thing as a 'law' of baseball."

Not sure about baseball, but welcome to the laws of soccer.

"Also, don't forget that mathematicians themselves call these laws by their proper name. Snell's law, for example, is not called Snell's axiom or Snell's assumption or Snell's proposition. These laws do not depend on some broader axiomatic formulation. If they did, they would not be laws. They would simply be derivatives of the mathematician's imagination."

Given the link to the laws of football, I don't think we should place too much emphasis on whether they are called laws, axioms, lemmas, results or whatever. I am not quite sure why you introduced Snell's law, which is a law of physics, not maths, and the result of experimental observation rather than deduction. BarryR has shown for several examples of mathematical law that you have suggested that they are dependent on certain axioms.

markf
September 13, 2010 at 1:45 PM PDT
BarryR, C. S. Lewis as Philosopher by David Baggett, Gary R. Habermas, and Jerry L. Walls should be added to your embarrassing amount of books. I don't think you're really appreciating what it means to get a first in Greats at Oxford. As I said before, by 26 he was a philosophy professor at Oxford.
Interesting. I appear to have had more training in philosophy than C. S. Lewis. (Enough classes to qualify for a minor during my undergrad as well as several graduate classes in logic when I was getting my Masters.)
How so?
Putting that to one side, Lewis isn’t wrong because he’s not a philosopher, he’s wrong because he’s making several mistakes that actual philosophical training would have prevented.
How so?
Part of my training was in learning how to spot those kinds of errors. That’s part of the difference between reading Plato as Literature and reading Plato as Philosophy.
How about reading it in the original Greek and taking in all of the metaphysics and philosophy at once? You're not trained in philosophy or literature. I find it frankly irrational to consider that you can know what was in Lewis's mind while he read Plato and every other philosopher that he studied.
Tell you what: remove the moderation block and I will go through and give you a point by point critique of Chesterton. If I’m going to put in the time to do this, I want some assurance that the post will actually show up here.
It'll show up as long as it's not against the moderation policy.

Clive Hayden
September 13, 2010 at 1:41 PM PDT
PPS: Wiki actually goes into some of the chaos, by raising the scene in 1984 in which the party is imagined as saying 2 + 2 = 5: ________________________ >> The phrase "two plus two equals five" ("2 + 2 = 5") is a slogan used in George Orwell's Nineteen Eighty-Four[1] as an example of an obviously false dogma one must believe, similar to other obviously false slogans by the Party in the novel. It is contrasted with the phrase "two plus two makes four", the obvious – but politically inexpedient – truth. Orwell's protagonist, Winston Smith, uses the phrase to wonder if the State might declare "two plus two equals five" as a fact; he ponders whether, if everybody believes in it, does that make it true? Smith writes, "Freedom is the freedom to say that two plus two make four. If that is granted, all else follows." Later in the novel, Smith attempts to use doublethink to teach himself that the statement "2 + 2 = 5" is true, or at least as true as any other answer one could come up with. Eventually, while undergoing electroshock torture, Winston declared that he saw five fingers when in fact he only saw four ("Four, five, six - in all honesty I don't know"). The Inner Party interrogator of thought-criminals, O'Brien, says of the mathematically false statement that control over physical reality is unimportant; so long as one controls their own perceptions to what the Party wills, then any corporeal act is possible, in accordance with the principles of doublethink ("Sometimes they are five. Sometimes they are three. Sometimes they are all of them at once"). >> __________________________ Are we SURE it isn't "really" 1984? Just like, we could simply decide that the new millennium began in 2000 not 2001. (In Jamaica, that famous iconoclast, Mutty Perkins, was heard: one coco, two cocos [= taro etc], . . . highlighting that since there was no year of zero, the 2,000th year marked the close of the second Christian Millennium as counted on the conventional timeline.) 
How vital is the understanding that Aristotle (and Plato before him) bequeathed to us:
The truth says of what is, that it is; and of what is not, that it is not.
And, sirs, truth is the real target of the mind-benders. Inconvenient but self-evident or objective truth. In that sad cause, they would have the year forever be 1984 . . . but there cometh a day in the which we shall all have to reckon with Him who is Truth and Reason Himself.

kairosfocus
September 13, 2010 at 1:07 PM PDT
---markf: "Are the results discoveries or inventions? I don't care too much what we call them."

Don't you think that if two things are radically different they should go by different names? If someone discovers a law, it means that he/she has uncovered an unchanging truth. In some cases the same law was discovered by two people, neither of whom was aware of the other's work. This is only possible if both were discovering a truth that was independent of their capacity to simply conceive a set of axioms for the sake of argument. It is no coincidence that the ordered universe, which is law-like, is synchronized with the laws of mathematics. If one establishes a mere rule, as in chess, one has simply set down a certain set of conditions that does not necessarily have any relationship with the real world. There is nothing in the laws of chess that can quantify what happens in ordered nature. On the other hand, the laws of mathematics faithfully reflect what goes on in the world that they measure. They never fail and they never change. A rule could change overnight and, for that reason, it would be unreliable as a means of measurement.

---"Consider the consequences of the rules of chess - for example it is true, given the rules, that you cannot mate your opponent if all you have left is a bishop and your king. Is this a discovery or an invention - given that Chess is an invention and you could always alter the rules to make it possible to mate with a bishop?"

The rules of chess are not laws. They are not necessarily unchangeable, nor are they synchronized with a lawfully ordered universe in such a way that each makes sense in light of the other. By contrast, the laws of math confirm the laws of science and the laws of science confirm the laws of math. That is one of the main reasons that the universe is a rational place. Lawful mathematics tells us something about the law-like ordering of the universe.
If mathematics had no laws, it would be undependable as a means to measure nature. Mathematics may develop, but its laws do not change. In keeping with that point, Chess is simply a game like football or basketball, though a bit more cerebral. There is no such thing as a "law" of baseball.

---"Who cares?"

Evidently, both you and I care a great deal about the subject matter. There is a truth which needs to be acknowledged: We have rational minds, we live in a rational universe, and there is a correspondence between the two. Also, don't forget that mathematicians themselves call these laws by their proper name. Snell's law, for example, is not called Snell's axiom or Snell's assumption or Snell's proposition. These laws do not depend on some broader axiomatic formulation. If they did, they would not be laws. They would simply be derivatives of the mathematician's imagination.

StephenB
September 13, 2010 at 1:01 PM PDT
PS: Maybe we can cite Wiki, as that ever so reliably materialistic reference will at least set a baseline: ___________________ >>Addition is a mathematical operation that represents combining collections of objects together into a larger collection. It is signified by the plus sign (+). For example, in the picture on the right, there are 3 + 2 apples—meaning three apples and two other apples—which is the same as five apples. Therefore, 3 + 2 = 5. Besides counts of fruit, addition can also represent combining other physical and abstract quantities using different kinds of numbers: negative numbers, fractions, irrational numbers, vectors, decimals and more. Addition follows several important patterns. It is commutative, meaning that order does not matter, and it is associative, meaning that when one adds more than two numbers, order in which addition is performed does not matter (see Summation). Repeated addition of 1 is the same as counting; addition of 0 does not change a number. Addition also obeys predictable rules concerning related operations such as subtraction and multiplication. All of these rules can be proven, starting with the addition of natural numbers and generalizing up through the real numbers and beyond. General binary operations that continue these patterns are studied in abstract algebra. Performing addition is one of the simplest numerical tasks. Addition of very small numbers is accessible to toddlers; the most basic task, 1 + 1, can be performed by infants as young as five months and even some animals. In primary education, children learn to add numbers in the decimal system, starting with single digits and progressively tackling more difficult problems. Mechanical aids range from the ancient abacus to the modern computer, where research on the most efficient implementations of addition continues to this day . . . . Let N(S) be the cardinality of a set S. Take two disjoint sets A and B, with N(A) = a and N(B) = b. 
Then a + b is defined as N(A U B) . . . >> ___________________ To then impose an extraneous and contradictory "definition" on what is plain and simple, to make a rhetorical point and thus to derail a serious discussion, is plainly inexcusable. But then, these days ever so much of that is going on, as our civilisation descends -- at the hands of the materialists and their fellow travellers -- into the chaos Plato warned of in the Laws, Bk X, 2300 years ago.

kairosfocus
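The set-theoretic definition quoted above can be checked mechanically. A minimal Python sketch of N(A U B) = N(A) + N(B) for disjoint sets, using small illustrative sets:

```python
# Addition via cardinality of a union of disjoint sets:
# if A and B share no members, N(A U B) = N(A) + N(B).
A = {"a1", "a2"}      # N(A) = 2
B = {"b1", "b2"}      # N(B) = 2, disjoint from A
assert A.isdisjoint(B)

# The cardinality of the union equals the sum of the cardinalities:
assert len(A | B) == len(A) + len(B) == 4   # i.e. 2 + 2 = 4
print(len(A | B))                           # 4
```

If the sets were not disjoint, `len(A | B)` would fall short of `len(A) + len(B)`, which is exactly why the definition requires disjointness.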
September 13, 2010 at 12:45 PM PDT
Onlookers: The above abundantly illustrates what happens when the law of non-contradiction is wantonly disregarded, and equivocations are injected willy-nilly into our reasoning structure. Ever-deepening confusion and absurdity. A sad picture. And ever so revealing. Reductio ad absurdum. On steroids. In fact, the addition operator was introduced precisely to express what happens with the cardinality of the resulting set when disjoint sets of identifiable items are joined in a union. Once that is happening and each of the original two sets has cardinality 2, the cardinality of the resulting unified set is and must be 4, on pain of absurdity. But, that is simply a fancy way of describing what happens when you:
1 --> Get a match box with enough matches in it.
2 --> Take out two pairs, and set them up in two clusters.
3 --> Push the clusters together to form a single unified cluster.
What we keep seeing above is the attempt to inject an equivocation in the symbol for the addition binary operation, and the consequential contradictions and absurdities that follow. Now, you will see that I have confined myself to natural numbers, as that is all we need to show that there are objectively verifiable self-evident, truths in mathematics; that thus have universal character. Now, we may then EXTEND the addition operator to other cases, as we expand the entities we are addressing, but in so doing we must be careful to be consistent with the original definition. So, we speak of adding fractions [and remember a decimal often has a fractional component; the place value notation being a misleadingly "simple" way of expressing that]; reals, negatives, and complex numbers. When we start to get into vectors, matrices and further constructs, the properties change and one has to be careful. But the parallelogram law of vector addition is equivalent to reduction on base vectors and addition of coefficients of the base vectors, e.g. the famous i, j, k vectors of simple mathematics. And complex numbers are a special case of vectors. All such extensions rest on the confident acceptance of the universal validity of the already existing case of the addition of simple wholes. So, here we are as we watch the evolutionary materialistic, selectively hyperskeptical, radically relativistic view in action: intelligent, highly educated adherents are trying to argue against the legitimacy of the fact that 2 + 2 = 4 is self evidently and undeniably true once we understand the meaning of the symbols [which of course is a contextual thing, but the reality the symbols capture is beyond the particular symbols we are using], something that is instantly accessible to the conscious reasoning mind, by simply playing with a box of matches. Utterly and ever so sadly revealing! GEM of TKIkairosfocus
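The remark above, that the parallelogram law of vector addition reduces to addition of coefficients on the i, j, k base vectors, can be sketched in a few lines of Python (the tuples are just illustrative coordinate triples):

```python
# Vector addition as component-wise addition of coefficients on the
# i, j, k basis; this is equivalent to the parallelogram law.
u = (1.0, 2.0, 0.0)    # u = 1i + 2j + 0k
v = (3.0, -1.0, 4.0)   # v = 3i - 1j + 4k
w = tuple(ui + vi for ui, vi in zip(u, v))
print(w)               # (4.0, 1.0, 4.0)

# Complex numbers add by the same component-wise rule (a special
# case of 2-D vectors, as noted above):
print((1 + 2j) + (3 - 1j))   # (4+1j)
```

In each extension the addition of the coefficients themselves is still ordinary addition of numbers, which is the sense in which the extensions rest on the simpler case.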
September 13, 2010 at 12:34 PM PDT
Cabal:
Nobody in his right mind would ever claim that two apples plus two apples mysteriously might reappear as onehundredandfortyseven (147) apples.
Precisely. That is why we define the operations, numerals and symbols to capture these facts of reality. Two match sticks each, in two groups, joined together will -- and must -- give four. 2 + 2 = 4 summarises this. What was done above was to inject an arbitrary redefinition of +, and pretend that this changes the truth of 2 + 2 = 4. But to try to do that simply results in a cluster of patent contradictions. And, the same reasoning extends to plane geometry and the like, as well as to trigonometry. When we set out axiomatic systems to integrate such facts that we discover and substantiate as necessary truths, then we see that axiomatic systems are necessarily incomplete if coherent, and incoherent if complete. Worse, there is no way to set up a system of axioms known to be consistent. (Thence the concept of undecidable propositions relative to given systems of axioms.) But such integrative schemes account for the facts of mathematics; it is not that the facts depend on them.

GEM of TKI

kairosfocus
September 13, 2010 at 10:12 AM PDT
#179, #180 Stephenb

There is a lot of terminology floating around here which is in danger of turning this into a discussion of semantics. I suggest ditching the words "universal" and "absolute". I would not deny that there are mathematical results, some of which are sufficiently general to be called laws, which flow from agreed sets of axioms. These results always hold if the axioms hold. So pi is always going to be 3.14..... given the axioms of Euclidean geometry. I hope you would also accept that any given set of axioms may match the material world to a greater or lesser extent. Euclidean geometry is a good approximation for relatively small areas of the earth's surface - pretty poor for large scale maps. The interesting questions include:

1) Does Euclidean geometry have any special status relative to other geometries? (If not, then we can only say the ratio of the circumference to the diameter is sometimes 3.14.... and sometimes not, depending on conditions.)

2) Are the results discoveries or inventions? I don't care too much what we call them. Consider the consequences of the rules of chess - for example it is true, given the rules, that you cannot mate your opponent if all you have left is a bishop and your king. Is this a discovery or an invention - given that Chess is an invention and you could always alter the rules to make it possible to mate with a bishop? Who cares?

3) Do the laws of mathematics refer to some abstract objects - numbers, circles, triangles etc - which in some sense exist to be discovered? Here I suspect I differ from BarryR. I think he proposes that there are infinitely many systems of such abstract objects, including one for each possible geometry. He just doesn't give any such system any special status. I believe they are just rules for manipulating symbols, much as the rules of chess are (unless of course you believe in an abstract world of pawns, kings and bishops etc).

markf
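The point about pi and the choice of geometry can be made concrete. On a sphere, the ratio of a circle's circumference to its diameter (both measured along the surface) is below 3.14... and only approaches it as the circle shrinks. A short Python sketch, assuming a unit sphere and the standard formula circumference = 2*pi*R*sin(r/R) for a circle of geodesic radius r:

```python
import math

def circumference_over_diameter(r, R=1.0):
    """Circumference / diameter for a circle of geodesic radius r
    drawn on the surface of a sphere of radius R."""
    circumference = 2 * math.pi * R * math.sin(r / R)
    diameter = 2 * r                      # measured along the surface
    return circumference / diameter

print(circumference_over_diameter(1e-6))  # ~3.14159: tiny circles look Euclidean
print(circumference_over_diameter(1.5))   # ~2.09: well below pi on a unit sphere
```

So "the ratio is pi" is a theorem given Euclidean axioms, while on a curved surface the ratio depends on the circle's size, which is exactly the sense in which the result is conditional.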
September 13, 2010 at 9:57 AM PDT
