Recently, Mark Frank and I had a brief dialogue in the OP, “Didn’t everyone already know this about dogs?” I’ve decided to clean it up a bit and re-post it because, after my last question, I received no responses. At the outset, I would like to say that I place no blame for the lack of responses on Mark Frank or anyone else in the last OP (my post was rather quickly buried). Having said that, in this OP I would like somebody to address the question. After one go-around in which I suggested that “success” should be counted as an increase in genetic information, Mark Frank corrected me, writing: In biology success is breeding in the available environment. As a Read More ›
A short article in the popular press reports that certain sauropods with long necks could not have held those necks upright. I read the article, leaned back in my chair and did a lot of serious thinking. . .
Read More ›
Ari N. Schulman, “Why Minds Are Not Like Computers,” The New Atlantis, Number 23, Winter 2009, pp. 46-68.
“The problem, therefore, is not merely that science is being used illegitimately to promote a materialistic worldview, but that this worldview is actively undermining scientific inquiry.”—UncommonDescent
Unless otherwise noted, all quotations from the article, “Why Minds Are Not Like Computers,” are italicized.
Mr. Schulman walks the tightrope of analysis and criticism, describing how a materialistic worldview actively undermines scientific inquiry in the area of Artificial Intelligence (AI). Analysis (and self-criticism) should be part of all scientific endeavor; the strict materialist does no such thing. Instead, he plays dodgeball.
Much of the article, especially the discussions of the brain, computers, Turing Machines, the Turing Test, and the Chinese Room problem, was helpful in understanding the state of affairs in AI for the layman. My comments are those of such a layman, included so that you might see what a layman might take from such an article. Nevertheless, questions remain . . .