Artificial Intelligence Books of interest Intelligent Design Mind Naturalism

Bill Dembski on how a new book expertly dissects doomsday scenarios

Dembski: “At the end of the discussion, however, Kurzweil’s overweening confidence in the glowing prospects for strong AI’s future was undiminished. And indeed, it remains undiminished to this day (I last saw Kurzweil at a Seattle tech conference in 2019 — age seemed to have mellowed his person but not his views).” But Larson says it’s all nonsense.

Artificial Intelligence Culture Intelligent Design Mind Naturalism

At Scientific American: Why we live in a simulation

This is likely intended as a spoof: “There is nothing in philosophy or science, no postulates, theories or laws, that would predict the emergence of this experience we call consciousness. Natural laws do not call for its existence, and it certainly does not seem to offer us any evolutionary advantages.” But it happens to be true.

Artificial Intelligence Cosmology Intelligent Design Philosophy Physics

Sabine Hossenfelder explains why she thinks the computer-sim universe idea is pseudoscience

Hossenfelder: You can approximate the laws that we know with a computer simulation – we do this all the time – but if that were how nature actually worked, we could see the difference. Indeed, physicists have looked for signs that natural laws really proceed step by step, like in a computer code, but their search has come up empty-handed.

Artificial Intelligence Intellectual freedom Intelligent Design

A Twitter mob made a mistake when it went after an AI industry giant

Pedro Domingos: In my confrontation with the AI cancel crowd, I was particularly helped by the fact that several of the ringleaders are (or call themselves) professional AI ethicists. Some of them are even well-known within their field. When they serially engaged in childish and unethical behavior in full view of their colleagues, they did my job for me.

Animal minds Artificial Intelligence Intelligent Design

Can we teach a computer to feel things? A dialogue…

You are having an experience reading the vital signs. The dog is having quite a different experience living them. You have all of his data and none of his experience. The dog has none of his data and all of his experience. Suppose you took all that data and instantiated it into a robot. Is the robot having your experience or the dog’s? Or neither, actually?