I was going to post this as a comment in Salvador’s thread (https://uncommondesc.wpengine.com/index.php/archives/1481) but it became too long, so I’m posting it as a new topic.
On the subject of computer checkers I have some observations. I hate to brag (okay, I lied!) but I am one of the world’s leading authorities on this subject.
I know David Fogel. He is a very bright researcher and a really cool, nice guy. (Plus, he is a very talented pianist and musician. We have shared our piano CDs. Is there some correlation between music, mathematics and computer science?)
There are three essential elements in an artificial-intelligence (AI) games-playing computer program:
1) A move generator. The board and pieces must be represented in the computer’s memory, either as arrays or bit-maps. An algorithm must be devised to generate moves based on the rules of the game.
2) A tree-search algorithm. This is even more complex and requires sophisticated software engineering. The game tree explodes exponentially, so an alpha-beta minimax algorithm is employed. Other sophisticated algorithms must be used to curtail the exponential explosion: quickly accessible hash tables that contain previously-seen positions, iterative deepening with best-move prediction accessed from the hash tables, use of killer moves and move sorting based on a history heuristic, and much more.
3) A terminal-node evaluation function which attempts to ascertain the value of a position statically, since the tree search must terminate at some point. These values are backed up through the search tree with the alpha-beta minimax algorithm.
Most of the playing strength of these programs comes from the tree search (which is not possible without the move generator), since humans have a hard time keeping track of this much detail.
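To make the division of labor among these three elements concrete, here is a minimal sketch (in Python, for readability) of how an alpha-beta search backs up static leaf evaluations. The `Position` object with its `legal_moves`, `make_move`, and `evaluate` methods is a hypothetical stand-in for the move generator and evaluation function, not code from any actual checkers engine.

```python
INF = float("inf")

def alpha_beta(position, depth, alpha=-INF, beta=INF):
    """Back up static leaf values through the game tree (negamax form).
    'evaluate' is assumed to score the position from the side to move's view."""
    moves = position.legal_moves()          # element 1: the move generator
    if not moves:
        return -INF                         # no moves for the side to move: a loss in checkers
    if depth == 0:
        return position.evaluate()          # element 3: static leaf evaluation
    best = -INF
    for move in moves:
        # Element 2: recurse, negating because the opponent is to move in the child.
        value = -alpha_beta(position.make_move(move), depth - 1, -beta, -alpha)
        best = max(best, value)
        alpha = max(alpha, value)
        if alpha >= beta:                   # cutoff: the opponent already has a better line
            break
    return best
```

A real engine wraps this core in iterative deepening, probes hash tables for previously seen positions, and orders moves with killer-move and history heuristics so that cutoffs arrive as early as possible.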
Fogel and his team of software engineers programmed elements 1 and 2, without which element 3 (the product of the genetic algorithm) would have been completely useless. The fitness function was the definition of a win at the end of the tree search: no moves left for the opponent. This was a goal defined by the programmers. The leaf-node evaluation function is not essentially strategic; “strategy” is mostly a byproduct of the tree search.
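To give a feel for what that trial-and-error process looks like, here is a hedged sketch of an evolutionary loop of the general kind being described, not Fogel's actual method: a population of evaluation-weight vectors plays games against itself through the human-written search, and the only feedback is the game result. The `play_game` helper and the weight-vector representation are illustrative assumptions.

```python
import random

def evolve_evaluation(play_game, population_size=20, generations=50, n_weights=32):
    """Illustrative evolutionary loop: weight vectors for the leaf evaluation
    are scored only by game results obtained through the tree search.
    'play_game' is a hypothetical helper returning +1 (win), 0 (draw), -1 (loss)."""
    population = [[random.gauss(0.0, 1.0) for _ in range(n_weights)]
                  for _ in range(population_size)]
    for _ in range(generations):
        scores = [0] * population_size
        # Round-robin self-play; the only fitness signal is the win/loss/draw
        # outcome defined by the programmers at the end of the tree search.
        for i in range(population_size):
            for j in range(population_size):
                if i != j:
                    scores[i] += play_game(population[i], population[j])
        # Keep the better half and refill the population with mutated copies.
        ranked = sorted(range(population_size), key=lambda k: scores[k], reverse=True)
        survivors = [population[k] for k in ranked[: population_size // 2]]
        population = survivors + [[w + random.gauss(0.0, 0.1) for w in parent]
                                  for parent in survivors]
    # Best individual from the last evaluated generation.
    return population[0]
```

The point to notice is that the "fitness" signal is nothing more than the win/loss definition supplied by the programmers, fed back through the human-engineered search machinery.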
This is not to say that Fogel’s research is meaningless. It is intriguing and worthwhile. What all this means is that a cleverly designed trial-and-error process, combined with a lot of even more clever software engineering and human-devised algorithms, can produce interesting and productive results.
It should be noted that human-crafted (i.e., intelligently designed) terminal-node evaluation functions trounce Blondie's evaluation function with ease.
The really cool stuff in games-playing programs is the development of endgame databases. The massive brute force and memory of modern computers, combined with highly efficient retrograde-analysis algorithms, have produced mind-boggling results.
My colleague and I computed both win-loss-draw databases and perfect-play lookup databases for the game of checkers. (We are the only ones to have computed PPL databases for checkers.) The PPL databases required nearly two months of 24-hour-per-day computation on my dual-processor machine equipped with four gigs of RAM. I kept both CPUs and all four gigs busy for that entire time.
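For the curious, here is a rough sketch of the idea behind win-loss-draw databases, written as a naive fixed-point iteration; real retrograde analysis works backward through predecessor positions for efficiency, and real databases are built slice by slice, with captures dropping into smaller, already-solved databases. The `positions`, `legal_moves`, and `make_move` names are hypothetical stand-ins, and the sketch assumes for simplicity that every successor position lives in the same table.

```python
WIN, LOSS, DRAW, UNKNOWN = 1, -1, 0, None

def retrograde_wld(positions, legal_moves, make_move):
    """Classify each position in an endgame slice as win/loss/draw
    for the side to move, iterating from terminal losses to a fixed point."""
    value = {p: UNKNOWN for p in positions}
    # Seed: in checkers, a side with no legal moves has lost.
    for p in positions:
        if not legal_moves(p):
            value[p] = LOSS
    changed = True
    while changed:
        changed = False
        for p in positions:
            if value[p] is not UNKNOWN:
                continue
            child_values = [value[make_move(p, m)] for m in legal_moves(p)]
            if LOSS in child_values:
                value[p] = WIN       # some move leaves the opponent in a lost position
                changed = True
            elif all(v == WIN for v in child_values):
                value[p] = LOSS      # every move hands the opponent a win
                changed = True
    # Positions that never resolve to a forced result are draws.
    for p in positions:
        if value[p] is UNKNOWN:
            value[p] = DRAW
    return value
```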
You can download a copy of our paper on checkers endgame databases here. This paper was published in the premier international peer-reviewed publication for games-playing AI programs, the ICGA Journal, and was reproduced in the hardcover book, Advances in Computer Games, Proceedings of the 10th Advances in Computer Games Conference.
Alex Moiseyev, the current human checkers world champion, after playing the PPL databases, exclaimed in his thick Russian accent: “This play is INHUMAN!”
You can download a free small version of my checkers program at http://WorldChampionshipCheckers.com. You can get a copy of the massive and really cool PPL and WLD databases for the cost of disk duplication and shipping ($25) from bobnewell@bobnewell.net. The highly compressed data require 12 gigabytes of disk space.
There is an interesting aspect to this massive computing capability. I call it “the elegance of brute force.” The PPL databases have corrected classic human play that has been around for centuries. Was any new information produced?
I suggest that the answer to this question is both yes and no. Yes, in that solutions were found that no human could ever have found without the power of modern computation. But consider: what human or team of humans could compute pi to a billion decimal places? The digits are new to us, yet they were implicit in the definition all along.