[This just in from yet another colleague:]
Somewhere (I can't find the reference) I recently read, in something by an anti-ID, pro-stochastic-macroevolution writer, a crowing remark that a spider hatched in isolation immediately starts to build a perfect web and gets it perfectly right on the first attempt.
From one point of view a spider web is a "simple geometrical/combinatorial object" [like a crystal] that wouldn't take too many binary info-bits to specify, but I conjecture that the "instruction manual" for BUILDING a spider web probably could be shown to require more than 500 binary bits (and therefore be "physically impossible" to have arisen by any combination of natural law and chance).
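The contrast drawn here, between specifying a finished web and specifying the procedure that builds it, can be made concrete with a description-length count. The toy parameterization below is purely illustrative: the parameter names and ranges are my own assumptions, not spider biology, and the point is only that a handful of quantized geometric parameters costs far fewer than 500 bits.

```python
import math

# Hypothetical toy model: describe a finished orb web's geometry by a few
# quantized parameters. Each range is an assumption chosen for illustration;
# specifying one choice out of n possibilities costs log2(n) bits.
param_ranges = {
    "radial_thread_count": 64,     # one of up to 64 counts  -> 6 bits
    "spiral_turn_count": 64,       # one of up to 64 turns   -> 6 bits
    "spiral_spacing_level": 256,   # spacing in 256 steps    -> 8 bits
    "hub_position": 1024,          # hub on a 32x32 grid     -> 10 bits
}

# Description length of the finished web = sum of per-parameter costs.
bits_to_specify = sum(math.log2(n) for n in param_ranges.values())
print(f"bits to specify the web's geometry: {bits_to_specify:.0f}")  # -> 30
```

Thirty-odd bits for the static geometry versus a conjectured 500-plus bits for the construction procedure is exactly the gap the conjecture above turns on; a real research project would have to replace these made-up ranges with measured ones.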
Also, it seems unlikely that a complete web-construction procedure could be reached by a sequence of lesser constructs, each of which provides a differential-reproductive advantage over its predecessor and can be generated by the information content of a single point mutation.
Might this not provide a "research project" for a naturalist paired with a mathematically skilled collaborator, one that could lead to a paper that couldn't be denied by "peer review" to be mainstream-publishable?