[This just in from yet another colleague:]
Somewhere (I can’t find the reference) I recently read, in something by an anti-ID, pro-stochastic-macroevolution writer, a crowing remark that a spider hatched in isolation immediately starts building a perfect web and gets it right on the first attempt.
From one point of view a spider web is a “simple geometrical/combinatorial object” [like a crystal] that wouldn’t take too many binary info-bits to specify, but I conjecture that the “instruction manual” for BUILDING a spider web probably could be shown to require more than 500 binary bits (and therefore be “physically impossible” to have arisen by any combination of natural law and chance).
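The 500-bit figure presumably alludes to Dembski’s “universal probability bound” of 1 in 10^150 (an assumption on my part; the remark above cites no source). A quick arithmetic check shows why 500 bits is the threshold in that framework:

```python
# Back-of-envelope check of the 500-bit threshold, assuming it refers to
# Dembski's universal probability bound of 10^150 (an assumption; the
# text above gives no citation for the figure).

# Number of distinct specifications expressible in 500 bits:
specs = 2 ** 500

# Dembski derives the bound as: ~10^80 particles in the observable
# universe * ~10^45 Planck-time state changes per second * ~10^25 seconds.
bound = 10**80 * 10**45 * 10**25   # = 10^150

# 2^500 is about 3.3 * 10^150, i.e. just past the bound.
print(len(str(specs)) - 1)   # order of magnitude of 2^500
print(specs > bound)
```

So 500 bits is, in that scheme, the smallest round number of bits whose specification space exceeds the total number of elementary events the observable universe could have hosted.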
It also seems unlikely that a complete web-construction procedure could be reached by a sequence of lesser constructs, each of which provides a differential-reproductive advantage over its predecessor and each of which can be generated by the information content of a single point-mutation.
Might this not provide a “research project” for a naturalist working with a mathematically skilled collaborator, one that could lead to a paper whose mainstream publishability couldn’t be denied by “peer review”?