Uncommon Descent Serving The Intelligent Design Community

At Sci-News: Moths Produce Ultrasonic Defensive Sounds to Fend Off Bat Predators

Scientists from Boise State University and elsewhere have tested 252 genera from most families of large-bodied moths. Their results show that ultrasound-producing moths are far more widespread than previously thought, adding three new sound-producing organs, eight new subfamilies and potentially thousands of species to the roster.

A molecular phylogeny of Lepidoptera indicating antipredator ultrasound production across the order. Image credit: Barber et al., doi: 10.1073/pnas.2117485119.

Bats pierce the shadows with ultrasonic pulses that enable them to construct an auditory map of their surroundings, which is bad news for moths, one of their favorite foods.

However, not all moths are defenseless prey. Some emit ultrasonic signals of their own that startle bats into breaking off pursuit.

Many moths that contain bitter toxins avoid capture altogether by producing distinct ultrasounds that alert bats to their foul taste. Others conceal themselves in a shroud of sonar-jamming static that makes them hard to find with bat echolocation.

While effective, these types of auditory defense mechanisms in moths are considered relatively rare, known only in tiger moths, hawk moths and a single species of geometrid moth.

“It’s not just tiger moths and hawk moths that are doing this,” said Dr. Akito Kawahara, a researcher at the Florida Museum of Natural History.

“There are tons of moths that create ultrasonic sounds, and we hardly know anything about them.”

In the same way that non-toxic butterflies mimic the colors and wing patterns of less savory species, moths that lack the benefit of built-in toxins can copy the pitch and timbre of genuinely unappetizing relatives.

These ultrasonic warning systems seem so useful for evading bats that they’ve evolved independently in moths on multiple separate occasions.

In each case, moths transformed a different part of their bodies into finely tuned organic instruments.

[I’ve put these quotes from the article in bold to highlight the juxtaposition of “evolved independently” and “finely tuned organic instruments.” Fine-tuning is, of course, often associated with intelligent design, rather than unguided natural processes.]

See the full article in Sci-News.

Comments
It's a false argument. Intentionally false (what a surprise :lol:). There is no functional information without the right decoder. JVL wants to read a DVD with a microwave oven, or a hammer, or any other method available in the universe. Any coded information has a unique key for decoding, but somehow JVL wants to be able to read information without the key. He intentionally imposes an impossible condition that does not exist in reality. But that is what atheists do.
— Lieutenant Commander Data, August 17, 2022 at 07:03 AM PDT
ET: "if you were really interested in Dembski's metric, you would take it up with Dembski."

I like to see how things work myself. If I get stuck then I might call on him, but I suspect he's pretty busy and I haven't got a reason to bother him. So far the mathematics involved is pretty straightforward. The tricky part will really be figuring out pS(T). That's why I've been carefully picking examples to get used to how it all works.
— JVL, August 17, 2022 at 06:35 AM PDT
JVL, if you were really interested in Dembski's metric, you would take it up with Dembski.
— ET, August 17, 2022 at 06:17 AM PDT
ET: "JVL is engaging in a distraction."

Any time you'd like to compute Dr Dembski's metric for a particular example, please do so. I'd like to see that. But then I'm assuming it's worth checking Dr Dembski's metric to see if it does what he hoped it would do.
— JVL, August 17, 2022 at 06:02 AM PDT
Alan Fox: "I'd like to hear what the connection is supposed to be with coin tosses and living systems."

I don't think there is one. I'm using sequences of coin flips (like Dr Dembski did) in order to see how his metric responds to things I already know how to analyse. If I create a sequence of Hs and Ts by flipping a fair coin, I expect the result to be no design. If I create a sequence of Hs and Ts following some algorithm or procedure (i.e. not randomly generated), then will the metric be able to detect that? Let's say I have a sequence of something which has functional information in it and convert it to 0s and 1s (similar to Hs and Ts): will the metric be able to detect that? If it can, that's pretty impressive, worth checking I would have thought. If it can't, then back to the drawing board I guess.

Anyway, I'm just having a go. The tricky part is clearly pS(T), but also P(T|H) in some situations. Let's say I take the Gettysburg Address, convert all the characters (including the spaces) into ASCII, and run them all together in one sequence. What is T? Is it the original English text or the 0s and 1s? If it's the text, then P(T|H) is tricky . . . am I checking against the words being randomly picked? The letters, spaces and punctuation being randomly picked? If T is the 0s and 1s, then I'm checking against that sequence being randomly generated, which is much easier. Probably, in that example, I'd take the monkeys-and-typewriters approach and just look at the sequence of characters and spaces.
— JVL, August 17, 2022 at 06:00 AM PDT
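For readers trying the calculation JVL describes, here is a minimal sketch, assuming the form of Dembski's 2005 metric, chi = –log2[10^120 · phiS(T) · P(T|H)], and treating phiS(T) as a supplied estimate. The function name and the sample phiS values below are illustrative assumptions, not values from the monograph:

```python
import math

def chi(phi_S, p_T_given_H, R=10**120):
    """Sketch of Dembski's 2005 specified-complexity metric:
    chi = -log2(R * phi_S(T) * P(T|H)), where R bounds the
    probabilistic resources, phi_S(T) counts patterns at least as
    simply describable as T, and P(T|H) is the probability of T
    under the chance hypothesis H. chi > 1 is read as warranting
    a design inference."""
    return -math.log2(R * phi_S * p_T_given_H)

# 500 fair-coin tosses, all heads: P(T|H) = 2^-500, and "all heads"
# has a very short description, so phi_S is tiny; 10^5 is a generous
# illustrative stand-in.
print(chi(phi_S=1e5, p_T_given_H=2.0**-500))       # ~ +85 bits: design

# A typical random-looking 500-toss sequence: same P(T|H), but its
# shortest description is essentially the sequence itself, so
# phi_S ~ 2^500 and chi goes strongly negative.
print(chi(phi_S=2.0**500, p_T_given_H=2.0**-500))  # ~ -399 bits: no design
```

The hard part, as JVL says, is not the arithmetic but justifying the phiS(T) estimate fed into it.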
There isn't any connection between coin tosses and biology. JVL is engaging in a distraction. There isn't any model for evolution by means of blind and mindless processes. The only data that fits it involves genetic diseases and deformities.
— ET, August 17, 2022 at 05:43 AM PDT
At some point, I'd like to hear what the connection is supposed to be between coin tosses and living systems. Fair coin tosses are random. Living organisms, and the niches that design them, are not. KF, take note that if your model does not fit the data, it is most likely your model that is wrong.
— Alan Fox, August 17, 2022 at 05:27 AM PDT
Kairosfocus: "There are inherent issues and there is a prior, decisive result."

I am working with Dr Dembski's metric, seeing how it works, checking it against some easy-to-understand, easy-to-compute examples. So far it seems to be doing what it was designed to do. I'm contemplating another, more complicated example. If you don't like that, you don't have to read what I post.
— JVL, August 17, 2022 at 04:42 AM PDT
ET, broken window theory: even using initials of vulgarities opens a door better kept shut. KF
— kairosfocus, August 17, 2022 at 04:40 AM PDT
JVL, there is no "just" in that. There are inherent issues and there is a prior, decisive result. KF
— kairosfocus, August 17, 2022 at 04:39 AM PDT
ET: "you are just another willfully ignorant punk. I gave you an example to work on and you choked. So STFU."

I'm learning how Dr Dembski's metric works by starting with simple examples. I'm contemplating the next one and hope to build up to something really complicated, like what you proposed. You are welcome to evaluate his metric for your example if you wish. I would find that very enlightening.
— JVL, August 17, 2022 at 04:23 AM PDT
JVL, you are just another willfully ignorant punk. I gave you an example to work on and you choked. So STFU.
— ET, August 17, 2022 at 04:06 AM PDT
Kairosfocus: "the turnabout projection tactic. Again, kindly refer to say Taub and Schilling as documentation of how negative log probability becomes information, why; and to my notes with Connor to trace the reasoning."

Look, I've just been trying to evaluate Dr Dembski's metric as he wrote it. I did do an example where I left the core intact and one where I broke it apart, and I got the same result. So I will continue to evaluate it in one piece, as I think that's easier. I will no longer even consider your version, because it clearly gives erroneous results for some simple examples. Clearly the problem with your version is replacing pS(T) with something too simple. In fact, the larger the number of coin-flip trials, the more out of whack your version gets. But I'll leave it up to you whether or not you want to revise it.

"Especially as additivity is desired so Itot = I1 + I2, which is decisive, this is a natural metric. Next, you have from the beginning been pointed to the product rule for logs, which directly implies the other two terms compose a threshold, so WmAD's expression is a case of a metric of functionally specific information beyond a threshold of complexity; which makes good sense."

Clearly I understand how to use logs, so maybe you should stop being so condescending. Clearly Dr Dembski's metric is a way of seeing if there is enough specified complexity present for a particular pattern or event, T, to conclude the threshold has been passed, which would justify the design inference. FOR THAT T. You keep making a general argument which no one is arguing against.

"All of that was on the table from the outset, it is a matter of reducing - log2[ . . . ] and yet hundreds of comments later you are still resisting, evading, dismissing, distracting. And, as outlined yet again, simply giving reasonable bounds for the threshold allows us to see that intelligently directed configuration is the only plausible . . . and best . . . explanation for OoL given the FSCO/I in the cell, and of body plans given the further info."

Clearly you do not understand my point, what I am exploring, or what Dr Dembski designed his metric to do.

"Where, the excess simply on info carrying capacity is so large that no plausible degree of redundancy compatible with how elaborate, specific and methodical the cell's mechanisms for protein synthesis etc are. That shouts, high and exacting specificity. That is decisive on the design inference issue. Which seems to be why it is being resisted."

How is exploring a metric written by one of the major intellects behind the modern ID movement being resistant? I want to see how the metric works by checking it against simple, easy-to-understand, easy-to-compute examples. I HAVE NOT argued against the basic idea of there being some threshold of specified complexity which justifies the design inference. I HAVE been seeing how the metric works.

"And frankly, if you struggle with and resist or evade something so basic, that is good reason for me to hold that attempts to go on to other points -- which are actually outlined in part above [e.g. I used WmAD's 10^140 remarks to deduce an implied value and pointed out that tossed coins yield no functional specificity but coins expressing ASCII code would] -- would be even more futile and would end up in a cloud of needless contention."

Dr Dembski himself used coin tossing as an example. If the coin being tossed is not fair, or if some kind of surreptitious manipulation was going on, then we would expect to see something that is not random, or so improbable that there must be some kind of 'design' or influence being exerted. Have you even read his 2005 monograph? Maybe you should look it over again, as it seems that you have misinterpreted it or misremembered it. Or both.
— JVL, August 17, 2022 at 04:06 AM PDT
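For reference, the log algebra both commenters keep circling, written out once as a sketch in the notation of Dembski's 2005 paper (the grouping into a "threshold" term is the reading quoted above):

```latex
\chi = -\log_2\!\left[\, 10^{120}\,\varphi_S(T)\, P(T\mid H) \,\right]
     = \underbrace{-\log_2 P(T\mid H)}_{I(T):\ \text{information in bits}}
     \;-\; \underbrace{\left[\, \log_2 10^{120} + \log_2 \varphi_S(T) \,\right]}_{\text{threshold } Th}

P(T_1 \cap T_2 \mid H) = p_1\, p_2
\;\Rightarrow\;
I_{\mathrm{tot}} = -\log_2(p_1 p_2) = -\log_2 p_1 - \log_2 p_2 = I_1 + I_2 .
```

Design is inferred when chi > 1, i.e. when I(T) exceeds the threshold; the dispute in this thread is over what goes into the phiS(T) part of that threshold, not over the algebra.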
Alan Fox is a willfully ignorant ass. An ID hypothesis has been presented. Alan choked on it.

"All we see are claims that evolutionary biology fails, routinely based on poor understanding of evolutionary theory, followed by 'therefore ID' by default."

Liar. And ignorance thrown in! Learn how to use a dictionary, Alan. Evolution by means of blind and mindless processes was considered. That is the antithesis of default, you lowlife loser.
— ET, August 17, 2022 at 04:04 AM PDT
AF, lying again to set up and knock over a strawman:

>>All we see are claims that evolutionary biology fails,>>

1: No, what is there is a challenge to explain origin of 100 - 1,000 kbases worth of FSCO/I to make the first genome, and 10 - 100+ mn bases dozens of times over to create basic body plans.

2: For which, after 160 years, there are no good blind watchmaker answers.

3: Adaptation to niches and the like comes after body plan origin, where for instance body plans are expressed early in embryological development and random changes [even small ones] strongly tend to be lethal . . . one reason for miscarriages, for instance. Later changes would not have body plan origin character.

4: What we have seen is imposition of a priori evolutionary materialism as ideology by the back door, as Lewontin attested and too many others to mention back up. There is a name for that: ideological captivity.

5: But of course, it is oh so convenient to resort to turnabout projection, as we see shortly.

>>routinely based on poor understanding of evolutionary theory,>>

6: It is Lewontin, Mahner, US NAS/NSTA, Crick, Monod etc. who document the imposition, so if there is a misunderstanding, it is in leading, sometimes Nobel Prize holding, proponents.

7: So, we see here confession by projection to the despised other.

>>followed by "therefore ID" by default.>>

8: Lie.

9: You have been around the ID debates for many years and KNOW, or could readily learn, that the per-aspect design filter defaults to blind chance and/or mechanical necessity, and only infers design per inference to the best empirically supported explanation on observing tested, reliable signs of design.

10: That you are forced to resort to a slanderous strawman caricature is a sign that your own position has the worse case on the merits.

11: Further, you and other inveterate objectors have been present any number of times when abductive logic, the logic of scientific inference to explanation, has been explained over and over again. That you insistently caricature that logic shows want of good faith.

12: Let me give an iconic case, from Lyell's title for vol iii of his Principles of Geology:

PRINCIPLES OF GEOLOGY: BEING AN INQUIRY HOW FAR THE FORMER CHANGES OF THE EARTH'S SURFACE ARE REFERABLE TO CAUSES NOW IN OPERATION. [--> appeal to Newton's Rules, in the title of the work] BY CHARLES LYELL, Esq, F.R.S. PRESIDENT OF THE GEOLOGICAL SOCIETY OF LONDON . . . JOHN MURRAY . . . 1835 [--> later, publisher of Origin]

>>Dembski's math argument follows this pattern.>>

13: An attempt to strawman caricature and dismiss.

14: Pray, tell us, most learned AF, is it true or false that standard analysis and explanation of metrics for information run as:
307 kairosfocus August 7, 2022 at 3:04 pm PS, my old 1971 edition [--> so, I have also a newer edn on my shelves and several similar works] Taub and Schilling, Princs of Communication, p. 415:
consider a communication system in which the allowable messages are m1, m2, . . . , with probabilities of occurrence p1, p2, . . . Let the transmitter select message mk, of probability pk; let us further assume that the receiver has correctly identified the message [–> no channel noise approximation]. Then we shall say, by way of definition of the term information, that the system has conveyed an amount of information Ik = log2 [1/pk] . . . . while Ik is an entirely dimensionless number, by convention, the "unit" it is assigned is the bit.
Of course, log [1/pk] is – log [pk], and if pk and pj apply to two independent messages, 1/pk * 1/pj are such that this makes Itot = Ik + Ij. For pk = 1/2, we get Ik = 1 bit, what a coin flip would give. With a bias, so that say H is more likely and T less likely, the bias reduces the info capacity of the more likely outcome and raises that of the less likely. And so forth. PPS, the link in my note, accessible through my handle, gives more.
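[As a quick numeric check of the Taub and Schilling definition just quoted, a minimal sketch; the helper name is illustrative, not from the textbook:]

```python
import math

def I_bits(p):
    """Taub & Schilling's definition: I_k = log2(1/p_k), in bits."""
    return math.log2(1.0 / p)

print(I_bits(0.5))                # fair coin toss: 1.0 bit
print(I_bits(0.9), I_bits(0.1))   # biased coin: ~0.152 bits (H), ~3.32 bits (T)
# Independent messages multiply probabilities, so information adds:
print(I_bits(0.25), I_bits(0.5) + I_bits(0.5))   # both print 2.0 bits
```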
15: Do, tell, what is the fundamental flaw in Harry S Robertson as he -- highly relevantly -- extends to the informational school of stat thermodynamics, in Thermal Physics, as I clip from my always linked briefing [and onward from my shelves]:
pp. vii - viii: . . . the standard assertion that molecular chaos exists is nothing more than a poorly disguised admission of ignorance, or lack of detailed information about the dynamic state of a system . . . . If I am able to perceive order, I may be able to use it to extract work from the system, but if I am unaware of internal correlations, I cannot use them for macroscopic dynamical purposes. On this basis, I shall distinguish heat from work, and thermal energy from other forms . . . And, in more details, pp. 3 - 6, 7, 36: . . . It has long been recognized that the assignment of probabilities to a set represents information, and that some probability sets represent more information than others . . . if one of the probabilities say p2 is unity and therefore the others are zero, then we know that the outcome of the experiment . . . will give [event] y2. Thus we have complete information . . . if we have no basis . . . for believing that event yi is more or less likely than any other [we] have the least possible information about the outcome of the experiment . . . . A remarkably simple and clear analysis by Shannon [1948] has provided us with a quantitative measure of the uncertainty, or missing pertinent information, inherent in a set of probabilities [NB: i.e. a probability different from 1 or 0 should be seen as, in part, an index of ignorance] . . . . [deriving informational entropy . . . ] H({pi}) = - C [SUM over i] pi*ln pi, [--> weighted sum average, H, entropy is avg info per symbol/case/alternative outcome, tied to how surprising it is] [where [SUM over i] pi = 1, and we can define also parameters alpha and beta such that: (1) pi = e^-[alpha + beta*yi]; (2) exp [alpha] = [SUM over i](exp - beta*yi) = Z [Z being in effect the partition function across microstates, the "Holy Grail" of statistical thermodynamics]. . . . [H], called the information entropy, . . . correspond[s] to the thermodynamic entropy [i.e. s, where also it was shown by Boltzmann that s = k ln w], with C = k, the Boltzmann constant, and yi an energy level, usually ei, while [BETA] becomes 1/kT, with T the thermodynamic temperature . . . A thermodynamic system is characterized by a microscopic structure that is not observed in detail . . . We attempt to develop a theoretical description of the macroscopic properties in terms of its underlying microscopic properties, which are not precisely known. We attempt to assign probabilities to the various microscopic states . . . based on a few . . . macroscopic observations that can be related to averages of microscopic parameters. Evidently the problem that we attempt to solve in statistical thermophysics is exactly the one just treated in terms of information theory. It should not be surprising, then, that the uncertainty of information theory becomes a thermodynamic variable when used in proper context . . . . Jayne's [summary rebuttal to a typical objection] is ". . . The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities . . . which define its thermodynamic state. This is a perfectly 'objective' quantity . . . it is a function of [those variables] and does not depend on anybody's personality. There is no reason why it cannot be measured in the laboratory." . . . . [pp. 3 - 6, 7, 36; replacing Robertson's use of S for Informational Entropy with the more standard H.]
16: Note, surprise, tied to degree of improbability, is informational, and - log probability is a natural info metric, where base 2 gives the value in bits. Try p(head) = 1/2: - log2[1/2] = log2[2] = 1 bit, one binary digit.

17: Where a coin has two states and is a one-bit register, a six-sided die has that many states and an info-carrying capacity of 2.585 bits, a 10-sided DD die has 3.322 bits, 20-state AA chains have a raw capacity of 4.322 bits per monomer, 4-state D/RNA is 2 bits per element, and three-letter codons store 6 bits of capacity. Practical systems have redundancies that reduce effective info, as Abel, Durston et al. worked out.

18: Further, pray, tell us, most learned one, what log [p*q*r] is, but log p + log q + log r, and whether - log [p*q*r] = - {log p + log q + log r} = I[p] - [log q + log r], where p is an information-relevant probability and q and r are numbers, leading to the second bracket being a threshold, Th; where too, once the log is base 2, given that one adds or subtracts like things, such would be in bits.

19: Picking up, once we bound Th to some reasonable blind search threshold, say 500 bits for the sol system and 1,000 for the cosmos, then let us roll the tape:

255 kairosfocus, August 6, 2022 at 3:43 am: AF, you are found continuing to refuse to acknowledge first facts and established knowledge. Let us start: what is a binary digit? ____ Why is it that a p-state per digit register has log p/log 2 bits per character of information storage capacity? _______ Why is it that in a practical code there will normally be a difference of frequencies of states in normal text? Why then does H = - [SUM] pi log pi give an average value of info per character? _______ Why is this called entropy, and why is it connected to physical thermodynamics by the information school? _________ Why can we identify, for an n-length, p-state string, that there are p^n possibilities forming a configuration space? Why is it, then, that for codes to compose messages or algorithmic instructions, or to compactly describe functional states, normally there will be zones of functionality T in a much larger space of possibilities W? ______ We could go on, but that is enough to make a key point clear. KF

PS, it is commonplace in physics that while there are established general laws or frameworks, readily or exactly solvable problems may be few. When I did Q theory, no more than three exactly solved problems existed. This has to do with how fast the complexity of real-world problems grows. Approximate modelling is a commonplace. An old joke captures the point. Drunk A meets drunk B under a streetlight, on hands and knees, searching. "I lost my contacts." So A joins in the search. After a while A asks, "Are you sure you lost them here?" "Oh no, I lost them over in the dark, but this is where the light is." The context was statistical thermodynamics.

293 kairosfocus, August 7, 2022 at 5:06 am: F/N: The point of the above is, it is highly reasonable to use a threshold metric for the functional, configuration-based information that identifies the span beyond which it is highly reasonable to draw the inference, design.

First, our practical cosmos is the sol system, 10^57 atoms, so 500 bits of FSCO/I: X_sol = FSB – 500, in functionally specific bits. Likewise for the observable cosmos: X_cos = FSB – 1,000, in functionally specific bits. And yes, this metric can give a bits-short-of-threshold negative value.

Using my simple F*S*B measure, dummy variables F and S can be 0/1 based on observation of functionality or specificity. For a 900-base mRNA specifying a 300-AA protein, we get X_sol = [900 x 2 x 1 x 1] – 500 = 1300 functionally specific bits. Which is comfortably beyond, so redundancy is unlikely to make a difference. Contrast a typical value for 1800 tossed coins: X_sol = [1800 x 0 x 0] – 500 = –500 FSBs, 500 bits short. If the coins expressed ASCII code in correct English: X_sol = [1800 x 1 x 1] – 500 = 1300 FSBs beyond threshold, so, comfortably, designed. [We routinely see the equivalent in text in this thread and no one imagines the text is by blind watchmaker action.]

A more sophisticated value, using say the Durston et al. metric, would reduce the excess due to redundancy, but with that sort of margin there is no practical difference. Where, in the cell, for first life, just for the genome [leaving out a world of knowledge of polymer chemistry and computer coding etc.] we have 100 – 1,000 kbases. 100,000 bases is 200,000 bits of carrying capacity, and again there is no plausible way to get that below 1,000 bits off redundancy. Life, credibly, is designed.
20: Nowhere did you substantially address such. We can confidently conclude your credibility is further confirmed negative. KF
— kairosfocus, August 17, 2022 at 01:32 AM PDT
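The simple F*S*B threshold metric quoted at point 19 above reduces to one line of arithmetic; a sketch (the packaging and names are mine, the numbers come from the quoted comment):

```python
def x_sol(capacity_bits, F, S, threshold=500):
    """kairosfocus's simple threshold metric, as quoted above:
    X_sol = (capacity in bits) * F * S - 500, where the dummy
    variables F (observed functional) and S (observed specific)
    are 0 or 1, and 500 bits is the solar-system search threshold.
    A positive result is read as bits beyond the threshold."""
    return capacity_bits * F * S - threshold

# 900-base mRNA specifying a 300-AA protein: 2 bits/base, F = S = 1
print(x_sol(900 * 2, F=1, S=1))   # 1300 bits beyond threshold
# 1800 typical tossed coins: capacity present, but neither F nor S
print(x_sol(1800, F=0, S=0))      # -500, i.e. 500 bits short
# The same 1800 coins expressing correct English in ASCII
print(x_sol(1800, F=1, S=1))      # 1300 bits beyond threshold
```

Note that all the discriminating work is done by the observer-supplied 0/1 judgments F and S, which is exactly the point JVL contests further down the thread.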
JVL, the turnabout projection tactic. Again, kindly refer to say Taub and Schilling as documentation of how negative log probability becomes information, why; and to my notes with Connor to trace the reasoning. Especially as additivity is desired so Itot = I1 + I2, which is decisive, this is a natural metric. Next, you have from the beginning been pointed to the product rule for logs, which directly implies the other two terms compose a threshold, so WmAD's expression is a case of a metric of functionally specific information beyond a threshold of complexity; which makes good sense. All of that was on the table from the outset, it is a matter of reducing - log2[ . . . ] and yet hundreds of comments later you are still resisting, evading, dismissing, distracting. And, as outlined yet again, simply giving reasonable bounds for the threshold allows us to see that intelligently directed configuration is the only plausible . . . and best . . . explanation for OoL given the FSCO/I in the cell, and of body plans given the further info. Where, the excess simply on info carrying capacity is so large that no plausible degree of redundancy compatible with how elaborate, specific and methodical the cell's mechanisms for protein synthesis etc. are. That shouts, high and exacting specificity. That is decisive on the design inference issue. Which seems to be why it is being resisted. And frankly, if you struggle with and resist or evade something so basic, that is good reason for me to hold that attempts to go on to other points -- which are actually outlined in part above [e.g. I used WmAD's 10^140 remarks to deduce an implied value and pointed out that tossed coins yield no functional specificity but coins expressing ASCII code would] -- would be even more futile and would end up in a cloud of needless contention. KF
— kairosfocus, August 17, 2022 at 12:34 AM PDT
...anti-ID arguments...
The issue is that until some ID proponent comes up with a positive, scientific, testable hypothesis, there is no anti-ID argument. All we see are claims that evolutionary biology fails, routinely based on poor understanding of evolutionary theory, followed by "therefore ID" by default. Dembski's math argument follows this pattern.
— Alan Fox, August 16, 2022 at 09:52 PM PDT
JVL: "The ID community is strangely silent on a possible specified complexity test that might be of real benefit to their view."

First, there has to be a view.
— Alan Fox, August 16, 2022 at 09:41 PM PDT
JVL at 571, I give a toss, but the math is beyond my skill level, that's all. Dembski uses a mathematical model that is fine for other mathematicians but not for the layman. His other examples for design are compelling and straightforward. I've read his criticisms of recent anti-ID arguments, and they appear to represent not an inability to grasp what he is saying but an outright avoidance, and a sense that entertaining the idea that living things are designed, even for a moment, might be too much for those who have lived with Darwinian ideas for so long.
— relatd, August 16, 2022 at 03:03 PM PDT
Hang on, I just had a thought . . . Is it possible that you are resisting my exploring Dr Dembski's metric from his 2005 monograph because you know it's rubbish and you don't want anyone to find that out? Is that the problem? 'Cause otherwise, seriously, I'd think you'd be behind looking into it. But you're not. Why is that? The ID community is strangely silent on a possible specified complexity test that might be of real benefit to their view. Can anyone explain why they don't seem to give a toss? Anyone?
— JVL, August 16, 2022 at 02:25 PM PDT
Kairosfocus: "It turns out that structurally, WmAD's expression is an example of this, which is so whatever secondary debates may be ginned up over his components of the said threshold."

That's why I'm exploring his metric and your interpretation of it.

"Closely linked, core cell systems (so OOL) and elaborations for body plans both have copious FSCO/I, far beyond the threshold."

But, for a given situation, maybe the threshold is lower? That would be good to know. Yes?

"One, you and other objectors need to face rather than side step."

You still don't seem to grasp what I am doing. I am querying a specified complexity metric to see if a) it gives results in agreement with cases that can be easily evaluated by other means, and b) it is an improvement on other methods of checking for specified complexity. Why you are not supporting my efforts I cannot fathom. Everything I have done in this regard is, potentially, in support of the design inference. But you seem bound and determined to fight me all the way along. Why is that? Why does what I am doing cause you such concern? I'm trying to see how Dr Dembski's metric works, whether it's accurate, whether it's calculable. But you just object, object, object. Why is that?
— JVL, August 16, 2022 at 12:20 PM PDT
JVL, really! I set out to point out a first, core point. That, functionally specific, complex organisation and/or associated information beyond a reasonable threshold strongly points to design as key cause. It turns out that structurally, WmAD's expression is an example of this, which is so whatever secondary debates may be ginned up over his components of the said threshold. Closely linked, core cell systems (so OOL) and elaborations for body plans both have copious FSCO/I, far beyond the threshold. So, there is very good reason to conclude that OoL and origin of body plans both reflect intelligently directed configuration, design. That is a big result and a main result. One, you and other objectors need to face rather than side step. KF
— kairosfocus, August 16, 2022 at 12:02 PM PDT
As should be obvious from the lengthy discussion about Dr Dembski's metric and Kairosfocus's version of it, the problem with the latter is that Kairosfocus's term, K2 (which he says has a 'natural limit of 100'), completely wipes out the importance of Dr Dembski's pS(T); that's the term that involves the semiotic descriptions of the patterns in the sample space. Clearly that function is designed to offset patterns or sequences that are unlikely but not designed, because there is no structured way to describe them. This is a topic Dr Dembski discusses clearly in his monograph. Replacing pS(T) with a limited constant was guaranteed to fail for many cases, even ones well over the accepted threshold of 500 bits. Also, it's important to note that the whole point of the metric was to detect specified complexity even if no functionality is known, a worthy goal it must be said. Whether or not it's actually usable or accurate beyond the simple examples I've been testing it with is a different conversation. Perhaps I can come up with a suitable test case . . .
— JVL, August 16, 2022 at 09:43 AM PDT
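A toy illustration of JVL's complaint, as a sketch (the constant 2^100 stands in for the 'natural limit of 100' mentioned above; everything is computed in log space so long sequences don't overflow floats):

```python
import math

def chi_log(log2_phi_S, log2_p_T_given_H, R=10**120):
    """chi = -log2(R * phi_S(T) * P(T|H)), computed from the
    base-2 logs of phi_S and P(T|H) to avoid float overflow."""
    return -(math.log2(R) + log2_phi_S + log2_p_T_given_H)

# For a random-looking n-toss sequence, the shortest description is
# essentially the sequence itself, so log2 phi_S ~ n, cancelling
# log2 P(T|H) = -n: chi stays pinned near -log2(10^120) ~ -399.
# With phi_S frozen at 2^100, chi instead grows with n, eventually
# flagging plain randomness as designed.
for n in (500, 1000, 2000):
    print(n, round(chi_log(n, -n)), round(chi_log(100, -n)))
# 500   -399     1
# 1000  -399   501
# 2000  -399  1501
```

This is exactly the "more and more out of whack as the number of trials grows" behaviour JVL describes: the verdict should not depend on sequence length alone.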
Alan Fox, in general: You are intelligent except when you don't want to be. You have no credibility. You are only here to say 'I don't understand' when, in fact, you do. You just want to make sure Intelligent Design doesn't get into schools. Politics is god for you.
— relatd, August 16, 2022 at 08:00 AM PDT
To reduce life to proteins is a kind of self-brainwashing, or an imaginary simplification of something very, very complex. The brain likes things to be simple, but a cell is a combination of many complex systems that function simultaneously. Talking about proteins is not talking about life; it's just a childish illusion of "understanding" life. Nobody understands what life is.
— Lieutenant Commander Data, August 16, 2022 at 07:31 AM PDT
Alan Fox:
My understanding, as much as I can tell, of evolution is generally in line with the mainstream. I just invented my name for natural selection to take back the word “design” into general use.
It isn't in line with the mainstream, though. There isn't any evidence that natural selection can design anything. Natural selection is a process of elimination; it is nothing more than contingent serendipity. The mainstream doesn't have any mechanisms capable of producing the diversity of life. Any mechanism relying on a differential accumulation of genetic change is doomed to fail, because such a mechanism is incapable of producing the diversity of life. You are confused because you don't understand biology.
— ET, August 16, 2022 at 05:34 AM PDT
AF, the niche, or rather circumstances, cull; they do not innovate; only chance variation is available. Meanwhile, you are forced to imply an underlying well-behaved continent of function; multi-part, correctly oriented, organised and coupled systems imply isolated islands of function. This is for record: you are in multiple denial. KF

PS, many chemical rxns are available [and the thermodynamically favoured ones are adverse], which is why, for example, protein synthesis requires such elaborate systems and procedures.
— kairosfocus, August 16, 2022 at 03:08 AM PDT
"...the chemical reactions to form a functional protein is never binary."

OK, I've read this several times and can make no sense of it. Can Relatd explain the non-binary aspect of chemical reactions?
— Alan Fox, August 16, 2022 at 01:00 AM PDT
Your personal notion of “NICHE designs” still remains in your imagination.
Not really, Joe. I am not offering a new process in adaptive biological evolution. My understanding of evolution, as far as I can tell, is generally in line with the mainstream. I just invented my name for natural selection to take back the word "design" into general use. Sorry if that confused you.
— Alan Fox, August 16, 2022 at 12:55 AM PDT
Alan Fox:
There is no way, now and for the foreseeable future, that you or anyone else can predict the functionality of an unknown nucleotide sequence (or the potential product it is a template for), simply by doing simplistic counts and transformations of numbers.
So what? Only fools think that is an actual thing. And here you are. Functionality is OBSERVED. Then the information required for it is measured.
— ET, August 15, 2022 at 06:53 PM PDT