The Failures of Mathematical Anti-Evolutionism
Part of the answer is that information is fundamentally imma- terial. Information might be expressed in some material form, but it is immaterial nevertheless. The same information can be carved in stone, written in ink on paper, shouted across a room, or stored on a 6.3 how evolution increases genetic information 171 hard drive, but the information itself is independent of its substrate. Ink on a page is governed by the same physical principles regardless of whether it expresses meaningful words or meaningless scribbles, but the former contains information while the latter does not. That information is immaterial warms the hearts of anti- evolutionists. As they see it, information is a fundamental attribute of living systems, and since it cannot be understood in purely material terms, it forces us to introduce a nonmaterial dimension. The rest of the answer is that a track one conception of informa- tion is all about sending messages between senders and receivers, and such activities are only engaged in by intelligent agents. Shannon’s famous paper was called, “A Mathematical Theory of Communica- tion,” and not, “A Mathematical Theory of Things Natural Forces do on Their Own.” If genes contain information, and if information is about communication between intelligent agents, then this suggests that genetic information in some way represents the will of an intelligent agent. Consequently, information talk is ubiquitous in the anti- evolutionist literature. They are constantly challenging scientists to explain the origin of novel genetic information. Sometimes they are really nasty about it. For example, here is Phillip Johnson, from his 2000 book The Wedge of Truth: If the evolutionary scientists were better informed or more scientific in their thinking, they would be asking about the origin of information. The materialists know this at some level, but they suppress their knowledge to protect their assumptions. (Johnson 2000, 167) This is another of those times where, if you possess any skeptical impulses at all, they should be triggered by the thought of biolo- gists needing a law professor to tell them how to do their jobs. We should also note the rhetorical trickery involved in transforming the evolutionary scientists of the first sentence into the materialists of the second. When carrying out their professional work, evolutionary 172 6 information and combinatorial search scientists have no interest in materialism, or in any other metaphys- ical viewpoint for that matter. Anti-evolutionists see themselves as great philosophers of infor- mation, and they are constantly boasting of their own contributions to its conceptual development. In 2013, they published a large volume called, Biological Information: New Perspectives, which was the proceedings of a conference held primarily to showcase the ID view of this subject (Marks et al. 2013). The old perspective, in their telling, is that evolution has little trouble explaining the origin of novel genetic information in terms of known physical mechanisms. As they see it, they have shown mathematically that this view is untenable. In Section 5.6, we encountered William Dembski’s notion of “complex, specified information,” by which he just meant improbable things that fit a pattern. You might have wondered why he used the term “information” in his writing. If “probability” was what he meant, then he could simply have said that and not have used the term “information” at all. He explained his word choice in his 2004 book The Design Revolution. 
He writes:

    Specified complexity (or complex specified information, as it’s also called) is therefore a souped-up form of information. To be sure, specified complexity is consistent with the basic idea behind information, which is the reduction or ruling out of possibilities from a reference class of possibilities. But whereas the traditional understanding of information is unary, conceiving of information as a single reduction of possibilities, specified complexity is a binary form of information. Specified complexity depends on a dual reduction of possibilities, a conceptual reduction (i.e., conceptual information) combined with a physical reduction (i.e., physical information). (Dembski 2004, 137–138)

We have already seen that scientists have good reasons for finding Dembski’s framework to be unworkable in practice. Do also note the presumption involved in the first sentence, where we are casually informed that specified complexity is known by more than one name. Since scientists do not use any of this terminology in the particular, idiosyncratic way that Dembski uses it, he is the only one assigning any names at all to these concepts.

Biologists find this anti-evolutionist obsession with information a little strange. Information theory has played a role in evolutionary biology since at least 1961, when geneticist Motoo Kimura published a famous paper using Shannon’s conception to quantify the growth of genetic information in the course of evolution (Kimura 1961). Moreover, it is clear that there is some sense in which genetic information has increased in the course of natural history. The earliest life forms had small genomes that coded for simple organisms. Modern life forms have large genomes that code for complex organisms. It seems hard to believe that scientists would have overlooked something so obvious, and, in fact, they have not. The reality is that as soon as a precise definition of “information” is provided, it is never difficult to explain information growth in the course of evolution.

For example, suppose we think of genetic information in Shannon’s sense. We could argue that each nucleotide on a string of DNA is chosen from among four possible bases. If we treat these four possibilities as equally likely, then we can say that each base conveys two bits of information (since log₂ 4 = 2). This is a slight oversimplification since the four bases are actually not equally likely, but this detail is not important for our argument.

Viewed in this way, the problem of information growth is really the problem of creating new genes. The solution to the problem is found primarily in a well-known process called gene duplication. As the name suggests, it sometimes happens during DNA replication that a stretch of genetic material gets duplicated. Literally, you end up with two copies of a gene where previously you only had one. The two copies can then diverge, with the result being more genes at the end than you started with. This is plainly an increase in genetic information. There is nothing speculative or cutting edge about this. Gene duplication is a common and well-understood process.
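To make the two-bits-per-base arithmetic concrete, here is a minimal sketch in Python. The toy sequence, the mutation count, and the helper names (bits_uniform, mutate) are illustrative inventions, not anything drawn from the book or from Kimura’s paper; the only assumption the sketch encodes is the simplification made above, that the four bases are treated as equally likely, so each nucleotide contributes log₂ 4 = 2 bits.

    import math
    import random

    # Information content of a DNA sequence under the simplifying assumption
    # used in the text: all four bases equally likely, so each base carries
    # log2(4) = 2 bits.
    def bits_uniform(seq):
        return len(seq) * math.log2(4)

    # A toy "ancestral" gene; the sequence itself is made up for illustration.
    ancestral_gene = "ATGGCCATTGTAATGGGCCGC"
    genome = [ancestral_gene]
    print(bits_uniform("".join(genome)))   # 21 bases -> 42.0 bits

    # Gene duplication: a stretch of genetic material is copied during
    # replication, leaving two copies where there had been one.
    genome.append(ancestral_gene)

    # Divergence: one copy accumulates point mutations over time.
    def mutate(seq, n_mutations, rng):
        bases = "ACGT"
        seq = list(seq)
        for _ in range(n_mutations):
            i = rng.randrange(len(seq))
            seq[i] = rng.choice([b for b in bases if b != seq[i]])
        return "".join(seq)

    rng = random.Random(0)
    genome[1] = mutate(genome[1], 5, rng)

    # The genome now holds two (no longer identical) genes, so under the
    # uniform measure its total information content has doubled.
    print(bits_uniform("".join(genome)))   # 42 bases -> 84.0 bits

With realistic, unequal base frequencies the per-base figure would instead come from the usual Shannon formula, H = −Σ pᵢ log₂ pᵢ, which for four symbols is at most 2 bits; the qualitative point, that duplication followed by divergence leaves you with more genetic information than you started with, is unaffected.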