The Failures of Mathematical Anti-Evolutionism
Information and Combinatorial Search
Jason Rosenhouse
6.1 What Is Information?

The anti-evolution arguments of the previous chapter asserted that evolution could not account for the complexity of biological adaptations, and they used probability theory to bolster that conclusion. Those arguments failed because we never have enough information to carry out meaningful probability calculations of the sort the anti-evolutionists require.

The arguments in this chapter take a different approach. They focus not on the complexity of adaptations, but instead on the genetic instructions leading to their assembly. These instructions constitute “information,” it is said, and neither evolution nor any other naturalistic theory can account for it because natural forces only degrade information. It is claimed that where novel information is found, it must have originated ultimately from an intelligent source. Unsurprisingly, the branch of mathematics used to bolster this argument is “information theory.”

Following our previous discussions, we will start with track one considerations about information before taking a brief look at track two. As always, it will not be a problem to skim over the track two details. I will need to make a few statements about exponents and logarithms that may be unfamiliar to you, or which might bring back bad memories from a high school math class. These details provide some useful context and background, but they are not necessary for following the flow of the argument.

Now, “information” is a hard concept to pin down since it has different meanings in different contexts. In everyday conversation it refers to a meaningful message. You ask your colleague in the next office what time the meeting is. He replies, “It’s at 2:00.” Information has been sent from a sender, your colleague, to a receiver, yourself. Understood in this sense, “information” is a hopelessly subjective concept. For example, you might look at a page written in a foreign language and say, “These words mean nothing to me.” The words do not become information until there is a receiver capable of interpreting them.

In light of all this subjectivity, how can we develop a mathematical treatment of information? Mathematical analysis requires precision and objectivity, both of which seem to be lacking in our everyday understanding of the term. The answer is that even in the everyday sense there is an aspect of information that can be quantified. We all have a sense that we learn more from unexpected statements than we do from mundane statements. If a child is offered a free choice between candy and broccoli, it is not very informative to learn they chose the candy. It is much more informative to learn that they chose the broccoli. Apparently, surprising statements are more informative than mundane statements, and that is what we try to quantify when we pass to track two.

How do we assign a number to how surprised we are by a particular piece of information? Probability theory seems like a useful tool here. A statement is surprising precisely when it has a low probability of being uttered, and a statement is mundane when it has a high probability. Mathematically speaking, we might then say that information content is something possessed by an event in a probability space. This is progress, but how do we quantify information? To answer that question, we first ask what properties we want our measure to have.
The first property we have already discussed: Low probability should correspond to high information and vice versa. A second criterion is that we probably do not want to speak of negative information. While misinformation is a serious societal problem, especially in the political realm, we should probably ignore it for the purposes of our mathematical theory. Thus, our information measure should never take on a negative value. Zero information content is the smallest amount there is.

There is one more property that seems natural: The amount of information you get from two independent events should be the sum of the information content from each one individually. In other words, imagine that you learn a certain amount of information from one event, and then later you learn another amount of information from a completely separate event. Our measure should be such that the amount of information you learn from both events combined ought to be the sum of the two separate amounts.

Summarizing, we want our information measure to have these three properties:

1. Information content should never be negative.
2. Low probability should correspond to high information, and vice versa.
3. The information content from two separate events should be the sum of the information content of each event separately.

A mathematician will quickly notice that this list perfectly describes the logarithm function (with the proviso that we multiply by −1, for reasons that will become clear momentarily). This leads us to the following definition: Suppose that E is an event in a probability space. Let I(E) denote the information content of E, and let P(E) denote the probability of E. Then we define I(E) by the following equation:
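A measure with all three of these properties, and the standard definition used throughout information theory, is

I(E) = −log₂ P(E),

where taking the logarithm to base 2 is the conventional choice; it makes the unit of information the “bit,” and any other fixed base would simply rescale the measure by a constant factor. A quick check against the three properties: since P(E) always lies between 0 and 1, log₂ P(E) is never positive, so I(E) is never negative. An event that is certain, with P(E) = 1, carries 0 bits of information, while a fair coin toss, with probability 1/2, carries −log₂(1/2) = 1 bit, so lower probability does indeed mean more information. And if E and F are independent events, then P(E and F) = P(E)P(F), and the logarithm turns that product into a sum, giving I(E and F) = I(E) + I(F), which is exactly the additivity property.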