The Failures of Mathematical Anti-Evolutionism
Jason Rosenhouse
The information content of an event E, whose probability we denote P(E), is defined to be

I(E) = −log₂(P(E)).

We measure information in "bits" (in the same sense that we might measure length in inches or volume in gallons). No doubt you learned about logarithms in a high school mathematics class, but let me refresh your memory about how they work. The expression "log₂(x) = y" is precisely equivalent to the statement that 2 raised to the power y equals x. In other words, to evaluate a base 2 logarithm, you ask yourself, "2 to what power will give me x?" For example, we have that log₂(4) = 2 because 2² = 4. We also have that log₂(32) = 5 because 2⁵ = 32.

Let us see how this relates to our definition of information. First, suppose that E is certain to occur, meaning that it has probability 1. We would then compute that

I(E) = −log₂(P(E)) = −log₂(1) = 0,

where the last part of the equation follows from the fact that 2⁰ = 1. This makes sense. If an event is certain to occur, then no information is learned when it actually occurs.

Now suppose we flip a fair coin. How much information do we learn when we find out the coin landed heads? In this case, there are two equally likely outcomes. If we let H denote the event of getting heads, then we compute:

I(H) = −log₂(P(H)) = −log₂(1/2) = 1,

where this time the last part of the equation follows from the fact that 2⁻¹ = 1/2. Since probabilities are fractions, their logarithms are negative, and this is why we had to multiply the logarithm by −1. (Recall that we want information content to be positive.) Thus, you learn one bit of information when you find out the coin landed heads. More generally, you get one bit of information from learning the result of any 50/50 experiment.

Finally, what happens if E is impossible, meaning it has probability 0? In this case we would compute:

I(E) = −log₂(P(E)) = −log₂(0).

Now we have a problem. If we ask "2 to what power gives us 0?" then the answer is that there is no such power. You cannot raise 2 to a power and end up with 0. Therefore, it would seem that the information content is undefined in this case. This, too, fits well in our scheme, since it implies there is no possibility of learning any information from an event that cannot possibly occur.

These examples show that our logarithmic measure of information works rather well. What we have seen to this point are the first steps toward a useful mathematical theory of information.

The first to reason along these lines was Claude Shannon, who pioneered the study of information in a famous 1948 paper called "A Mathematical Theory of Communication" (Shannon 1948). He was studying certain practical problems regarding the efficient transmission of information over a channel such as a telegraph wire. With further development, his approach allows you to answer questions like, "What is the maximum amount of information that can be transmitted across this channel?" or "How can I compress this information for transmission without losing anything?" However, like all great ideas, Shannon's work quickly found applications in areas far removed from anything he considered. For example, Shannon published his ideas at roughly the same time that the first electronic computers were being built. With the advent of computer science as a serious discipline, questions about the efficient manipulation and storage of information took on considerable importance. Another domain of application is biology, as we will now discuss.
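Before turning to biology, a small computational illustration may help. The following is a minimal Python sketch, not from the book, that reproduces the three cases worked out above; the function name information_content is an arbitrary choice for this example.

import math

def information_content(p):
    """Bits of information learned when an event of probability p occurs,
    using the measure I(E) = -log2(P(E)) described above."""
    if p == 0:
        # An impossible event: no power of 2 equals 0, so I(E) is undefined.
        raise ValueError("information content is undefined for probability 0")
    return -math.log2(p)

print(information_content(1.0))   # certain event: 0 bits (may display as -0.0)
print(information_content(0.5))   # fair coin lands heads: 1 bit
print(information_content(0.25))  # one of four equally likely outcomes: 2 bits

Running the sketch reproduces the arithmetic above: a sure thing carries no information, a fair coin flip carries one bit, and asking about an impossible event raises an error rather than returning a number.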
6.2 information theory and biology

Information talk is commonplace in biology, and it is not hard to see why. It is difficult to say anything about genes or DNA without in some way using concepts related to information. Genes are said to contain the instructions for making an organism. The cell is said to contain machinery for translating and transcribing the information in DNA. We commonly speak of the genetic code. Different gene sequences that code for the same amino acid or protein are said to be synonymous, implying that they have the same meaning.

In light of our remarks in Section 6.1, this might seem a little strange. Information, both in the everyday sense and in the mathematical sense, is closely tied to the idea of there being a sender, a receiver, and a transmission channel. Is there anything playing those roles in the biological context? Moreover, information in human affairs seems intimately related to communication, but what is the analog of this in biology? After all, DNA does not know it contains information. At the cellular level, everything just plays out according to the principles of physics and chemistry. Scientists have a pretty good understanding of the physical processes that transform a genotype into a phenotype, and you can understand these processes perfectly well without making any reference to information.

We might then wonder whether information is genuinely a fundamental concept in molecular biology or is instead just a useful source of metaphors. This question has been the cause of considerable controversy among scientists and philosophers for many years. In a 1995 article, biologists Eörs Szathmáry and John Maynard Smith took the former view:

    A central idea in contemporary biology is that of information. Developmental biology can be seen as the study of how information in the genome is translated into adult structure, and evolutionary biology of how the information came to be there in the first place. Our excuse for writing an article concerning topics as diverse as the origins of genes, of cells and of language is that all are concerned with the storage and transmission of information.