Part 1: 00' 00" - 09' 40''
Topics: Introduction, What is information?
Leo Kadanoff: [???] He went on to broader interests in subjects including information theory, philosophy, and parts of biology. The best write-up I could find about him was the Discovery Institute's write-up on the web: "Mathematician and philosopher William A. Dembski is a senior fellow with the Discovery Institute. He has taught at Northwestern University, the University of Notre Dame, and the University of Dallas. He has done postdoctoral work in mathematics at MIT, in physics at Chicago, and in computer science at Princeton. He is a graduate of the University of Illinois, of the University of Chicago, and of Princeton.
His fields include mathematics, physics, and philosophy, as well as theology." We will probably hear only a fraction of those interests today in his talk about the "Creation of Information in Evolutionary Search".
William Dembski: Okay, well, Leo, it is a pleasure to be back here. Leo was my adviser back in '87/'88, along with Patrick Billingsley and [???]. The topic is actually "Conservation of Information in Evolutionary Search"; that is what I want to speak about.
Leo Kadanoff: I said creation! [???]
William Dembski: I'm called a creationist enough, so I make that distinction when I can. What I will describe is the work that I have done with the Evolutionary Informatics Lab - this is their website.
William Dembski: The key person there who runs the lab is Robert Marks. He was for twenty-five years on the faculty of the University of Washington. His field was computational intelligence; he is one of the creators of that field, which includes evolutionary computing, neural networks, and fuzzy logic. He has been at Baylor for about ten years, and we started collaborating about a decade ago, but it really came to a head around 2007, and we have been publishing in this area since about 2009. So, what I will describe in this talk is really the theoretical work which came out of these three papers.
William Dembski: "Conservation of Information in Search: Measuring the Cost of Success", that was a IEEE publication, then the next paper "The Search for a Search", that was a Japanese journal on computational intelligence, and the last that is [???], that was a conference proceeding. So, anyway, what I would like to do is talk about, just go through the key-words in the titles. Let's start with information.
William Dembski: What is information? We live in the information age, right?
William Dembski: But the statement that I came across years ago - actually in a philosophy course - which to me really puts it best is the following quote from a philosopher at MIT, Robert Stalnaker, in his book "Inquiry" (1984): "To learn something, to acquire information, is to rule out possibilities. To understand the information conveyed in a communication is to know what possibilities would be excluded by its truth." This, for me, has captured what is most crucial about information. So, if you want a definition, here is how I would define it: "Information is the realization of one possibility to the exclusion of others within a reference class of possibilities." [???] I want to round this up.
William Dembski: I just want to add: it is one thing to say, "okay, this is what information is", but if you want to do science, especially if you want to do exact science, you have to be able to measure information. And how do you measure information? Well, you measure it by probabilities. The smaller the probability, the greater the information. Now, information theory adds to that: it usually applies a logarithmic transformation to the probabilities, and it takes averages - that is very common in communication theory - [???] and it does other transformations as well, integrals, powers, and things like that. But at its core, information is measured in probabilities, so let me say something about that. But before I elaborate on the definition and some measurements, I want to give you another way of thinking about information: as a decision.
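(As an aside, the arithmetic here is easy to make concrete. The following is a minimal Python sketch of self-information and its average, the Shannon entropy; the probabilities are invented for illustration and are not from the talk.)

```python
import math

def self_information(p: float) -> float:
    """Self-information in bits: the smaller p, the larger the information."""
    return -math.log2(p)

def entropy(distribution: list[float]) -> float:
    """Average self-information over a distribution (Shannon entropy)."""
    return sum(p * self_information(p) for p in distribution if p > 0)

# Rain in Chicago vs. rain in the Sahara (probabilities are made up):
print(self_information(0.3))    # ~1.74 bits - fairly unsurprising
print(self_information(0.001))  # ~9.97 bits - much more information

# A fair coin carries one bit per toss on average:
print(entropy([0.5, 0.5]))      # 1.0
```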
William Dembski: "Decision" and "homicide" come from the same Latin word, "caedere" - to kill, to slay, to cut off. Just as a homicide kills somebody, a decision withdraws options, rules out possibilities. The reason I give this is that I am trying to massage your intuitions: a decision is something active. Often, when we think of information, we point to something and say: there is an item of information. There is a sense in which items of information have validity, but fundamentally, I think, information is more of a verb than a noun. I show this in my next slide. If we think of information as a decision, then information becomes in the first instance [???] an act rather than an item. So when we speak about an item of information, we keep in mind the act that produced it. Let me give you some examples...
William Dembski: Let's say I tell you it is raining outside. What have I done? Well, I've excluded that it is not raining outside. So I have actually given you some information. If I say it is raining outside or it is not raining outside, have I given you any information? Well, I haven't ruled anything out. But what is the reference class there? It is the weather - the weather that is outside. Now, what if I put that in quotes: "it is raining outside"? Now it is a symbol string that is being communicated across a communication channel. In that case the reference class is going to be other symbol strings that might be competing with it. In that case, "it is raining outside or it is not raining outside" - now with the quotes - becomes another symbol string that could be sent across a communication channel.
William Dembski: It would actually contain more information, because it is longer: it is more improbable, it is harder to reproduce the same symbol string. So what constitutes information is going to be, in a sense, context [???]; the context is the reference class in which you are considering it. If I say "it is raining outside", what about measuring that probability? If I say that in Chicago - it rains here some, so maybe with a certain probability. If I tell you in the Sahara desert "it is raining outside", that is going to be much more improbable; there will be much more information conveyed in that. In terms of the measurement of information, this is how information theorists do it: think of, for instance, a poker hand. If I tell you "this is a hand which has a pair", or "two pairs" - there are a lot of different poker hands, about 2.6 million poker hands. But if I tell you "royal flush", that narrows it down quite a bit. The range of possibilities becomes more constricted, it is more improbable, and there is more information. We are doing some basics here, but this is at a more general level than you would get it in an information theory book, which tends to look at symbols, strings, and trying to get them [???] across a communication channel.
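(The poker example can be checked numerically. This little sketch uses only the standard counts: C(52,5) = 2,598,960 five-card hands, 4 royal flushes, and 1,098,240 one-pair hands.)

```python
from math import comb, log2

def info(p: float) -> float:
    """Self-information in bits."""
    return -log2(p)

total_hands = comb(52, 5)              # 2,598,960 five-card hands
p_one_pair = 1_098_240 / total_hands   # standard count of one-pair hands
p_royal_flush = 4 / total_hands        # one royal flush per suit

print(f"'a pair':      {info(p_one_pair):.2f} bits")     # ~1.24 bits
print(f"'royal flush': {info(p_royal_flush):.2f} bits")  # ~19.31 bits
```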
William Dembski: Now, what is communication in that case? I would define communication as the coincidence or correlation of two acts or items of information. Look at Shannon's original diagram in his "Mathematical Theory of Communication" from 1949: you have basically a source and a receiver, and then you have some act of information here which will be mirrored in some way over there. We do this all the time; we see this sort of set-up when I am sending an email. There will be some symbol strings from my keyboard that are getting encoded in a certain way, there will be some transport protocols, there will be use of error correction, and it will be moved along until it ends up on your computer. This process happens several times; there will be multiple - if you will - acts of information happening.
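(To make the source-receiver picture concrete: here is a toy model of a noisy channel with a simple repetition code for error correction. It is my own illustration of the general set-up, not Shannon's diagram or anything from the talk.)

```python
import random

def encode(bits, n=3):
    """Repetition code: send each bit n times (a crude error-correcting code)."""
    return [b for bit in bits for b in [bit] * n]

def noisy_channel(bits, flip_prob=0.1):
    """Flip each transmitted bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits, n=3):
    """Majority vote over each block of n repetitions."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

source = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(noisy_channel(encode(source)))
print(source, received, source == received)  # usually True despite the noise
```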
William Dembski: It is interesting to look at the history: Shannon's original concern in coming up with his account of communication was the "transmission of intelligence". That is an exact quote, released [???]. I think that was even in his undergraduate papers.
In my opinion, there are some problems already in this part of the talk. Some can only be spotted with some knowledge of William Dembski's publications; others should be spotted by an audience just generally interested in information theory. For example:
- William Dembski is talking only about information of Shannon's type. This seems to be a very narrow approach.
- William Dembski is well aware of the problems with his paper "The Search for a Search: Measuring the Information Cost of Higher Level Search", see for example Tom English's The theorem that never was: Diversionary “erratum” from Dembski and Marks. Dembski knows that there is no valid proof for one of the main theorems in this paper (his grandiosely named Horizontal No Free Lunch Theorem), but he chose to ignore this fact, even deleting an erratum without further comment. And then he presents this paper to a less informed audience as one of the three "Key Publications on CoI"!
- And one amusing thought: "It is raining outside". Who creates this information? The intelligent observer William Dembski or the unintelligent weather in Chicago, which realized the possibility of raining?
==========
And one amusing thought: "It is raining outside". Who creates this information? The intelligent observer William Dembski or the unintelligent weather in Chicago, which realized the possibility of raining?
==========
Thanks for posting this! The question you ask is a good one, I bet they will answer it about the time they explain whether or not modified gene duplicates with new functions constitute new information, and if so, how much.
I don't think that Shannon's concept of information serves them well. As for getting an answer: they are quite interested readers, but they seem to have given up on direct exchanges with their critics. I don't think that Dembski (or Marks) will officially acknowledge anything which isn't presented as a peer-reviewed paper. The next section shows that they react, but don't retract...
Dembski makes it clear that he wants to measure physical information. If he's going to do that in terms of probability, then the probability must be physical, i.e., chance. He can't be talking about epistemic probability or subjective probability (credence). And he has written a lot about chance. So the information is "out there."
DiEb, I'm with the philosophers who say that it is incoherent to assign physical probabilities to properties of the Universe (the whole of physical reality). Others say that the chance of an event that has occurred is 1. Either way, Dembski is left in the lurch when it comes to the chance of life in the Universe.
More to the point than "it is raining outside" is "Alice has no grandchildren, and Bob has two." Difference in reproductive success is physical information by Dembski's reckoning. For some bizarre reason he mentions the etymological link between decision and homicide. Well, let's ask opportunistically how the fitness function he (later) says is input to the "search" accounts for two cases in which Alice killed babies growing within her.
I shouldn't post when bleary-eyed. I'm trying to say that there is no fitness function in reality. It is an abstraction of differential reproduction, useful in modeling under certain restricted circumstances. Dembski reifies the abstraction. I'm going to bail, and let Ernst Mayr explain differential reproduction and natural selection without referring to fitness. (One good transcript deserves another.)
Here, where survival and differential reproduction are concerned, anything but blindness prevails. We have a proverb that is applicable here, "Nothing succeeds like success," and that is the secret of natural selection. Success, in this case, means leaving offspring. But what is it that determines this success? If success were determined by blind chance, as are most processes that lead to genetic variation, we would not be justified in speaking of natural selection, for selection implies discrimination. But, and this is the cornerstone of evolutionary theory since Darwin, it is justifiable to refer to differential reproduction as natural selection because individuals differ from each other in their genetic endowment, and it is, at least in part, the nature of this genetic endowment that determines reproductive success.
[...]
I hope that this discussion has made clear how unfortunate such terms as "struggle for existence" or "survival of the fittest" are, because they tend to distract our attention from the central aspect of the phenomenon [not mechanism!] of natural selection, its purely statistical nature. Anything adding to the probability of survival and reproductive success will automatically be selected for.
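(Mayr's point - that selection is the statistical outcome of differential reproduction, with no fitness function consulted anywhere - can be shown with a toy simulation. All numbers below are invented for illustration.)

```python
import random

# Two heritable types; type 'A' survives to reproduce slightly more often.
# No "fitness function" is consulted anywhere: what we call selection is
# just the statistical outcome of differential reproduction.
SURVIVAL = {"A": 0.55, "B": 0.45}   # invented survival probabilities

population = ["A"] * 500 + ["B"] * 500
for generation in range(50):
    survivors = [ind for ind in population if random.random() < SURVIVAL[ind]]
    offspring = survivors * 2            # each survivor leaves two offspring
    random.shuffle(offspring)
    population = offspring[:1000]        # limited resources cap the population

print(population.count("A") / len(population))  # 'A' almost always takes over
```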
"To learn something, to acquire information, is to rule out possibilities. To understand the information conveyed in a communication is to know what possibilities would be excluded by its truth." -- Robert Stalnaker
Essentially a quote mine, as this is a logical, not probabilistic, approach to information. Stalnaker is referring to the elimination of possible worlds.
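(The contrast is easy to state: in the logical picture a proposition just is a set of possible worlds, and learning it means intersecting your set of live possibilities with it - no probabilities involved. A minimal sketch over a made-up toy set of worlds:)

```python
# Worlds are labeled by (precipitation, temperature); a proposition is the
# set of worlds in which it is true. Learning = intersecting, i.e., ruling
# worlds out.
worlds = {("rain", "cold"), ("rain", "warm"), ("dry", "cold"), ("dry", "warm")}

it_is_raining = {w for w in worlds if w[0] == "rain"}
tautology = it_is_raining | {w for w in worlds if w[0] != "rain"}  # all worlds

live = worlds & it_is_raining   # two worlds ruled out: informative
print(live)                     # {('rain', 'cold'), ('rain', 'warm')} (any order)

live = worlds & tautology       # nothing ruled out: no information
print(live == worlds)           # True
```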
I have read nothing of Robert Stalnaker: philosophy isn't my forte - it seems to be like mathematics, but lacking proper definitions :-)
I've done a bit of reading in philosophy of information, just so I'll have some idea of what the "Isaac Newton of information theory" fails to mention. I recognized the logical approach, and looked up Stalnaker in Wikipedia to double-check.