1.14.2006

The Big Splurtz

To me, the most intriguing question about how we got here is the apparent violation of the second law of thermodynamics. That law just means that things tend to get messier over time, not more orderly. That's why you never see a bunch of ants putting a beetle back together. So how in the heck does something as mind-bogglingly complex as ourselves come about? Note that as soon as we are here, we start to fall apart. Only by extraordinary means, like being sequestered in the womb, can we avoid the boo-boos that start to add up and turn us back into raw material.

Eric Chaisson sets his sights on this problem in "The great unifier" in this week's New Scientist. There's an associated website that is VERY well done in Flash. The argument is that complexity can be quantified by energy rate density: the energy flowing through a system per unit time per unit mass. He speculates that optimization of these energy flows might be the grand principle behind the evolution of stars, planets, and the whole shebang.
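Out of curiosity, I ran his metric through a quick sanity check in Python. The diet and body mass below are my own round numbers, not figures from the article, but the answer lands in the tens-of-thousands of erg/s/g, which is roughly the ballpark Chaisson gives for living bodies, if I recall his tables correctly:

```python
# Back-of-envelope check of Chaisson's energy rate density (erg/s/g)
# for a human body. The 2800 kcal/day diet and 70 kg mass are my own
# illustrative assumptions, not values from the article.

KCAL_TO_ERG = 4.184e10    # 1 kcal = 4184 J, and 1 J = 1e7 erg
SECONDS_PER_DAY = 86_400

def energy_rate_density(kcal_per_day: float, mass_g: float) -> float:
    """Energy flow per unit time per unit mass, in erg/s/g."""
    erg_per_second = kcal_per_day * KCAL_TO_ERG / SECONDS_PER_DAY
    return erg_per_second / mass_g

phi = energy_rate_density(kcal_per_day=2800, mass_g=70_000)
print(f"human body: ~{phi:,.0f} erg/s/g")   # ~19,400
```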

I wondered if all energy flow qualifies, or just certain kinds. For example, adiabatic processes (which can in principle be reversed, if I remember my thermodynamics correctly) are different from irreversible events. This is the crux in computing, in fact. An irreversible process necessarily produces a certain amount of heat (energy). For example, if you compute 1 + 2 = 3 and store only the result, you have lost information about where the 3 came from; it could have been 0 + 3, for example. This seems trivial, but it's crucial to computer design. A flip-flop is a logic element that holds a single bit of information. It has no memory of past states, so it destroys information routinely every time it is overwritten. This generates heat.
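To put a number on that heat: Landauer's principle says erasing one bit dissipates at least kT ln 2 of energy. Here's a little Python sketch of both halves of the argument; the 300 K figure is just an assumed room temperature:

```python
import math

# Landauer's principle: erasing one bit of information dissipates at
# least k*T*ln(2) of heat. This is the physical floor behind the
# "irreversible computation produces heat" claim.

K_BOLTZMANN = 1.380649e-23   # J/K (exact, by the 2019 SI definition)
T_ROOM = 300.0               # K, an assumed room temperature

landauer_limit = K_BOLTZMANN * T_ROOM * math.log(2)
print(f"minimum heat per erased bit at 300 K: {landauer_limit:.2e} J")
# ~2.87e-21 J

# The information loss itself: many input pairs map to the same sum,
# so knowing only the stored result 3 cannot tell you which pair
# produced it.
preimages = [(a, b) for a in range(4) for b in range(4) if a + b == 3]
print(preimages)   # four indistinguishable histories for one output
```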

Believe it or not, you can theoretically build a computer that doesn't destroy information (or one that carries the information somewhere else and destroys it there, with that spot acting as a heat sink). A guy named Fredkin wrote a paper that describes how.
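For the curious, the logic element at the heart of that paper (written with Toffoli, "Conservative Logic") is a controlled swap, and it's easy to sketch. This little Python version is my own toy illustration, not Fredkin's notation:

```python
# The Fredkin gate (controlled swap): if the control bit c is 1, the
# other two bits swap; otherwise everything passes through unchanged.

def fredkin(c: int, a: int, b: int) -> tuple[int, int, int]:
    return (c, b, a) if c else (c, a, b)

# Reversible: the gate is its own inverse, so no information is destroyed.
for bits in [(0, 0, 1), (1, 0, 1), (1, 1, 0)]:
    assert fredkin(*fredkin(*bits)) == bits

# Universal: with the third input fixed to 0, the third output is c AND a.
# The "garbage" outputs carry away the information an ordinary AND gate
# would have erased (and turned into heat).
for c in (0, 1):
    for a in (0, 1):
        _, _, and_out = fredkin(c, a, 0)
        assert and_out == (c & a)
print("Fredkin gate: reversible, and computes AND without erasure")
```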

The point is: is it just energy flow, or does the quality of the energy flow have to correspond to information flow? In that case, the definition of complexity would be "information flow per unit mass". That sounds a lot like computing, doesn't it? So I asked Dr. Chaisson by email.

As it turns out (he writes in his email to me), he spent quite a long time trying to figure out how to empirically measure information, but eventually decided that there wasn't even a good definition of it. There's a lot written about 'it from bit', touting the informational nature of the universe. Apparently there are practical difficulties in applying those ideas. Chaisson has written a book on the subject: Cosmic Evolution.

One of the reviewers on Amazon.com drew a moral or at least humanistic motivation from the ideas in the book. That's interesting to me because the same question drove me to write Canman, and I ultimately came to similar conclusions.
