Thursday, October 29, 2009

Filling the Cosmic Disk

It seems counter-intuitive that the Universe, infinite in Space, is finite in Time.

Now that we know that the Universe is expanding, creating Space as it goes, and that all the energy in the Universe will eventually be degraded into heat as everything grows colder and colder, Mankind is forced to admit that the Universe will eventually end. The process of everything becoming heat is called Entropy, defined as the tendency for things to uncomplicate, fall apart, decay and unravel, right down to the very atoms of matter and the forces that hold them together.

We also know, from repeated attempts to create the Perpetual Motion Machine, that you can't beat Entropy. It is the ultimate City Hall, a cosmic bureaucracy so rigid in policy and practice that, although you might be able to side-track the process long enough for life to evolve, intelligence to exist (so as to comprehend Entropy) and civilizations to arise, ultimately Entropy will reassert itself and everything will continue on the downward slide to the cold-dark-universe model.

The question is, philosophically speaking, WHY does Entropy always win? Why does time only run in the direction of decay and darkness? Why was God able to create this whole fantastic wonderful profusion of glory simply by speaking it? The answer lies in some very complex and uncomfortable mathematics. I won't insult you by pretending I actually understand the mathematics...I am trusting that the baffling equations do in fact say what people claim they say!

Entropy can be described mathematically. Alas, the equation that defines Entropy has the same form as Shannon's equation for information. In Mathematics, things that are mathematically equivalent are the same: Entropy = Information. Therefore, systems that are running down are also running up. Greater simplicity in terms of energy means greater complexity in terms of the information that could potentially be extracted from that system.
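To make the equivalence concrete, here is a small sketch of Shannon's formula, H = -Σ p·log₂(p), which has the same mathematical form as Boltzmann's entropy. The example is my own illustration, not anything from the research the post alludes to: a perfectly ordered system (one certain outcome) carries zero information, while a maximally disordered one carries the most.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    # The "+ 0.0" normalizes the -0.0 that arises when the sum is zero.
    return -sum(p * math.log2(p) for p in probs if p > 0) + 0.0

# One certain outcome: total order, zero information.
print(shannon_entropy([1.0]))       # 0.0 bits
# Four equally likely outcomes: maximum disorder, maximum information.
print(shannon_entropy([0.25] * 4))  # 2.0 bits
```

The same quantity that measures disorder in thermodynamics measures information capacity in communication theory, which is the "running down is also running up" point above.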

This has a great deal of support from recent research on noise in the measurement of periodic phenomena. If you are measuring a repeating phenomenon, a heartbeat, or the price of cotton stocks, occasionally values sneak into the data set that are so confusing as to appear random. Played over a speaker system they would sound like noise, a hiss or a howl, like feedback or the electric hum from alternating current. There is now an equation that lets one compare the "relative entropy" of the noisy bits with the "relative entropy" of the other, clearly defined sections of data. If the relative entropy of the noisy bit is high, there is information hidden in that data set of which we are not yet aware; using more sensitive equipment may extract it. If the relative entropy of the noisy bit is low, then it truly is random fluctuation, or a decoupling of the measured phenomenon from the main function of the system.
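The comparison described above can be sketched with the Kullback–Leibler divergence, the standard formula usually called "relative entropy." The function name, the binning scheme, and the toy signal below are my own illustrative choices, not the specific equation or data the research used: a noisy stretch of a signal is compared against a clean baseline, and a larger divergence means the stretch departs more from the expected pattern.

```python
import math
import random
from collections import Counter

def relative_entropy(sample, reference, bins=8, lo=-1.0, hi=1.0):
    """KL divergence D(P || Q) in bits between binned empirical distributions."""
    def histogram(values):
        # Assign each value to a bin, clamping out-of-range values to the edges.
        counts = Counter(min(bins - 1, max(0, int((v - lo) / (hi - lo) * bins)))
                         for v in values)
        # A tiny floor keeps empty bins from producing log(0).
        return [(counts.get(b, 0) + 1e-9) / len(values) for b in range(bins)]
    p, q = histogram(sample), histogram(reference)
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))

# A clean periodic signal versus a noisy stretch of the same signal:
random.seed(0)
clean = [math.sin(0.1 * t) for t in range(500)]
noisy = [math.sin(0.1 * t) + random.gauss(0, 0.5) for t in range(500)]
# The noisy stretch diverges from the baseline; identical data gives zero.
print(relative_entropy(noisy, clean))
```

The design choice here is to compare binned histograms rather than raw samples, since relative entropy is defined between probability distributions, not individual measurements.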

So, if our universe is undergoing entropic run-down because of an accumulation of information, where is the read-head, and who is watching the monitor??

And another thing: If we could reduce entropy throughout the aging universe to make it younger, wouldn't that make it dumber, too??
