Regulation - Complexity sidebar

Law Bringer
Member # 6785
Profile #0
In order to keep the main topic on track, I'm copying Khoth's question on complexity to a new topic.

quote:
Originally written by Khoth:

Another specified complexity one. Which has more specified complexity, this:
3.141592653589793238462643383279502884197 16939937510582097494459230781640628620899862 803482534211706798214808651328230664709384 4609550582231725359408128481117450284102701 93852110555964462294895493038196442881097566 593344 612847564823378678316527120190914564856692346 03486104543266482133936072602491412737245870 0660631558817488152092096282925409171536436789 259036001133053054882046652138414695194151160 943305727036575959195309218611738193261179310 511854807446237996274956735188575272489122793 8183011949129833673362440656643086021394946395 224737190702179860943702770539217176293176752 38467481846766940513200056812714526356082778 57713427577896091736371787214684409012249534 301465495853710507922796892589235420199561121 2902196086403441815981362977477130996051870721 134999999837297804995105973173281609631859502 445945534690830264252230825334468503526193118 817101000313783875288658753320838142061717766 914730359825349042875546873115956286388235378 7593751957781857780532171226806613001927876611 19590921642019893809525720106548586327886593 6153381827968230301952035301852968995773622599 4138912497217752834791315155748572424541506959 50829533116861727855889075098381754637464939 319255060400927701671139009848824012858361603 563707660104710181942955596198946767837449448 2553797747268471040475346462080466842590694912

or this:

7179374994471968932899819554816509347394525412 7897953005543093323663420161354191224618245794 5007991881043142450052843980321551253470997289 08337787822 1452634601919553992833499234084118015045566334 7207398264725493323946042572726841525706874767 1480326433365462983652496104220542264765816 9930380986367279584737729158237504445599549967 3935890135441162676036522462453471222180512823 8030944572838229345710519033708369430073078 2957596813133954163355769893310165170024984599 9926008776073157327085886645037337014916101092 6685051032677031341935263881271161550994185 5132916732729597370580376499918629949769488217 9136093728937310110004152065233725898040520753 7917909073806182959359004022851842383812350 854901155321642848267232175037499812455181836 687209045641934788441288081790681649882294777 082853137133428565003710783670455167198091980 198024025183023450260794069430642445176697756 856066414749642060384938271218294804580887536 731538047282043934810459189799440255089320969 830516511303719972227279758102078724051956161 435739055431526483390933577220630487436045775 906953809191043509754455794535903587883653103 354336369471326087739253186376399712366710503 334367975814439373214539827385171822754427590 771117545492900959901649804179655816938459085 943008059569590382707234252538409682144688133 876300718646507084898432449192042888276258976 2781037270784587859497626087194

(To give you a clue, the first one is the first few digits of pi, and the second one is random.)

If memory serves me, the first million digits of pi have a very nearly uniform distribution of the digits 0 through 9. So both strings of numbers should appear to be almost completely random. I'm not in the mood to count and verify for these strings, but they seem to have a fairly uniform distribution.

So if you don't know the significance of the first string, they would appear to have the same complexity of information.
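For anyone who is in the mood to count, a few lines of Python will do it. This is just a sketch: you'd paste either full string in place of the truncated one below.

from collections import Counter

def digit_counts(s):
    """Tally digit frequencies, ignoring spaces and the decimal point."""
    digits = [c for c in s if c.isdigit()]
    return Counter(digits), len(digits)

counts, total = digit_counts("3.141592653589793238462643383279502884197...")  # paste the full string here
for d in sorted(counts):
    print(d, counts[d], round(counts[d] / total, 3))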
Posts: 4643 | Registered: Friday, February 10 2006 08:00
Electric Sheep One
Member # 3431
Profile #1
Maybe this sidebar is also a good place for my little disquisition on entropy, which is in some ways related to complexity, but not really.

There are two basic definitions of entropy: thermodynamic entropy and statistical mechanical entropy. They are supposed to be the same, but as you will see this is not obvious.

Thermodynamic entropy:
Put some number of Joules of heat into some object, while keeping its temperature constant. The ratio of heat to absolute temperature (i.e. temperature measured on a scale where zero is absolute zero), in Joules per Kelvin, is the amount by which you have raised the object's entropy. If you can't manage to heat the object without changing its temperature (sometimes you can), just compute the entropy change for a very short time, over which the temperature changes negligibly. Then add up the entropy changes over a succession of these short stages, each with a slightly different temperature. The result is an integral of heat over temperature, and that gives the entropy change for the general case of varying temperature.
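In symbols, that is just the standard statement

  \Delta S = \int \frac{\delta Q}{T},

where \delta Q is the bit of heat added in each short stage and T is the absolute temperature during that stage.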

Only changes in entropy are defined in this way, not absolute entropy. So we fix absolute entropy by the convention that a perfect crystal at absolute zero should have zero entropy.

This is the entropy of thermodynamics. Exhaustive experience over the past two centuries has been formalized in the empirical law that it will never decrease in any closed system. It does not obviously have anything to do with complexity or disorder or anything like that. It's about heat and temperature, as empirically measurable.

Statistical mechanical entropy:
This is a purely theoretical concept of entropy, as a property of probability distributions. If I have N equally likely states, the entropy of this probability distribution is log(N) (the logarithm of N -- in physics, usually to base e). This can be generalized nicely to the case with arbitrary unequal probabilities, but this is not necessary for qualitative understanding.

The statistical mechanical entropy is naturally a pure number; when we want to relate it to thermodynamic entropy, we multiply it by a universal constant which has units of Joules per Kelvin (Boltzmann's constant).
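Putting the last two paragraphs into formulas (nothing here beyond the standard definitions): for N equally likely states,

  S = k_B \ln N,

and for general probabilities p_i,

  S = -k_B \sum_i p_i \ln p_i,

which reduces to k_B \ln N when every p_i = 1/N.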

One might well ask, How on earth do probability distributions ever enter physics, in which everything is deterministic? Well, physics deals with large systems, for instance samples of gas containing zillions of molecules. And if even a small system is followed over a period of time, the sequence of its instantaneous states makes up a very large and possibly complicated set. Precise description of such large systems or histories would take too much work, so we use probability concepts as a way of crudely characterizing some of their large scale properties.

We look at a history as a set of instantaneous states, or at the instantaneous state of a large system as the set of all the states of the parts of the large system. We then try to infer a probability distribution that might yield this set as a typical sample. We compute the entropy of this distribution, multiply by Boltzmann's constant, and hope the result matches the thermodynamic entropy of the system.

In all known cases in which the statistical mechanical entropy can be unambiguously computed, it does. But as you might imagine from the above explanation, the step of inferring a distribution from a sample may be problematic; and even before this step, there is in general a lot of room for ambiguity in simply deciding how to relate histories to sets of instants, or large systems to sets of subsystems. In reality the successive instants of a history are not uncorrelated samples from a distribution, but are all determined strictly from the first instant, through the laws of physics. And the subsystems of a larger system interact with each other, so they are not independent either. Justifying the statistical approach in spite of these basic facts is a tricky business, and a subject of active (though not very productive) research.

So that's the state of the science of entropy. What does it mean for applying entropy beyond physics, say in evolutionary biology? Mostly, it means that it's very hard.

The Second Law really applies to thermodynamic entropy. There are no known valid proofs of it for statistical mechanical entropy. And since an encyclopedia and a book of random characters will clearly burn pretty much the same, it is hard to claim that anything like 'meaning' or 'complexity' can have any real significance for thermodynamic entropy. From the statistical mechanical point of view, both books are made of bazillions of molecules, which can be arranged in vastly more ways than the printed characters can; the range of different states of the printed characters, although huge, is comparatively tiny. Similar considerations will make it very difficult to draw any conclusions from entropy about biochemical evolution.

[ Tuesday, May 08, 2007 01:08: Message edited by: Student of Trinity ]

--------------------
We're not doing cool. We're doing pretty.
Posts: 3335 | Registered: Thursday, September 4 2003 07:00
Off With Their Heads
Member # 4045
Profile Homepage #2
quote:
Originally written by Randomizer:

So if you don't know the significance of the first string, they would appear to have the same complexity of information.
I think the point that both Stareye and Khoth were making in the other thread is that this makes "complexity" ill-defined. The second string of numbers might be my favorite 1350-digit string, in which case the seemingly random set is no longer random to me, but it's random to everyone who doesn't know that.

This makes something "complex" if some importance can be assigned to its order, but "some importance" is completely arbitrary and does not make for a good mathematical principle.
quote:
Originally written by Student of Trinity:

How on earth do probability distributions ever enter physics, in which everything is deterministic?
This at first sounds a little funny, because so much of modern physics is probabilistic, not deterministic, but I suppose you're talking about a classical description as much as anything, and in classical physics, everything is deterministic.

--------------------
Arancaytar: Every time you ask people to compare TM and Kel, you endanger the poor, fluffy kittens.
Smoo: Get ready to face the walls!
Ephesos: In conclusion, yarr.

Kelandon's Pink and Pretty Page!!: the authorized location for all things by me
The Archive of all released BoE scenarios ever
Posts: 7968 | Registered: Saturday, February 28 2004 08:00
Electric Sheep One
Member # 3431
Profile #3
Quantum mechanics is also deterministic, in the sense that the quantum state evolves deterministically under the Schrödinger equation. Some element of randomness certainly seems to enter in quantum measurement, but it is pretty clear that nobody is doing any measurements on atom positions inside a steam engine, or anything like that. So the quantum version of entropy, the von Neumann entropy, is zero for any single quantum state (no matter how elaborate a superposition it may be in any particular basis).
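For reference, the standard definition: the von Neumann entropy of a density matrix \rho is

  S = -k_B \, \mathrm{Tr}(\rho \ln \rho),

which is zero whenever \rho = |\psi\rangle\langle\psi| describes a single pure state, however elaborate the superposition |\psi\rangle may be.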

Only when one considers probability distributions over quantum states, which is by no means required by quantum mechanics per se, does one find non-zero entropy. Probability distributions over quantum states arise in the same ways I described above: we consider a historical sequence of states as a sample set, or consider the set of parts of a larger system as a sample.

Quantum mechanics adds two things to the classical picture of entropy. Firstly, it means that in any finite system one has only a countable number of distinct states to consider. This means that when computing entropy we can use true probabilities, rather than probability densities. (When Boltzmann created statistical mechanics, in the late 19th century, he arbitrarily introduced finite-volume 'cells' in phase space, and considered the probability that the system should be anywhere within a given cell. In effect, quantum mechanics gives us natural phase space cells, instead of arbitrary ones.)

Secondly, quantum mechanics includes the peculiar, non-classical kind of correlation known as 'entanglement'. This means that even when a large system is in a single specific quantum state, to describe any of its subsystems alone may require a probabilistic distribution (not a superposition) of states. This is a source of probability with no classical counterpart; but it has not yet to my knowledge been used to add anything to the foundations of statistical mechanics. That's one of the things I'm supposed to be doing, in fact.

[ Tuesday, May 08, 2007 06:56: Message edited by: Student of Trinity ]

--------------------
We're not doing cool. We're doing pretty.
Posts: 3335 | Registered: Thursday, September 4 2003 07:00
Post Navel Trauma ^_^
Member # 67
Profile Homepage #4
Where I was heading with the complexity of the two numbers was that pi has much less complexity than the random number. Stillness brought in the idea of the complexity of something being roughly the length of a program which generates it. A pretty small program can generate a huge amount of pi, but there isn't a way to generate my random number that's much better than "print 71793...". Sure, a short program can generate some random number, but such a program isn't any more likely to generate that than it is to generate the same number of digits of pi.
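To make the contrast concrete, here is a minimal sketch in Python (Machin's formula; the code and names are mine, purely for illustration). A dozen or so lines will print as many digits of pi as you like, whereas the best "program" for the random number really is just print("71793..."), roughly as long as the number itself.

def pi_digits(n):
    """Return pi to n decimal places, via Machin's formula
    pi = 16*arctan(1/5) - 4*arctan(1/239), using scaled integers."""
    scale = 10 ** (n + 10)                # keep 10 extra guard digits
    def arctan_inv(x):                    # arctan(1/x), scaled by 'scale'
        total = term = scale // x
        k, sign = 3, -1
        while term:
            term = scale // x ** k
            total += sign * (term // k)
            sign, k = -sign, k + 2
        return total
    pi_scaled = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    s = str(pi_scaled // 10 ** 10)        # drop the guard digits
    return s[0] + "." + s[1:]

print(pi_digits(1000))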

--------------------
Barcoorah: I even did it to a big dorset ram.

New Mac BoE
Posts: 1798 | Registered: Thursday, October 4 2001 07:00
Lifecrafter
Member # 7723
Profile #5
Your number is complex but what specific information does it convey? It's just random. I really think you missed the point. Living things have high information content beyond encyclopedic scale. I'll repost and see if you get the difference.

quote:
Originally quoted by Stillness:
For my sake please think of [specified complexity] as qualitative instead of quantitative. It is actually a term coined by evolutionary origin of life researcher Leslie Orgel and hijacked by the good guys.

“Living things are distinguished by their specified complexity. Crystals such as granite fail to qualify as living because they lack complexity; mixtures of random polymers fail to qualify because they lack specificity.” L. Orgel, The Origins of Life (New York: John Wiley, 1973), p. 189

It can be quantitative and is measured based on [high] information content. Here is an explanation that makes me understand it: View it as the shortest algorithm you’d have to write to generate a particular arrangement.

Italics mine. I really don't have much more to say on this.
Posts: 701 | Registered: Thursday, November 30 2006 08:00
Post Navel Trauma ^_^
Member # 67
Profile Homepage #6
quote:
Originally written by Stillness, in the other thread:

It can be quantitative and is measured based on information content. Here is an explanation that makes me understand it: View it as the shortest algorithm you’d have to write to generate a particular arrangement.
All I was doing was talking about the term you introduced, using the method of calculation that you said was the correct one. Now, apparently, that's not how you calculate it. So how should I calculate it (note 'calculate', not 'make assertions about')?

--------------------
Barcoorah: I even did it to a big dorset ram.

New Mac BoE
Posts: 1798 | Registered: Thursday, October 4 2001 07:00
The Establishment
Member # 6
Profile #7
Stillness, science doesn't work on quotes. So what if a scientist said it? It's irrelevant unless a quantitative and hence useful definition is given.

Again, what are the units of complexity? How do we calculate/measure it?

--------------------
Your flower power is no match for my glower power!
Posts: 3726 | Registered: Tuesday, September 18 2001 07:00
Lifecrafter
Member # 7723
Profile #8
Since you insist, the same question was asked at How Is Information Content Measured
Posts: 701 | Registered: Thursday, November 30 2006 08:00
Post Navel Trauma ^_^
Member # 67
Profile Homepage #9
I notice that although equations are provided, they aren't actually used to calculate anything for a specific example. This doesn't surprise me, because the equations don't, fundamentally, make sense.

Basically, for enzyme specificity, he imagines that there are n possible things the enzyme could act on, does some kind of relative strength-of-action thing that looks like entropy, and then subtracts that from the 'entropy' of the input of all n equally likely things at the beginning.

This gets, for an enzyme that acts on only one substance, log n.
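Spelling that out, under my reading that the strength-of-action term is an entropy over the n candidate substrates:

  H_in = log n,   H_out = 0,   so the claimed "specified complexity" is log n - 0 = log n.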

However, n is not defined. Given that there are an infinite number of things the enzyme might work on but doesn't, the "specified complexity" of any enzyme (or other catalyst) is infinite. That's not useful.

Sorry, try again.

--------------------
Barcoorah: I even did it to a big dorset ram.

New Mac BoE
Posts: 1798 | Registered: Thursday, October 4 2001 07:00
Electric Sheep One
Member # 3431
Profile #10
I think I can flesh out Khoth's objection a bit.

The quoted formula is part of a discussion of experiments in which bacteria evolved the ability to live on an artificial sugar. It is presented as important that the new enzymatic activity appears not as a shift from specificity for the natural sugar to specificity for the new one, but as a general broadening of the range of sugars on which the enzyme acts.

(Aside: This seems rather banal to me. Switching over while keeping an enzyme highly specific would be the very kind of 'magic' that evolution does not do. Broadening it, and then later narrowing it further — a loss of capability, like cave fish losing eyes — if narrowness ever became advantageous, is the obvious gradual track that could be followed.)

But here is the problem I see with that analysis. It certainly seems to me that the bacteria have done something remarkable. A strain has emerged that can live exclusively on a sugar that the original population could not survive on. And the criticism mounted by the ID advocates is something like, 'This seemingly clear example of evolution does not actually break our rule about specified complexity decrease, because these new bacteria could also live on several other things as well.'

But now consider: birds can fly in air. They can also fly in helium, pure carbon dioxide, neon, and many other gaseous mixtures. A land-based animal cannot fly in any of these atmospheres. Evolving the capability of flight is thus not an increase in specified complexity, but a decrease.

I think maybe Thuryl made this point also, some time ago in the late lamented thread. Any change can be counted as a gain or as a loss, of something, depending on how you look at it. Losing sight is gaining specificity in adaptation to a lightless environment.

In other words, it seems to me that if you look at the 'range of possibilities' in the right way, any change whatever could be presented as having decreased specified complexity. Specified complexity decrease thus does not seem itself to be adequately specified.

--------------------
We're not doing cool. We're doing pretty.
Posts: 3335 | Registered: Thursday, September 4 2003 07:00
Lifecrafter
Member # 7723
Profile #11
Calculating information is not a creationist idea. On the last thread Schrodinger, an origin of life researcher, did it. If you like, use your own means of calculating information, because that's what it's based off of.

It's clear to me from a qualitative perspective though that changes that if continued would result in a bacteria that has no response in any substrate or a fish that has no functionality can't be qualified as anything but loss. It makes it better in some environments, but it's not the kind of change in complexity that gets one from bacteria to biologist.

I don't see how gaining the ability to fly would be a decrease. That is exactly the opposite kind of change to loss of eyes. Successive changes of that sort would seem to get us to where we are given enough time. Even if your flying organism lost something like the capability to run well, it'd still be good. We can't do everything bacteria can do, but we're definitely more complex.
Posts: 701 | Registered: Thursday, November 30 2006 08:00
The Establishment
Member # 6
Profile #12
No one is disputing that information/entropy can be calculated; that is a concept from statistical mechanics. The controversial part is the claims the author makes using those equations. The definition of "specified complexity" is still quite vague and not quantified.

--------------------
Your flower power is no match for my glower power!
Posts: 3726 | Registered: Tuesday, September 18 2001 07:00
Electric Sheep One
Member # 3431
Profile #13
Stillness, you speak as though specified complexity were obviously a meaningful concept, and so we information theory professionals ought to be able to quantify it well enough if we just tried for ourselves. But what we are telling you is that it really does not look like a meaningful concept, and so we do not know how to quantify it. The proposed quantification you cited is clearly inadequate.

quote:
Originally written by Stillness:

[C]hanges that if continued would result in a bacteria that has no response in any substrate ... can't be qualified as anything but loss.
These bacteria have somehow learned to live on artificial sugar, and you call it a loss because it is a change which 'if continued' would leave them unable to eat anything? This sure seems like taking the specified complexity theory over common sense. But then when I push the complexity picture for birds, you revert to common sense in assessing gain and loss. Doesn't this suggest that the specified complexity idea is kind of flawed?

--------------------
We're not doing cool. We're doing pretty.
Posts: 3335 | Registered: Thursday, September 4 2003 07:00
Lifecrafter
Member # 7723
Profile #14
If you don't get it I'm not of much help quantitatively. My original quote was meant as I understand it - qualitatively. Life has a quality that distinguishes it from simply random or ordered phenomena. I imagine this is how Leslie Orgel meant it. You all are the ones that brought up quantitative measure when it really is not necessary for the point.

And explicit, all-inclusive, concise definitions are notoriously difficult. (Try defining "life," "science," or "species" in a way that includes everything belonging to those categories but nothing that does not.) When I try to give them they are never satisfactory for you. The concept is clear enough for anybody who wants to see it, though. Whether you agree with it or not is entirely up to you.

I know I joke around a lot, but I'm dead serious here. I can’t remember who was saying that this discussion was pointless, but I thoroughly disagree. I’m glad I didn’t quit sooner. The last page of the other thread was especially educational for me. I got some very honest and refreshing responses from Kel and SoT, in particular. To hear and read and come to conclusions is one thing, but to actually be part of the experience is another. I'm sad it got lost, but what needed to be said was said in my eyes.

Since you all were honest with me I'll be honest as well. My religious beliefs are not subject to scientific reasoning or observable reality either. They can be enhanced or accented by them, but not torn down. So, if mutations somehow caused "irreducibly complex" systems to arise I'd be shocked, but I would accept it. It's not really a religious belief. It would change my view of creation, but never of the Creator. The latter is beyond science.
Posts: 701 | Registered: Thursday, November 30 2006 08:00
Lifecrafter
Member # 7723
Profile #15
SoT, your bird analogy is flawed. Animals can run in all of those environments as well. In what way does that fit?
Posts: 701 | Registered: Thursday, November 30 2006 08:00
Lifecrafter
Member # 7723
Profile #16
quote:
Originally written by Student of Trinity:

These bacteria have somehow learned to live on artificial sugar, and you call it a loss because it is a change which 'if continued' would leave them unable to eat anything? This sure seems like taking the specified complexity theory over common sense.
My original use of specified complexity had nothing to do with comparisons of living things. This is why I hate posting links. I was originally comparing living things to non-living things that occur in nature. This sidetracked us to a discussion we already had on the other thread.
Posts: 701 | Registered: Thursday, November 30 2006 08:00
Off With Their Heads
Member # 4045
Profile Homepage #17
quote:
Originally written by Stillness:

If you don't get it I'm not of much help quantitatively. My original quote was meant as I understand it - qualitatively. Life has a quality that distinguishes it from simply random or ordered phenomena. I imagine this is how Leslie Orgel meant it. You all are the ones that brought up quantitative measure when it really is not necessary for the point.
Saying that life has specified complexity and other things don't is completely meaningless unless you can give a good definition of specified complexity. I suppose a qualitative definition would be sorta okay, as long as it didn't contradict itself, but your examples seem to contradict themselves.

quote:
And explicit all-inclusive concise definitions are notoriously difficult.
True, but that doesn't make it any less important. Scientists do give good definitions of things in real science.

--------------------
Arancaytar: Every time you ask people to compare TM and Kel, you endanger the poor, fluffy kittens.
Smoo: Get ready to face the walls!
Ephesos: In conclusion, yarr.

Kelandon's Pink and Pretty Page!!: the authorized location for all things by me
The Archive of all released BoE scenarios ever
Posts: 7968 | Registered: Saturday, February 28 2004 08:00
Raven v. Writing Desk
Member # 261
Profile Homepage #18
"How do you measure specified complexity?"

I think it would help, Stillness, if you just explained in your own words. You don't need to give lots of details, and it's totally OK if you want to measure it qualitatively rather than quantitatively. But "qualitative" is a quality of a measurement; it doesn't actually explain how it is measured. That's what needs explaining.

If you mean to define it the way the Supreme Court defined indecency -- "I know it when I see it" (but can't explain how I make that distinction) -- then just come out and say so.

--------------------
Slarty vs. Desk | Desk vs. Slarty | Timeline of Ermarian | G4 Strategy Central
Posts: 3560 | Registered: Wednesday, November 7 2001 08:00
Off With Their Heads
Member # 4045
Profile Homepage #19
To be fair, it was a concurring opinion, and Potter Stewart was failing to define "hard-core pornography," and the court came up with a better definition less than a decade later (as is detailed in the relevant article). So even the Supreme Court can define things, given enough time. :P

--------------------
Arancaytar: Every time you ask people to compare TM and Kel, you endanger the poor, fluffy kittens.
Smoo: Get ready to face the walls!
Ephesos: In conclusion, yarr.

Kelandon's Pink and Pretty Page!!: the authorized location for all things by me
The Archive of all released BoE scenarios ever
Posts: 7968 | Registered: Saturday, February 28 2004 08:00
Law Bringer
Member # 335
Profile Homepage #20
I think the last paragraph of the link you posted is the most telling. Increases of "specified complexity" are possible in principle but infrequent in practice. This is absolutely true and nobody, IDer or neo-Darwinist, disagrees. The leap to "unlikely means it didn't happen" is where we disagree.

—Alorael, who thinks that the qualitative definition of specified complexity that seems to be coming up is utility. Doing something is more useful than doing nothing. Doing something poorly is less useful than doing it well. Argument again comes up in border cases: is doing ten things poorly better or worse than doing only one thing extremely well?
Posts: 14579 | Registered: Saturday, December 1 2001 08:00
Lifecrafter
Member # 7723
Profile #21
Alo, I have stated and restated for so long that my argument is that purposeful action is a better explanation for what we see. I have constantly used observable reality as a basis for determining what happened in the past. If I have used terms like "impossible" and "never" it certainly wasn't intended to be the strength of my argument.

Specified complexity is self-defining. It describes patterns that are specified and complex. If you want different words call it the quality or state of having aperiodic and nonrandom form.
Posts: 701 | Registered: Thursday, November 30 2006 08:00
Off With Their Heads
Member # 4045
Profile Homepage #22
I'm confused. What on earth is your actual argument? You've conceded that forward evolution can and does occur, you've conceded that evolution can and does account for at least some of the variation in life that we see, and now you've conceded that irreducible complexity is not actually irreducible?

What is left of the claims you've been making for twenty pages? If I ask you why you think design is a better explanation than evolution, what possible evidence can you have left?

And your first definition is the most shamelessly circular I have ever seen. Your second is fine, I guess, but non-periodic and non-random forms can arise without the intervention of an intelligence, as we've discussed over and over again.

[ Thursday, May 10, 2007 12:15: Message edited by: Kelandon ]

--------------------
Arancaytar: Every time you ask people to compare TM and Kel, you endanger the poor, fluffy kittens.
Smoo: Get ready to face the walls!
Ephesos: In conclusion, yarr.

Kelandon's Pink and Pretty Page!!: the authorized location for all things by me
The Archive of all released BoE scenarios ever
Posts: 7968 | Registered: Saturday, February 28 2004 08:00
Lifecrafter
Member # 7723
Profile #23
Now I see the problem! My argument has never changed (except for me dropping the thermodynamic part). An irreducibly complex system has several well-matched parts that work together to perform a function, such that if any one is removed the system fails. In living things the parts offer no value by themselves, but only as part of the whole. This says nothing of impossibility. My argument is that we don't see nature make systems like this, but we do see purposeful action create them. Mechanisms of this sort are indicative of planning. Therefore purposeful action is a better explanation.
Posts: 701 | Registered: Thursday, November 30 2006 08:00
The Establishment
Member # 6
Profile #24
We don't see nature make systems like this (as you define them above) because our observation time is dramatically limited. The only real conclusion we can draw is that nature does not do this on the time scales we observe. This does not, however, preclude significantly longer time scales.

Is design better? Well, that is difficult, because our experience with design is with things that occur rapidly, on a human time scale. If you could show very rapid changes (on the order of decades, although I'll even take centuries or a few millennia) occurring early on in the history of Earth, you would have a stronger case for design.

--------------------
Your flower power is no match for my glower power!
Posts: 3726 | Registered: Tuesday, September 18 2001 07:00
