Regulation - Complexity sidebar
Law Bringer
Member # 6785
written Monday, May 7 2007 14:00
In order to keep the main topic on track, I'm copying Khoth's question on complexity to a new topic.

quote: If memory serves me, the first million digits of pi have a completely uniform distribution of the digits 0 to 9. So both strings of numbers should appear to be almost completely random.

I'm not in the mood to count and verify for these strings, but they seem to have a fairly uniform distribution. So if you don't know the significance of the first string, the two would appear to have the same complexity of information.

Posts: 4643 | Registered: Friday, February 10 2006 08:00
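(A concrete check of the uniformity claim: a minimal Python sketch that tallies digit frequencies in any digit string. The short pi prefix here is just a stand-in; the thread's two 1350-digit strings would be dropped in the same way.)

    from collections import Counter

    def digit_frequencies(digits: str) -> dict:
        """Count how often each digit 0-9 appears in a string of digits."""
        counts = Counter(digits)
        return {d: counts.get(d, 0) for d in "0123456789"}

    # A roughly uniform string of length n has each digit appearing ~n/10 times.
    pi_prefix = "14159265358979323846264338327950288419716939937510"
    print(digit_frequencies(pi_prefix))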
Electric Sheep One
Member # 3431
written Tuesday, May 8 2007 01:06
Maybe this sidebar is also a good place for my little disquisition on entropy, which is in some ways related to complexity, but not really. There are two basic definitions of entropy: thermodynamic entropy and statistical mechanical entropy. They are supposed to be the same, but as you will see this is not obvious.

Thermodynamic entropy: Put some number of Joules of heat into some object, while keeping its temperature constant. The ratio of heat to absolute temperature (i.e. temperature measured on a scale where zero is absolute zero), in Joules per Kelvin, is the amount by which you have raised the object's entropy. If you can't manage to heat the object without changing its temperature (sometimes you can), just compute the entropy change for a very short time, over which the temperature changes negligibly. Then add up the entropy changes over a succession of these short stages, each with a slightly different temperature. The result is an integral of heat over temperature, and that gives the entropy change for the general case of varying temperature. Only changes in entropy are defined in this way, not absolute entropy. So we fix absolute entropy by the convention that a perfect crystal at absolute zero should have zero entropy.

This is the entropy of thermodynamics. Exhaustive experience over the past two centuries has been formalized in the empirical law that it will never decrease in any closed system. It does not obviously have anything to do with complexity or disorder or anything like that. It's about heat and temperature, as empirically measurable.

Statistical mechanical entropy: This is a purely theoretical concept of entropy, as a property of probability distributions. If I have N equally likely states, the entropy of this probability distribution is log(N) (the logarithm of N -- in physics, usually to base e). This can be generalized nicely to the case with arbitrary unequal probabilities, but this is not necessary for qualitative understanding. The statistical mechanical entropy is naturally a pure number; when we want to relate it to thermodynamic entropy, we multiply it by a universal constant which has units of Joules per Kelvin (Boltzmann's constant).

One might well ask: how on earth do probability distributions ever enter physics, in which everything is deterministic? Well, physics deals with large systems, for instance samples of gas containing zillions of molecules. And if even a small system is followed over a period of time, the sequence of its instantaneous states makes up a very large and possibly complicated set. Precise description of such large systems or histories would take too much work, so we use probability concepts as a way of crudely characterizing some of their large-scale properties. We look at a history as a set of instantaneous states, or at the instantaneous state of a large system as the set of all the states of the parts of the large system. We then try to infer a probability distribution that might yield this set as a typical sample. We compute the entropy of this distribution, multiply by Boltzmann's constant, and hope the result matches the thermodynamic entropy of the system. In all known cases in which the statistical mechanical entropy can be unambiguously computed, it does.
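(As a concrete aside: a minimal Python sketch, assuming natural-log units, of the statistical entropy just defined -- log(N) for N equally likely states, the general unequal-probability form, and the conversion to Joules per Kelvin via Boltzmann's constant.)

    import math

    K_B = 1.380649e-23  # Boltzmann's constant, in Joules per Kelvin

    def entropy(probabilities):
        """Statistical entropy -sum(p log p), in natural units."""
        return -sum(p * math.log(p) for p in probabilities if p > 0)

    # N equally likely states: the general formula reduces to log(N).
    N = 6
    uniform = [1.0 / N] * N
    print(entropy(uniform), math.log(N))   # both ~1.7918

    # Multiply by Boltzmann's constant to get thermodynamic units (J/K).
    print(K_B * entropy(uniform))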
But as you might imagine from the above explanation, the step of inferring a distribution from a sample may be problematic; and even before this step, there is in general a lot of room for ambiguity in simply deciding how to relate histories to sets of instants, or large systems to sets of subsystems. In reality the successive instants of a history are not uncorrelated samples from a distribution, but are all determined strictly from the first instant, through the laws of physics. And the subsystems of a larger system interact with each other, so they are not independent either. Justifying the statistical approach in spite of these basic facts is a tricky business, and a subject of active (though not very productive) research.

So that's the state of the science of entropy. What does it mean for applying entropy beyond physics, say in evolutionary biology? Mostly, it means that it's very hard. The Second Law really applies to thermodynamic entropy. There are no known valid proofs of it for statistical mechanical entropy. And since an encyclopedia and a book of random characters will clearly burn pretty much the same, it is hard to claim that anything like 'meaning' or 'complexity' can have any real significance for thermodynamic entropy. From the statistical mechanical point of view, both books are made of bazillions of molecules, which can be arranged in vastly more ways than the printed characters can; the range of different states of the printed characters, although huge, is comparatively tiny. Similar considerations will make it very difficult to draw any conclusions from entropy about biochemical evolution.

[ Tuesday, May 08, 2007 01:08: Message edited by: Student of Trinity ]

--------------------
We're not doing cool. We're doing pretty.

Posts: 3335 | Registered: Thursday, September 4 2003 07:00
Off With Their Heads
Member # 4045
written Tuesday, May 8 2007 06:17
quote: I think the point that both Stareye and Khoth were making in the other thread is that this makes "complexity" ill-defined. The second string of numbers might be my favorite 1350-number string, in which case the seemingly random set is no longer random to me, but it's random to everyone who doesn't know that. This makes something "complex" if some importance can be assigned to its order, but "some importance" is completely arbitrary and does not make for a good mathematical principle.

quote: This at first sounds a little funny, because so much of modern physics is probabilistic, not deterministic, but I suppose you're talking about a classical description as much as anything, and in classical physics, everything is deterministic.

--------------------
Arancaytar: Every time you ask people to compare TM and Kel, you endanger the poor, fluffy kittens.
Smoo: Get ready to face the walls!
Ephesos: In conclusion, yarr.
Kelandon's Pink and Pretty Page!!: the authorized location for all things by me
The Archive of all released BoE scenarios ever

Posts: 7968 | Registered: Saturday, February 28 2004 08:00
Electric Sheep One
Member # 3431
written Tuesday, May 8 2007 06:51
Quantum mechanics is also deterministic, in the sense that the quantum state evolves deterministically under the Schrödinger equation. Some element of randomness certainly seems to enter in quantum measurement, but it is pretty clear that nobody is doing any measurements on atom positions inside a steam engine, or anything like that. So the quantum version of entropy, the von Neumann entropy, is zero for any single quantum state (no matter how elaborate a superposition it may be in any particular basis). Only when one considers probability distributions over quantum states, which is by no means required by quantum mechanics per se, does one find non-zero entropy. Probability distributions over quantum states arise in the same ways I described above: we consider a historical sequence of states as a sample set, or consider the set of parts of a larger system as a sample.

Quantum mechanics adds two things to the classical picture of entropy. Firstly, it means that in any finite system one has only a countable number of distinct states to consider. This means that when computing entropy we can use true probabilities, rather than probability densities. (When Boltzmann created statistical mechanics, in the late 19th century, he arbitrarily introduced finite-volume 'cells' in phase space, and considered the probability that the system should be anywhere within a given cell. In effect, quantum mechanics gives us natural phase space cells, instead of arbitrary ones.)

Secondly, quantum mechanics includes the peculiar, non-classical kind of correlation known as 'entanglement'. This means that even when a large system is in a single specific quantum state, to describe any of its subsystems alone may require a probabilistic distribution (not a superposition) of states. This is a source of probability with no classical counterpart; but it has not yet, to my knowledge, been used to add anything to the foundations of statistical mechanics. That's one of the things I'm supposed to be doing, in fact.

[ Tuesday, May 08, 2007 06:56: Message edited by: Student of Trinity ]

--------------------
We're not doing cool. We're doing pretty.

Posts: 3335 | Registered: Thursday, September 4 2003 07:00
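(A minimal numerical sketch of the entanglement point, assuming Python with NumPy: a Bell pair is one definite quantum state, so its von Neumann entropy is zero, yet either qubit taken alone is a genuine probabilistic mixture with entropy log 2.)

    import numpy as np

    def von_neumann_entropy(rho):
        """S = -Tr(rho log rho), computed from the eigenvalues of rho."""
        eigs = np.linalg.eigvalsh(rho)
        return -sum(p * np.log(p) for p in eigs if p > 1e-12)

    # Bell state (|00> + |11>)/sqrt(2): a single, definite quantum state.
    psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())
    print(von_neumann_entropy(rho))               # ~0: pure state

    # Reduced density matrix of the first qubit: trace out the second.
    rho_a = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
    print(von_neumann_entropy(rho_a), np.log(2))  # both ~0.693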
Post Navel Trauma ^_^
Member # 67
written Tuesday, May 8 2007 09:29
Where I was heading with the complexity of the two numbers was that pi has much less complexity than the random number. Stillness brought in the idea of the complexity of something being roughly the length of a program which generates it. A pretty small program can generate a huge amount of pi, but there isn't a way to generate my random number that's much better than "print 71793...". Sure, a short program can generate some random number, but such a program isn't any more likely to generate that particular number than it is to generate the same number of digits of pi.

--------------------
Barcoorah: I even did it to a big dorset ram.
New Mac BoE

Posts: 1798 | Registered: Thursday, October 4 2001 07:00
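(A sketch of the program-length asymmetry in Python. The spigot shown is Gibbons' unbounded algorithm -- my choice of illustration, not anything from the thread -- and "71793..." stands for Khoth's random string, which has no known generator shorter than quoting itself.)

    from itertools import islice

    def pi_digits():
        """Gibbons' unbounded spigot: yields decimal digits of pi forever."""
        q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
        while True:
            if 4 * q + r - t < n * t:
                yield n
                q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
            else:
                q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                    (q * (7 * k + 2) + r * l) // (t * l), l + 2)

    # A dozen lines emit as much of pi as you like...
    print("".join(str(d) for d in islice(pi_digits(), 20)))  # 31415926535897932384

    # ...but for a random string the best known "program" is the data itself:
    print("71793...")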
Lifecrafter
Member # 7723
written Wednesday, May 9 2007 05:09
Your number is complex, but what specific information does it convey? It's just random. I really think you missed the point. Living things have high information content, beyond encyclopedic scale. I'll repost and see if you get the difference.

quote: Italics mine.

I really don't have much more to say on this.

Posts: 701 | Registered: Thursday, November 30 2006 08:00
Post Navel Trauma ^_^
Member # 67
written Wednesday, May 9 2007 08:19
quote: All I was doing was talking about the term you introduced, using the method of calculation that you said was the correct one. Now, apparently, that's not how you calculate it. So how should I calculate it (note 'calculate', not 'make assertions about')?

--------------------
Barcoorah: I even did it to a big dorset ram.
New Mac BoE

Posts: 1798 | Registered: Thursday, October 4 2001 07:00
The Establishment
Member # 6
written Wednesday, May 9 2007 13:33
Stillness, science doesn't work on quotes. So what if a scientist said it? It's irrelevant unless a quantitative and hence useful definition is given. Again, what are the units of complexity? How do we calculate/measure it?

--------------------
Your flower power is no match for my glower power!

Posts: 3726 | Registered: Tuesday, September 18 2001 07:00
Lifecrafter
Member # 7723
written Wednesday, May 9 2007 15:13
Since you insist, the same question was asked at How Is Information Content Measured.

Posts: 701 | Registered: Thursday, November 30 2006 08:00
Post Navel Trauma ^_^
Member # 67
written Wednesday, May 9 2007 22:18
I notice that although equations are provided, they aren't actually used to calculate anything for a specific example. This doesn't surprise me, because the equations don't, fundamentally, make sense.

Basically, for enzyme specificity, he imagines that there are n possible things the enzyme could act on, does some kind of relative strength-of-action calculation that looks like entropy, and then subtracts that from the 'entropy' of the input of all n equally likely things at the beginning. This gives, for an enzyme that acts on only one substance, log n. However, n is not defined. Given that there are an infinite number of things the enzyme might work on but doesn't, the "specified complexity" of any enzyme (or other catalyst) is infinite. That's not useful. Sorry, try again.

--------------------
Barcoorah: I even did it to a big dorset ram.
New Mac BoE

Posts: 1798 | Registered: Thursday, October 4 2001 07:00
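(A Python sketch of this objection, paraphrasing the formula as described above rather than the linked page's actual code: for a perfectly specific enzyme the measure collapses to log n, so the result is fixed entirely by the undefined choice of n and grows without bound as never-used substrates are counted in.)

    import math

    def specified_complexity(action_strengths):
        """log(n) for n equally likely inputs, minus the 'entropy' of the
        enzyme's normalized strengths of action on each input."""
        n = len(action_strengths)
        total = sum(action_strengths)
        probs = [s / total for s in action_strengths]
        h_action = -sum(p * math.log(p) for p in probs if p > 0)
        return math.log(n) - h_action

    # An enzyme acting on exactly 1 of n candidate substrates scores log n,
    # which diverges as the candidate list is padded out.
    for n in (10, 1000, 10**6):
        print(n, specified_complexity([1.0] + [0.0] * (n - 1)))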
Electric Sheep One
Member # 3431
written Thursday, May 10 2007 00:02
I think I can flesh out Khoth's objection a bit. The quoted formula is part of a discussion of experiments in which bacteria evolved the ability to live on an artificial sugar. It is presented as important that the new enzymatic activity appears not as a shift from specificity for the natural sugar to specificity for the new sugar, but as a general broadening of the range of sugars on which the enzyme acts.

(Aside: This seems rather banal to me. Switching over while keeping the enzyme highly specific would be the very kind of 'magic' that evolution does not do. Broadening it, and then later narrowing it further (a loss of capability, like cave fish losing eyes) if narrowness ever became advantageous, is the obvious gradual track that could be followed.)

But here is the problem I see with that analysis. It certainly seems to me that the bacteria have done something remarkable. A strain has emerged that can live exclusively on a sugar that the original population could not survive on. And the criticism mounted by the ID advocates is something like, 'This seemingly clear example of evolution does not actually break our rule about specified complexity decrease, because these new bacteria could also live on several other things as well.'

But now consider: birds can fly in air. They can also fly in helium, pure carbon dioxide, neon, and many other gaseous mixtures. A land-based animal cannot fly in any of these atmospheres. Evolving the capability of flight is thus not an increase in specified complexity, but a decrease. I think maybe Thuryl made this point also, some time ago in the late lamented thread.

Any change can be counted as a gain or as a loss of something, depending on how you look at it. Losing sight is gaining specificity in adaptation to a lightless environment. In other words, it seems to me that if you look at the 'range of possibilities' in the right way, any change whatever could be presented as having decreased specified complexity. Specified complexity decrease thus does not seem itself to be adequately specified.

--------------------
We're not doing cool. We're doing pretty.

Posts: 3335 | Registered: Thursday, September 4 2003 07:00
Lifecrafter
Member # 7723
written Thursday, May 10 2007 03:14
Calculating information is not a creationist idea. On the last thread Schrodinger, an origin of life researcher, did it. If you like, use your own means of calculating information, because that's what it's based on.

It's clear to me from a qualitative perspective, though, that changes which, if continued, would result in a bacterium that responds to no substrate, or a fish that has no functionality, can't be qualified as anything but loss. The change makes the organism better in some environments, but it's not the kind of change in complexity that gets one from bacteria to biologist.

I don't see how gaining the ability to fly would be a decrease. That is exactly the opposite kind of change from loss of eyes. Successive changes of that sort would seem to get us to where we are, given enough time. Even if your flying organism lost something like the capability to run well, it'd still be good. We can't do everything bacteria can do, but we're definitely more complex.

Posts: 701 | Registered: Thursday, November 30 2006 08:00
The Establishment
Member # 6
written Thursday, May 10 2007 04:23
No one disputes that information/entropy can be calculated; that is a concept from statistical mechanics. The controversial part is the claims the author makes using the equations. The definition of "specified complexity" is still quite vague and not quantified.

--------------------
Your flower power is no match for my glower power!

Posts: 3726 | Registered: Tuesday, September 18 2001 07:00
Electric Sheep One
Member # 3431
written Thursday, May 10 2007 05:15
Stillness, you speak as though specified complexity were obviously a meaningful concept, and so we information theory professionals ought to be able to quantify it well enough if we just tried for ourselves. But what we are telling you is that it really does not look like a meaningful concept, and so we do not know how to quantify it. The proposed quantification you cited is clearly inadequate.

quote: These bacteria have somehow learned to live on artificial sugar, and you call it a loss because it is a change which 'if continued' would leave them unable to eat anything? This sure seems like taking the specified complexity theory over common sense. But then when I push the complexity picture for birds, you revert to common sense in assessing gain and loss. Doesn't this suggest that the specified complexity idea is kind of flawed?

--------------------
We're not doing cool. We're doing pretty.

Posts: 3335 | Registered: Thursday, September 4 2003 07:00
Lifecrafter
Member # 7723
written Thursday, May 10 2007 05:41
If you don't get it, I'm not of much help quantitatively. My original quote was meant as I understand it: qualitatively. Life has a quality that distinguishes it from simply random or ordered phenomena. I imagine this is how Leslie Orgel meant it. You all are the ones that brought up quantitative measure when it really is not necessary for the point. And explicit all-inclusive concise definitions are notoriously difficult. (Try defining "life," "science," or "species" in such a way that includes everything belonging to those categories, but does not include things that do not.) When I try to give them they are never satisfactory for you. The concept is clear for anybody who wants to see it, though. Whether you agree with it or not is entirely up to you.

I know I joke around a lot, but I'm dead serious here. I can't remember who was saying that this discussion was pointless, but I thoroughly disagree. I'm glad I didn't quit sooner. The last page of the other thread was especially educational for me. I got some very honest and refreshing responses from Kel and SoT, in particular. To hear and read and come to conclusions is one thing, but to actually be part of the experience is another. I'm sad it got lost, but what needed to be said was said in my eyes.

Since you all were honest with me, I'll be honest as well. My religious beliefs are not subject to scientific reasoning or observable reality either. They can be enhanced or accented by them, but not torn down. So, if mutations somehow caused "irreducibly complex" systems to arise I'd be shocked, but I would accept it. It's not really a religious belief. It would change my view of creation, but never of the Creator. The latter is beyond science.

Posts: 701 | Registered: Thursday, November 30 2006 08:00
Lifecrafter
Member # 7723
written Thursday, May 10 2007 05:44
SoT, your bird analogy is flawed. Animals can run in all of those environments as well. In what way does that fit?

Posts: 701 | Registered: Thursday, November 30 2006 08:00
Lifecrafter
Member # 7723
written Thursday, May 10 2007 06:01
quote: My original use of specified complexity had nothing to do with comparisons of living things. This is why I hate posting links. I was originally comparing living things to non-living things that occur in nature. This sidetracked us to a discussion we already had on the other thread.

Posts: 701 | Registered: Thursday, November 30 2006 08:00
Off With Their Heads
Member # 4045
written Thursday, May 10 2007 07:12
quote: Saying that life has specified complexity and other things don't is completely meaningless unless you can give a good definition of specified complexity. I suppose a qualitative definition would be sorta okay, as long as it didn't contradict itself, but your examples seem to contradict themselves.

quote: True, but that doesn't make it any less important. Scientists do give good definitions of things in real science.

--------------------
Arancaytar: Every time you ask people to compare TM and Kel, you endanger the poor, fluffy kittens.
Smoo: Get ready to face the walls!
Ephesos: In conclusion, yarr.
Kelandon's Pink and Pretty Page!!: the authorized location for all things by me
The Archive of all released BoE scenarios ever

Posts: 7968 | Registered: Saturday, February 28 2004 08:00
Raven v. Writing Desk
Member # 261
written Thursday, May 10 2007 07:17
"How do you measure specified complexity?" I think it would help, Stillness, if you just explained in your own words. You don't need to give lots of details, and it's totally OK if you want to measure it qualitatively rather than quantitatively. But "qualitative" is a quality of measurement, it doesn't actually explain how it is measured. That's what needs explaining. If you mean to define it the way the Supreme Court defined indecency -- "I know it when I see it" (but can't explain how I make that distinction) -- then just come out and say so. -------------------- Slarty vs. Desk • Desk vs. Slarty • Timeline of Ermarian • G4 Strategy Central Posts: 3560 | Registered: Wednesday, November 7 2001 08:00 |
Off With Their Heads
Member # 4045
written Thursday, May 10 2007 07:29
To be fair, it was a concurring opinion, and Potter Stewart was failing to define "hard-core pornography," and the court came up with a better definition less than a decade later (as is detailed in the relevant article). So even the Supreme Court can define things, given enough time. :P

--------------------
Arancaytar: Every time you ask people to compare TM and Kel, you endanger the poor, fluffy kittens.
Smoo: Get ready to face the walls!
Ephesos: In conclusion, yarr.
Kelandon's Pink and Pretty Page!!: the authorized location for all things by me
The Archive of all released BoE scenarios ever

Posts: 7968 | Registered: Saturday, February 28 2004 08:00
Law Bringer
Member # 335
written Thursday, May 10 2007 10:48
I think the last paragraph of the link you posted is the most telling. Increases of "specified complexity" are possible in principle but infrequent in practice. This is absolutely true, and nobody, IDer or neo-Darwinist, disagrees. The leap to "unlikely means it didn't happen" is where we disagree.

—Alorael, who thinks that the qualitative definition of specified complexity that seems to be coming up is utility. Doing something is more useful than doing nothing. Doing something poorly is less useful than doing it well. Argument again comes up in border cases: is doing ten things poorly better or worse than doing only one thing extremely well?

Posts: 14579 | Registered: Saturday, December 1 2001 08:00
Lifecrafter
Member # 7723
written Thursday, May 10 2007 11:42
Alo, I have stated and restated for so long that my argument is that purposeful action is a better explanation for what we see. I have constantly used observable reality as a basis for determining what happened in the past. If I have used terms like "impossible" and "never," it certainly wasn't intended to be the strength of my argument.

Specified complexity is self-defining. It describes patterns that are specified and complex. If you want different words, call it the quality or state of having aperiodic and nonrandom form.

Posts: 701 | Registered: Thursday, November 30 2006 08:00
Off With Their Heads
Member # 4045
written Thursday, May 10 2007 12:11
I'm confused. What on earth is your actual argument? You've conceded that forward evolution can and does occur, you've conceded that evolution can and does account for at least some of the variation in life that we see, and now you've conceded that irreducible complexity is not actually irreducible. What is left of the claims you've been making for twenty pages? If I ask you why you think design is a better explanation than evolution, what possible evidence can you have left?

And your first definition is the most shamelessly circular I have ever seen. Your second is fine, I guess, but non-periodic and non-random forms can arise without the intervention of an intelligence, as we've discussed over and over again.

[ Thursday, May 10, 2007 12:15: Message edited by: Kelandon ]

--------------------
Arancaytar: Every time you ask people to compare TM and Kel, you endanger the poor, fluffy kittens.
Smoo: Get ready to face the walls!
Ephesos: In conclusion, yarr.
Kelandon's Pink and Pretty Page!!: the authorized location for all things by me
The Archive of all released BoE scenarios ever

Posts: 7968 | Registered: Saturday, February 28 2004 08:00
Lifecrafter
Member # 7723
written Thursday, May 10 2007 13:12
Now I see the problem! My argument has never changed (except for me dropping the thermodynamic part). An irreducibly complex system has several well-matched parts that function together such that if any one is removed, the system fails. In living things the parts offer no value by themselves, but only as part of the whole. This says nothing of impossibility.

My argument is that we don't see nature make systems like this, but we do see purposeful action create them. Mechanisms of this sort are indicative of planning. Therefore purposeful action is a better explanation.

Posts: 701 | Registered: Thursday, November 30 2006 08:00
The Establishment
Member # 6
written Thursday, May 10 2007 13:31
We don't see nature make systems like this (as you define them above) because our observation time is dramatically limited. The only real conclusion we can draw is that nature does not do this on the time scales we observe. This does not, however, preclude significantly longer time scales.

Is design better? Well, that is difficult to say, because our experience with design is with things that occur rapidly, on a human time scale. A way to show design would be to show very rapid changes (on the order of decades, although I'll even take centuries or a few millennia) occurring early on in the history of Earth; then you would have a stronger case.

--------------------
Your flower power is no match for my glower power!

Posts: 3726 | Registered: Tuesday, September 18 2001 07:00