
qua chaos become irrelevant. Now we must examine more closely the rationale behind equating low-level chaos with noise; the question we should keep in mind is this: when a physical system is influenced by noise, to what is it listening? Smith begins his exploration of the issue with an appeal to the Kolmogorov/Chaitin/Solomonov (henceforth, KCS) definition of algorithmic complexity. Algorithmic complexity, also known as algorithmic information content, algorithmic randomness, or algorithmic entropy, can usefully be applied to physical entities as well as to the bit strings to which we will apply it here. It is the length, in bits, of the most concise description (usually, the shortest program for a Universal Turing Machine or Bernoulli-Turing Machine^{21}) of the physical entity for a given level of accuracy. The measure is easily applied to a bit string by asking whether the string can be compressed in such a way that the decompression algorithm can be specified easily enough that, when combined with the compressed bit string, the result is shorter than the original string. That is, the complexity of the compression algorithm is irrelevant; we are concerned only with how easily the string can be decompressed. Smith notes that in general the question of whether an arbitrary string is KCS random is undecidable. If we know a way of compressing a string appropriately, we can say the string is not random; but if we don't know how to compress a string, we can never say the string is random, because there might still be some as yet undiscovered algorithm which could perform the feat. The observation is closely related to Gödel undecidability and to our previous discussion of computability. We are interested in applying algorithmic complexity to binary codings of the behaviour of chaotic systems.
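The one-sided character of this test is easy to make concrete. True KCS complexity is uncomputable, so the sketch below uses an off-the-shelf compressor (zlib) merely as a crude upper bound on description length; the particular strings and thresholds are illustrative choices of mine, not Smith's:

```python
import random
import zlib

def description_length(bits: str) -> int:
    """Bits needed for a zlib-compressed copy of the string: an upper
    bound on its algorithmic complexity (the true KCS value is
    uncomputable, so any real compressor only ever bounds it)."""
    return 8 * len(zlib.compress(bits.encode("ascii"), 9))

ordered = "01" * 500  # 1000 bits of obvious pattern
random.seed(0)
disordered = "".join(random.choice("01") for _ in range(1000))

# Finding a short description certifies that a string is NOT random...
print(description_length(ordered))      # far below 1000 bits
# ...but failing to compress proves nothing: some cleverer algorithm
# might still exist, which is why KCS randomness is undecidable.
print(description_length(disordered))
```

The asymmetry is exactly the one Smith notes: the first result settles the question, the second leaves it open.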
Smith's example is to interpret the Lorenz model as a shift map over a bit string describing which wings of the Lorenz attractor a given point will map to at stroboscopically sampled times. That is, we could start the Lorenz model at a given point on the Lorenz attractor (for our purposes, we can read 'on' as 'arbitrarily near') and then measure it at periodic times to note the wing of the attractor to which the point has moved. This generates a binary expansion describing the gross behaviour of the trajectory on which the point lies, and we can apply the KCS measure of randomness to this binary expansion. This is the point at which it becomes confusing to sort out exactly what Smith's argument might be. He notes that most finite sequences count as random by the KCS definition, and in fact the proportion of sequences of a given length which cannot be specified by a program at least k bits shorter (i.e., the proportion of random sequences) must be greater than 1 − 2^{−k}. (Of course this measure is more accurate the greater the string length we are considering.) Thus, he observes, the portion of sequences which can be compressed by more than 20 bits will be less than one in a million. He goes on to suggest, essentially, that since almost every binary sequence of any considerable length is KCS random, the behaviour of the Lorenz model understood as a shift map is almost always KCS random. In his own words, 'an arbitrarily chosen "seed" x_{0} will, with a probability which can exponentially approach one, yield a sequence of n visits to the two sheets of the (simplified) Lorenz system which is as KCS random as almost every sequence of n coin tosses.' (p. 67) He is careful to point out that we are here concerned with a measure of the randomness of the output of the system, not of the actual means by which the output is produced.
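The shift-map reading of the model is easy to realise concretely. In the sketch below the integration scheme, the step sizes, the sampling interval of 0.5 time units, and the use of the sign of the x coordinate as the wing indicator are all illustrative choices of mine, not Smith's:

```python
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """The Lorenz equations with the classic chaotic parameters."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(f, s, dt):
    """One fourth-order Runge-Kutta step of the flow f from state s."""
    k1 = f(s)
    k2 = f(tuple(si + dt / 2 * ki for si, ki in zip(s, k1)))
    k3 = f(tuple(si + dt / 2 * ki for si, ki in zip(s, k2)))
    k4 = f(tuple(si + dt * ki for si, ki in zip(s, k3)))
    return tuple(si + dt / 6 * (a + 2 * b + 2 * c + d)
                 for si, (a, b, c, d) in zip(s, zip(k1, k2, k3, k4)))

def wing_sequence(seed, n_samples, dt=0.01, stride=50):
    """Sample the trajectory stroboscopically (every stride*dt time
    units) and record which wing of the attractor it is on, taking
    the sign of x as the wing indicator."""
    s = seed
    bits = []
    for _ in range(n_samples):
        for _ in range(stride):
            s = rk4_step(lorenz, s, dt)
        bits.append("1" if s[0] > 0 else "0")
    return "".join(bits)

print(wing_sequence((1.0, 1.0, 1.0), 40))
```

Note that the sketch also bears out Smith's closing caveat: however incompressible the resulting bit string may turn out to be, the generator producing it is perfectly deterministic.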
That is, he is not saying that the Lorenz model itself is somehow random (it is, after all, strictly deterministic in the original [P1] sense); he is saying only that the output of the model can in some sense be called random. Computer scientists would use the term 'pseudorandom' to describe this sense of randomness: a string may not be distinguishable as nonrandom, and may in fact not be random by the KCS definition, but it did not appear nondeterministically. Smith goes on to suggest that because there may be underlying order in chaotic output, even if that order is not such as to make the output compressible (he notes the example of output from the logistic difference equation, which may appear at KCS-random intervals on a well-defined curve), there is no difficulty in principle with distinguishing 'true chaos' from 'mere noise'. (p. 69) But he finishes with a largely unargued series of assertions which amount to little more than question-begging on some very significant ontological points. He says that chaos in mathematical models can be viewed as the extraction of detail from expansions of the real numbers used to specify the system's states. But, he says, returning to his argument which we have considered previously, the abstractions to which we typically apply chaotic models don't really have any detail to extract. Thus, he continues, physical phenomena shouldn't really be modelled in the infinitely detailed real number system. (But recall the earlier point that models generally must work with a number system which is at least infinite.) Instead, we need noisy models with only limited-precision descriptions of physical phenomena. All physical systems, he suggests, are subject to low-level random noise (here I assume he means truly nondeterministic noise), and this is essentially what we get when we model chaotic systems on digital systems with limited degrees of precision and ever-present round-off errors.
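The claim that limited-precision digital simulation itself supplies something functioning like low-level noise can be illustrated with the logistic map. The choice of map, of seven retained decimal digits, and of sixty iterations are all mine; this is a sketch of the phenomenon, not a reconstruction of Smith's argument:

```python
def logistic(x):
    # The logistic difference equation at r = 4, fully chaotic.
    return 4.0 * x * (1.0 - x)

def divergence(x0=0.1, digits=7, steps=60):
    """Iterate the map twice from the same seed: once in full double
    precision, and once rounding to `digits` decimal places after each
    step, mimicking a lower-precision machine.  Returns the largest
    gap seen between the two trajectories."""
    x_full = x_rounded = x0
    worst = 0.0
    for _ in range(steps):
        x_full = logistic(x_full)
        x_rounded = round(logistic(x_rounded), digits)
        worst = max(worst, abs(x_full - x_rounded))
    return worst

print(divergence())
```

Because the map roughly doubles any discrepancy on each iteration, the round-off perturbation, initially at the seventh decimal place, is amplified within a few dozen steps until the two trajectories are completely decorrelated: the truncation behaves like an injected noise source.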
Thus, conveniently, our actual practice of exploring mathematical models through computer simulations is just exactly what we need to understand the observed qualitative behaviour of real physical systems. He concludes with the statement that 'the models that matter are the ones as run on computers. And the random behaviour of this sort of model (like the random behaviour of real processes) will have a mixed origin, being partly due to an approximation to "pure" chaotic structure and partly due to noise.' (p. 71)
There is a huge range of topics within complexity theory which should be explored in any kind of comprehensive treatment of this issue, but I cannot touch on more than a small portion of them now. It will be helpful, though, to make at least a few observations about Smith's application of KCS complexity to chaos theory. I will note some peculiarities about using complexity in the way Smith has, discuss some alternative measures of complexity which may yield more fruitful conclusions about the relationship between chaos and noise, and finally touch upon philosophical questions about representation in cognitive systems and how they bear on complexity and randomness. First let's reiterate and emphasise Smith's observations about the prevalence of strings which are random under the KCS standard. In the infinite space of all possible infinite strings, not only is the measure of the set of nonrandom strings zero, but there is an uncountably infinite set of strings which are noncomputable and which thus cannot be generated by any algorithmic process, regardless of the length of the string we might use to specify the process. Thus even if every single string which could be recursively generated could also be generated by a finite program shorter than the string itself (a ridiculous proposition, of course), we could still say that 'most' strings were random (in exactly the same sense in which we can say that there are infinitely more real numbers than integers).
This is true even when we include the crypto-regular strings such as the binary expansion of π or of the square root of 2. Keeping in mind just how many more strings there are out there which are random rather than nonrandom under the KCS definition, let's have another look at Smith's argument that the probability of KCS-random output from the Lorenz model, understood as a shift map over a binary string, exponentially approaches unity. Smith argues, with a simple variation of our note above, that the proportion of sequences of a given length which cannot be specified by a program at least k bits shorter (i.e., the proportion of random sequences) must be greater than 1 − 2^{−k}. Since the initial n bits of any binary string will be compressible by more than k bits with probability less than 2^{−k}, the output of a deterministic shift map operating on the string will be at least as algorithmically random as all but 2^{n−k} sequences of n bits produced by tossing a coin.^{22} Thus, as we noted above, he concludes that an arbitrarily chosen initial condition for the Lorenz model will, with probability exponentially approaching one, produce a sequence of n visits to the two wings of the attractor which is as algorithmically random as almost every sequence of n coin tosses. But Smith is here merely playing with probabilities; the argument needs considerable tightening if we are to believe it. As it stands, it makes no explicit appeal to the fact that the system under consideration is chaotic, and the assumption which I believe it is implicitly making begs the question at hand. Specifically, Smith's argument applies to any string of n bits, regardless of how we produce it. Suppose I ask about the algorithmic randomness of my friend Kristina's next email message to me.
Given that Kristina's message is n bits of ASCII text, it is still true that, with probability exponentially approaching one as we consider longer and longer messages, her message will be as algorithmically random as almost every sequence of coin tosses of the same length. But so far (except maybe for the occasional message sent after a party), Kristina has managed to avoid ever sending me a message which I would be prepared to call algorithmically random or even mildly mixed up. How has my friend escaped these extraordinary probabilities that she will fall into the randomness trap just like our hapless Lorenz model? The answer is easy to recognise: Kristina's messages are all constrained by certain factors which impose an order on them. She uses only a small portion of the lower half of all possible ASCII combinations in any given message, the sequences of characters almost always make English words which form into English sentences, and so on. It is not surprising that Kristina manages to beat the odds even with extremely long messages, because we assume in advance that she imposes order on them before they get sent out. That is, we assume that she does not create her messages randomly. The supporter of Smith's line of argument might well acknowledge that the argument fails in this case and make explicit an assumption which cures it: namely, that the argument should apply to all and only those strings which we do not believe to have been generated by some process which imposes order. But then the argument is essentially saying that the probability exponentially approaches one that any binary string which we do not believe in advance to be ordered will in fact be as KCS random as almost every string of coin tosses of the same length. So I can take comfort in the fact that every alleged sentence allegedly uttered in a language I do not understand really is probably just random anyway!
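The point about ordered messages beating the odds can be checked empirically, again with an ordinary compressor standing in for the uncomputable KCS measure; the sample text and thresholds are mine:

```python
import os
import zlib

# A message constrained by English spelling and grammar, repeated the
# way ordinary correspondence repeats its stock of words and phrases.
english = (b"It is not surprising that ordered messages beat the odds: "
           b"spelling, grammar and a small alphabet impose order on "
           b"them in advance of their being sent. ") * 4

coin_tosses = os.urandom(len(english))  # one byte per eight coin tosses

def ratio(data: bytes) -> float:
    """Compressed size over original size: below 1 means order found."""
    return len(zlib.compress(data, 9)) / len(data)

print(ratio(english))      # well under 1: the imposed order is detected
print(ratio(coin_tosses))  # about 1: nothing for the compressor to find
```

An English message of the same length as a run of coin tosses sits in a vanishingly small corner of the space of possible strings, exactly because a process imposed order on it before it was sent.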
The problem here is that the argument gets us nowhere unless we are already prepared to assume that a binary string created by the simplified Lorenz model is going to look random anyway. But the question we are trying to answer is precisely whether the output of the simplified Lorenz model really does look random. Moreover, mightn't there be something wrong with saying that such and such a deterministic system could produce output exactly as random as a particular set of coin tosses? Later we will return to questions about distinguishing deterministic chaos from nondeterministic disordered behaviour, as well as about some of the underlying assumptions of complexity analysis, but for now this problem moves us in the direction of observations about chaotic systems as information compressors and what their rôle as information compressors says about the definition of complexity on offer. Smith notes on page 27 that it appears that every sequence of integers (such as 4, 2, 6, 13, 1, 99…) has a corresponding point or set of points on the Lorenz attractor which will take exactly the number of orbits around each wing of the attractor which is named in the sequence. So for our example, there is some initial condition which initiates a trajectory which travels around the left wing four times, the right one twice, the left one six times, the right thirteen, and so on. (Alternatively, there is a point which will generate any binary sequence.) Proving this is no small task, but it is an idea which I believe most mathematicians would find highly plausible. Indeed, it seems no more implausible than the idea that any region in phase space will eventually be visited by some point lying in any other region, which is just topological transitivity, one of the defining characteristics of chaotic systems.
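The idea that some initial condition generates any given binary sequence is easiest to exhibit not for the Lorenz flow itself but for the simplest system conjugate to a shift, the doubling map on the unit interval. The construction below is the standard one, offered only as an analogue of the Lorenz case:

```python
from fractions import Fraction

def seed_for(sequence: str) -> Fraction:
    """Encode a target bit sequence as a dyadic initial condition:
    x0 = 0.b1 b2 b3 ... read as a binary fraction."""
    return sum(Fraction(int(b), 2 ** (i + 1))
               for i, b in enumerate(sequence))

def read_out(x: Fraction, n: int) -> str:
    """Iterate the doubling map x -> 2x mod 1 and record which half of
    the unit interval the point visits: a shift map reading off the
    seed's binary expansion one bit per iteration."""
    bits = []
    for _ in range(n):
        bits.append("1" if x >= Fraction(1, 2) else "0")
        x = (2 * x) % 1
    return "".join(bits)

target = "0100110111"
print(read_out(seed_for(target), len(target)))  # replays "0100110111"
```

Here the seed literally is the desired sequence, written as a binary fraction; the interesting question for the Lorenz model is whether some sequences have seeds which are far shorter than the sequences themselves.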
But if it is true, then it follows that chaotic systems such as the Lorenz model can be treated, in effect, as information compressors, exactly in those circumstances where the initial conditions generating a given sequence can be specified succinctly. It may not be a computationally tractable or even computable matter to determine one of the points in phase space which will lie on a trajectory following orbits describing an arbitrary sequence, and it certainly isn't the most efficient way of compressing a string in the general case, but this doesn't stop its being possible for there to be an initial condition at a computable point in phase space which will generate a description of any computable sequence whatsoever. Now, in some cases, specifying the initial point may take more bits than the desired string itself, and in some cases specifying the point together with a specification of how to generate the trajectory by applying the equations could also exceed the length of the string. But in other cases, there might be huge savings in length. For instance, every point on the Lorenz attractor specified by coordinates of a fixed length generates a fixed sequence of any length from one to infinity; by applying other recursive functions of a given length to the output of the model in terms of numbers of orbits, we can generate an entire class of similarly ordered such sequences. Indeed, the idea of reducing some lengthy description to a more manageable size by coding the description in a chaotic system is the basis of the secretive commercial work of pioneering chaos researcher Michael Barnsley. By applying the so-called Collage Theorem (Barnsley 1988), it is possible to achieve extremely compact and progressively more accurate approximations of a pattern with a chaotic iterated function system.
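The flavour of this compression idea (Barnsley's commercial schemes are proprietary; what follows is only the textbook construction) can be seen in a minimal iterated function system: three affine contractions, a dozen coefficients in all, encode the infinitely detailed Sierpinski triangle, rendered here by the so-called chaos game:

```python
import random

# Three affine contractions whose joint attractor is the Sierpinski
# triangle: an infinitely detailed pattern specified by a handful of
# coefficients.
MAPS = [
    lambda p: (0.5 * p[0],         0.5 * p[1]),
    lambda p: (0.5 * p[0] + 0.5,   0.5 * p[1]),
    lambda p: (0.5 * p[0] + 0.25,  0.5 * p[1] + 0.5),
]

def chaos_game(n: int, seed: int = 0):
    """Generate n points on the attractor by repeatedly applying a
    randomly chosen map: the 'chaos game' rendering of an IFS."""
    random.seed(seed)
    p = (0.0, 0.0)
    points = []
    for _ in range(n):
        p = random.choice(MAPS)(p)
        points.append(p)
    return points

pts = chaos_game(10_000)  # plot these and the triangle appears
```

The Collage Theorem supplies the converse direction: given a target pattern, find a few contractions whose collage approximates it, and the coefficients then serve as an extremely compact description of the whole image.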
Similar methods may be involved in the image-processing technique known as FITS (Functional Interpolating Transformation System), developed by Paris-based FITS Imaging for its software Live Picture. The software essentially translates a complex bit-mapped image into a set of equations representing the same picture. The image can then be manipulated extremely rapidly because the huge amounts of data in the original image don't need to be altered, just the extremely compact mathematical representation. Given the very high compression ratios possible with this kind of method, we can easily see that some binary sequences produced by the Lorenz model will not be very KCS random at all if we can specify their initial conditions concisely enough. But there is no obvious correlation between the number of digits we might use to specify an initial condition and the character of the binary sequence the ensuing trajectory will describe. Indeed, there