
If there is nondeterministic Smith style noise, this kind of robustness is just the kind we expect to single out when we consider the multiple worlds view of Qlogical depth. (Yet Smith's picture precludes it!) An autoplectic system would remain stable over possible universes, whereas the complexity of a homoplectic system would be wiped out by the random variance across universes (here it is useful to appeal to the vacuum fluctuations mentioned above). Taking Wolfram's distinction as a point of departure, Bennett wonders interestingly if dissipative processes such as turbulence, which are not explicitly computational, could still generate logical depth. He wonders if something like a waterfall could be an autoplectic process which contains objective evidence of a long dynamical history: is there any objective difference between a day-old waterfall and a year-old one? He does not answer the question, although he does cite Ahlers and Walden (1980) for evidence of 'fairly long-term pseudorandom behaviour near the onset of convective turbulence' (Bennett 1990, p. 147). In any case, we have seen that KCS is not the only description of complexity on offer, and we have seen that Smith's argument for counting the outputs of chaotic systems as random only works if we are already prepared to count them as random. We have seen some of the advantages of using logical or Qlogical depth as our measure of complexity, although we have noted that they share one shortcoming of the KCS description in that arbitrary strings may receive different measures of complexity simply because they can be described succinctly as initial conditions of chaotic dynamical systems. We have made a fairly critical exploration of Smith's closing conclusions about chaos, noise, and limited precision simulations and seen that there is much more to be said about the problem than is at first apparent using only the KCS description.
It is time now to turn to a general class of problems which arises for our kinds of purposes when we try to use any of the standard measures of complexity to describe dynamical systems. Consider some of the following representations of a string:

Representation 1: 010000100110110001100101011100110111001101100101011001000010000001100001011100100110010100100000011101000110100001100101001000000111000001100101011000010110001101100101011011010110000101101011011001010111001001110011001110100010000000100000011001100110111101110010001000000111010001101000011001010111100100100000011100110110100001100001011011000110110000100000011000100110010100100000011000110110000101101100011011000110010101100100001000000111010001101000011001010010000001100011011010000110100101101100011001000111001001100101011011100010000001101111011001100010000001000111011011110110010000101110

Representation 2: 426C657373656420617265207468652070656163656D616B6572733A2020666F722074686579207368616C6C2062652063616C6C656420746865206368696C6472656E206F6620476F642E

Representation 3: Blessed are the peacemakers: for they shall be called the children of God.

Representation 4: 123456789ABCDEF
Representation 5: Matt 5:9

The string is given most recognisably by the third representation. The first is the same text written in binary (base 2) ASCII equivalent, where each 8-bit byte corresponds to a particular character, and the second is the same in hexadecimal (base 16) form, where each pair of characters indicates a byte. The fourth representation is a simple kind of code in which entire words are indicated by single digits, and the final, very economical representation is the book, chapter, and verse from the King James Bible. A brief consideration of these five ways of saying the same thing reveals that complexity is not a straightforward measurement and that we must understand the relationship between complexity and representation in order to make any sense of complexity as applied to philosophical questions about minds. If we type the characters of representation 3 into a modern personal computer, they will be stored in the machine as simple binary switches from which we could extract exactly the bits in representation 1. When the computer actually fetches these bits from memory, it always does so in groups of multiples of 8 bits, often 32-bit chunks sometimes called words or double words. These 32-bit chunks correspond to sets of 8 characters in representation 2. Thus the first 32 bits of representation 1, 01000010011011000110010101110011, correspond to the first 8 characters, 426C6573. Although everything in a digital computer is stored in binary form, computer scientists often use the simpler hexadecimal representation when analysing a machine or when writing low-level code because it is shorter and faster to understand. Most people find it more natural to think in terms of hexadecimal representations than in binary expansions of the same thing. But most people find it even easier to understand the text of the third representation, and in our everyday communications this is the form we use.
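The correspondence between the three encodings can be checked mechanically; the following is a minimal sketch using Python's built-in byte handling (the variable names are illustrative, not from the original text):

```python
# Verify that the binary (base 2) and hexadecimal (base 16) ASCII
# representations encode the same text as the plain form.
text = "Blessed are the peacemakers: for they shall be called the children of God."
data = text.encode("ascii")

# Representation 1: one 8-bit group per character.
binary = "".join(format(b, "08b") for b in data)
# Representation 2: one pair of hexadecimal characters per byte.
hexadecimal = data.hex().upper()

print(binary[:32])      # first 32 bits: 01000010011011000110010101110011
print(hexadecimal[:8])  # first 8 hex characters: 426C6573
# Decoding the hexadecimal form recovers the plain text exactly.
assert bytes.fromhex(hexadecimal).decode("ascii") == text
```

The first 32 bits and first 8 hexadecimal characters printed here match the openings of representations 1 and 2 above.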
For our purposes, we might very loosely call this the natural representation. In order to understand what is being said by either of the first two representations, we must understand how the bit patterns or hexadecimal patterns correspond to the characters of our ordinary-text natural representation. This correspondence is given by the ASCII table, the American Standard Code for Information Interchange. Without the standardised code, we would have no way of understanding the first two representations. We can say something similar about the last two representations. Here, in order to understand the string 123456789ABCDEF, we must know the special table given above. Rather than representing individual characters with a special code, as in ASCII, here we are representing entire words. Likewise in the last representation, we are denoting an entire sentence by a simple verse reference. If we know what the King James Bible is, then we can understand Matt 5:9 just as well as we can understand 123456789ABCDEF if we know the special table. The point of all this is that whenever we analyse the complexity of a string, we will almost always involve some representational scheme. Even when we considered the behaviour of the simplified Lorenz model and the bit strings it generated, we began with the representational scheme which said 'when the trajectory is on this wing of the attractor, we'll call it a 1, and when it is on this other wing, we'll call it a 0'. (The only case in which a representational scheme won't be involved is if we are analysing output in a system's native representation; more on this presently.) But each representational scheme includes its own degree of order or complexity. For example, we could generate the string in representation 4 very simply by just writing down the highly ordered list of the first 15 positive hexadecimal numbers.
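A word-for-digit scheme of this kind can be sketched as follows. The exact table assumed in the text is not reproduced here, so the tokenisation below (each word or punctuation mark of the sentence, in order, assigned the next hexadecimal digit) is a hypothetical reconstruction consistent with the 15-digit string:

```python
import re

# Hypothetical reconstruction: split the sentence into 15 tokens
# (words plus punctuation) and assign each the next hexadecimal digit.
sentence = "Blessed are the peacemakers: for they shall be called the children of God."
tokens = re.findall(r"[A-Za-z]+|[:.]", sentence)
digits = "123456789ABCDEF"

table = dict(zip(digits, tokens))   # digit -> token
encoded = digits[:len(tokens)]

print(encoded)                              # 123456789ABCDEF
print(" ".join(table[d] for d in encoded))  # recovers the tokens in order
```

Writing the encoded string is then just writing the ordered digit list, which is the point made above.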
But if the table defining the representational scheme had been different, such that the string were instead something like C326B5F847D1E9A, generating the string could be much more difficult. By a simple change of the representational scheme, the string has gone from highly ordered to (apparently) not very ordered at all. Yet we are considering, in a way, the same basic thing: the string from representation 3. Likewise, ASCII representations of plain text like the first two above can easily be compressed by standard methods by about 50%. Thus they have some discernible pattern and complexity. Yet if we run the same strings through a few rounds of a randomising algorithm such as DES, the Data Encryption Standard, the strings may be extremely difficult to compress.^{27} Where has the complexity gone? In a clear sense, it is present in the data encryption algorithm, which has essentially replaced the standard ASCII representation system with a more complicated one. Thus different ways of representing particular things such as English sentences carry with them different effects on measurable complexity. Insofar as measuring the complexity of some system almost always involves the use of some representational system, measuring complexity almost always involves a prejudgement inherent in the representational system we choose. Some representational schemes produce very dense and uncompressible strings, where superfluous information is stripped out by the complexity of the scheme, whereas some produce very compressible strings with plenty of inessential data. Indeed, particular representational schemes will give preference to particular kinds of order within that which is being represented. For instance, the kind of representation used in the last example above gives preference to sentences which happen to be verses of the Bible.
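The compressibility contrast can be sketched with the standard library's zlib. Since DES itself is not available in the Python standard library, uniformly random bytes stand in here, as an assumption, for the output of a few rounds of encryption; note also that a highly repetitive sample compresses far better than the roughly 50% typical of ordinary prose:

```python
import os
import zlib

# Plain ASCII text has discernible pattern and compresses well.
plain = ("Blessed are the peacemakers: for they shall be called "
         "the children of God. " * 40).encode("ascii")
packed = zlib.compress(plain, 9)
print(len(packed) / len(plain))   # well under 0.5 for repetitive text

# Random bytes stand in for DES-randomised output: essentially
# incompressible, so the 'compressed' form is no shorter.
noise = os.urandom(len(plain))
packed_noise = zlib.compress(noise, 9)
print(len(packed_noise) / len(noise))   # close to (or above) 1.0
```

The compressor finds nothing to exploit in the randomised stream, just as the text suggests.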
The verse reference is still a plain text representation, but it relies for its brevity upon the vast information stored in the correlations between short book names and verse indices to give a very dense representation.^{28} We've established that the same thing represented two different ways may appear more or less complex according to the characteristics of the representational system. One example of this is the Bible passage represented as binary ASCII as opposed to binary ASCII which has been randomised by DES. In a way, both patterns represent the same thing under different interpretations, yet the first appears logically deep because it is compressible while the other looks both KCS random and logically shallow. Moving from the representational scheme of ordinary ASCII to ASCII with a few rounds of DES thoroughly alters the complexity of the 'same string' under both the KCS and the logical depth measures. With the observation to hand that the choice of representational scheme largely determines the outcome of a complexity measure applied to a system, let's consider what it would mean to measure complexity in a system's 'native representation'. The easy example here is analysing the internal functions of a digital computer, since the complexity measures we've considered are all meant for application to bit strings, and that's exactly how digital computers represent things internally. Since monitoring bit streams at some points would yield highly ordered patterns and at others highly disordered patterns, we'll consider two examples. First, suppose we watched the signal across control lines from a floppy disk controller to a disk drive. The fact that there is a limited number of possible meaningful signals which the controller can send to the disk drive (turn on the main motor, strobe a stepper motor switch or a data latch, reset the read/write head, etc.) means that only particular patterns will ever occur.
There will also be a higher level of order in that particular commands frequently follow each other (such as turn on the main motor and strobe a stepper motor) while others never do (such as strobe a stepper motor and then turn on the main motor). Note that these individual patterns themselves might be uncompressible, but the order present in the way in which they are used will give us an easy way of compressing the overall pattern.^{29} Thus on an appropriate scale the stream of bits we might read across these lines will look highly ordered and easily compressible on any of the three measures of complexity we've considered. The immediate lesson of this example is that highly disordered or random (but recurring) patterns might be put to very ordered, functionally relevant use in a digital computer. While the individual patterns might be highly disordered, for the computer they represent instructions in a well ordered process. Now, in the same digital computer system, we could also measure the signals along functionally relevant lines where the patterns would not be nearly so regular. For instance, in a computer with 9-bit parity-checked RAM, we might decide to sample the line carrying the parity bit. Computers with parity-checked RAM store an extra bit along with every 8 bits of data to indicate whether there is an odd or an even number of high bits (i.e., 1s) in that memory location. While there will be exceptions for programs which frequently access highly redundant sequences of bytes in memory, in general if we watch a computer for long enough while it executes some code, the value on the parity line leading to the memory control chip will be patternless, indicating a 1 about as frequently as a 0. Under the three measures of complexity we've considered, the strings we measured along this line would be considerably more random and uncompressible than the strings we measured along the disk control line.
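The contrast between the two sampled lines can be simulated in a few lines; this is a toy sketch, not a model of any real controller, and the four random 'command' patterns and the popcount parity rule are assumptions for illustration:

```python
import os
import zlib

# Four fixed but internally patternless 'command' patterns, reused
# over and over, as on the hypothetical disk control lines.
commands = [os.urandom(8) for _ in range(4)]
control_stream = b"".join(commands[i % 4] for i in range(500))

# The recurrence makes the overall stream highly compressible even
# though each individual pattern is random.
ratio = len(zlib.compress(control_stream, 9)) / len(control_stream)
print(ratio)   # a small fraction

# The parity line: one bit per byte of memory traffic, 1 if the byte
# has an odd number of set bits. For varied data it is near balanced.
traffic = os.urandom(4000)
parity = [bin(b).count("1") % 2 for b in traffic]
print(sum(parity) / len(parity))   # roughly 0.5
```

The compressor exploits the higher-level order in how the patterns recur, while the simulated parity stream offers it nothing, mirroring the two measurements described above.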
The uninitiated philosopher of digital computers might even conclude that the value on this parity line constituted a computer equivalent of nondeterministic noise. The immediate lesson from this example is that even in a highly ordered system such as memory accessing in a digital computer, appropriate measurements may yield strings with little or no apparent pattern. Now we have two examples of sampled signals from a paradigmatically well ordered digital computer, and one is highly compressible while the other is fairly random. Yet they are both from functionally relevant areas of an entirely deterministic system which is completely free from noise at the scale we are considering. And since it is not necessary to translate the quantities we're measuring in any way, we needn't worry that our choice of representational scheme has influenced the complexity result in any way. We have noticed that highly disordered patterns might be put to highly ordered uses and that likewise highly ordered processes might generate highly disordered patterns. Our entire discussion has been motivated by questions about the complexity of chaotic signals and their relationship to nondeterministic noise, specifically as these questions bear on the presence of chaotic processes in the brain. After exploring the relationship between representation and complexity as well as asking particular questions about 'naturally represented' complexity in the digital computer example, we are now in a position to say something about what complexity measures might say about chaos in the brain. First let's consider the easier question: if we sample signals in the brain, convert them into binary patterns, and analyse their complexity, what conclusions might we draw? Well, if we happen to sample in an area where signals look chaotic (as Freeman 1989, 1991, for instance, has done), we might find a KCS random pattern which could also be a logically deep one.
Regardless of whether the pattern actually was logically deep, it might be akin to the individual command patterns which the disk controller sends to the disk drive. That is, the chaotic pattern could itself play a functionally relevant rôle in a process ordered at another level. To use an example inspired by Freeman, the chaotic pattern might simply be the firing pattern of neurons in an area of the olfactory cortex which is presently sensing the odour of bunny food. Just as the disk drive controller signals may be patternless yet represent for the computer functionally relevant commands in a well ordered process, the chaotic firing patterns in the rabbit's cortex may represent for the rabbit a piece of data functionally relevant to the process of getting something to eat. Moreover, despite the fact that detectable order might not emerge until signals are analysed at a higher functional level, we cannot even conclude that the special properties of chaos such as sensitive dependence on initial conditions are irrelevant at the lower level. (Something like this we