2. Psychological Indeterminism

cannot be because the behaviour of an individual trajectory depends on the dynamics of the system, not on the particular number system we use to specify that trajectory. Thus the KCS definition may allow that two very similar binary strings produced by the Lorenz system differ greatly in complexity according to whether each can be compressed down to an initial condition (specified with a limited number of bits) together with the governing equations of the Lorenz system itself. In other words, for strings of a given length there may be nearly arbitrary variance in the KCS measure of complexity. But this is a very counterintuitive notion of complexity! Why should the output of the Lorenz model objectively look any different when it started out at a point specified with only 64 bits of information than when it started out at a point specified with 128 bits? (We might make similar arguments by appealing to other compression strategies, but the point is that without reference to any compression standard at all except the system itself, we are left with a counterintuitive notion of randomness.) Together with the point above that there may be something wrong with a measure incapable of distinguishing between strings created by nondeterministic processes and those created by deterministic, ordered chaos, these observations suggest that the KCS framework for analysing complexity is inadequate for our purposes. Maybe there is a better way.
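
To make the worry concrete, here is a minimal sketch (my own illustration, nothing Smith offers) which integrates the Lorenz system from two nearby starting points, records a crude one-bit-per-step symbolic trajectory, and uses an off-the-shelf compressor as a rough, computable stand-in for KCS incompressibility. The Euler integration scheme, the parameters, the one-bit coding and the use of zlib are all assumptions made purely for the example; nothing here computes KCS complexity itself.

```python
# Minimal sketch only: zlib compression is a crude proxy for algorithmic
# (KCS) complexity, and the one-bit symbolic coding (sign of x) is an
# arbitrary choice. Step size, trajectory length and initial conditions
# are likewise assumptions for illustration.
import zlib

def lorenz_bits(x, y, z, n=4000, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz equations with a simple Euler step and emit
    one bit per step: '1' if x > 0, else '0'."""
    bits = []
    for _ in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        bits.append('1' if x > 0 else '0')
    return ''.join(bits)

# One initial condition specified coarsely, one with many more digits.
coarse = lorenz_bits(1.0, 1.0, 1.0)
fine = lorenz_bits(1.000000000000123, 1.0, 1.0)

for label, s in (('coarse IC', coarse), ('fine IC', fine)):
    print(f'{label}: {len(s)} bits -> {len(zlib.compress(s.encode()))} bytes compressed')

# The compressor treats the two strings essentially alike, even though one
# could in principle be regenerated from a far shorter description (the
# equations plus a few bits of initial condition) -- which is just the
# asymmetry complained about above.
```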


To start, the KCS definition of algorithmic complexity corresponds intuitively much more closely to randomness than to complexity. Complexity generally evokes the notion of something that is highly ordered rather than something uniformly disordered. I do not intend to debate the semantic point of whether we should consider uniformly disordered things complex or grant that honour only to things with hidden order; instead, I suggest simply that it is useful to consider this kind of difference between order and disorder, and that doing so may shed some light on the question we have been considering about the relationship between deterministic chaos and 'random' noise. We might expect a measure of complexity in the sense of the presence of order to return 'opposite' values compared to KCS complexity, and we might hope that in adopting such a measure we could also overcome some of the deficiencies of the KCS measure discussed above.

One such candidate measure is the logical depth of C.H. Bennett (1990, 1987). Logical depth is defined as the execution time of the shortest program for a universal computer which can generate a description of the object in question; more precisely, it is the harmonic mean of the execution times of all such programs, since there may be an infinite class of programs of the same length which could generate any finite object.23 The idea is that logically deep objects (or binary sequences, or whatever) should contain internal evidence of most plausibly having been the result of long computations or dynamical processes. (Note that wholly disordered strings can be generated quickly by long programs, whereas highly ordered strings might be generated more slowly by shorter programs.)
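
The parenthetical contrast can be made vivid with a toy example (again my own, and only loosely in the spirit of Bennett's measure): a short substitution rule iterated many times yields a long, ordered string only after a comparatively long computation, whereas a disordered string of the same length can be emitted immediately by a 'program' whose length is essentially that of the string itself. The particular rule and the sizes are arbitrary choices for illustration.

```python
# Toy contrast, not Bennett's measure itself. The 'deep' string comes from a
# short generating rule applied many times (long running time); the 'shallow'
# string stands in for a 'print this literal' program -- long description,
# instant output.
import os
import time

def deep_string(steps=20):
    """Iterate the two-symbol substitution a -> ab, b -> ba: a short rule
    whose output grows exponentially and takes correspondingly long to build."""
    rules = {'a': 'ab', 'b': 'ba'}
    s = 'a'
    for _ in range(steps):
        s = ''.join(rules[c] for c in s)
    return s

def shallow_string(n_bytes):
    """Stands in for a program that simply spells out a disordered string."""
    return os.urandom(n_bytes).hex()

t0 = time.perf_counter()
d = deep_string()
t1 = time.perf_counter()
s = shallow_string(len(d) // 2)  # same output length as the deep string
t2 = time.perf_counter()

print(f'deep string   : length {len(d)}, generated in {t1 - t0:.4f}s by a short rule')
print(f'shallow string: length {len(s)}, generated in {t2 - t1:.4f}s by (in effect) a long literal')
```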

The most striking difference between logical depth and KCS as measures of complexity is that KCS returns a high complexity for both nondeterministically created disordered sequences and arbitrarily chosen sequences described by the Lorenz model, whereas Bennett's measure in general gives low complexity to very disordered and perhaps nondeterministically created strings and higher complexity to strings generated (or generable) by chaotic processes. Rather than simply the 'opposite' value that we might have expected, logical depth manages to discriminate between sequences essentially in terms of the complexity of how the sequences may be generated (i.e., execution time) rather than the 'complexity' of the strings themselves.

An improvement on the logical depth measure, relatively minor but useful, has been offered by David Deutsch (1985b). Deutsch's measure of Q-logical depth is keyed to the harmonic mean of the execution times of the shortest programs for his own Universal Quantum Computer. The key point of difference from ordinary logical depth stems from the suggestion that in Nature, random states are generated not by 'long programs' but by short programs exploiting indeterministic hardware. The quantum analogue of logical depth solves this minor problem by generating random sequences with very short programs. Yet, since we are primarily concerned with the execution times of particular programs, this change is unlikely to alter the measure of complexity by anything other than a uniform constant amount. Despite the elegance of Deutsch's quantum computer24, it is difficult to see how the measure of complexity returned would differ markedly from that provided by logical depth measured with a Bernoulli-Turing Machine (i.e., a Universal Turing Machine equipped with a random number generator).25
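
The point about short programs exploiting indeterministic hardware amounts to very little code. In the following sketch (mine; os.urandom merely stands in for the Bernoulli-Turing Machine's random source), the 'program' that emits a disordered n-bit string has a length independent of n, so on a machine with such hardware a disordered string is generated both briefly and quickly, and comes out shallow on either depth measure.

```python
# Sketch of the Bernoulli-Turing point: with access to a random source, a
# disordered n-bit string needs only a constant-length program ('emit n
# random bits'). The use of os.urandom as the 'indeterministic hardware'
# is purely illustrative.
import os

def bernoulli_program(n_bits):
    """A constant-size routine: its length does not grow with its output."""
    n_bytes = (n_bits + 7) // 8
    raw = int.from_bytes(os.urandom(n_bytes), 'big')
    return format(raw, '0{}b'.format(n_bytes * 8))[:n_bits]

print(bernoulli_program(64))    # a fresh disordered string on every run
print(bernoulli_program(1024))  # longer output, same tiny program
```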

However, a significant advantage of Q-logical depth is the ability to consider complexity across worlds in the Everett interpretation of quantum mechanics. The Everett interpretation is widely considered to be experimentally indistinguishable from other interpretations of quantum mechanics (Deutsch's own 1985a objection notwithstanding), and ordinarily I would not advocate reading too much into ways of thinking the interpretation seems to encourage. Yet in this case, I believe it is a useful tool for getting a handle on the kind of complexity we are discussing. In particular, we can interpret Q-logical depth as containing information about all universes (i.e., all states in the quantum linear superposition of the Universal Quantum Computer as it generates the state in question) simultaneously (this means, incidentally, that the Q-logical depth is not an observable).26 The Q-logical depth indicates high complexity only for objects which are present in all universes. As Deutsch puts it,

'Observationally complex states that are different in different universes are not truly deep but just random. Since the Q-logical depth is a property of the quantum state (vector), a quantum subsystem need not necessarily have a well defined Q-logical depth (though often it will to a good degree of approximation). This is…to be expected since the knowledge in a system may reside entirely in its correlations with other systems.' [emphasis original] (Deutsch 1985a, pp. 114-115)

For the moment, we are primarily concerned with the first sentence; I repeat the rest of the quotation because it complements the discussion (Mulhauser 1994 forthcoming) of environmental correlations in complex quantum systems. While this is a rather speculative point, we might consider a sort of 'staying power' of complexity and note a parallel between high Q-logical depth and some measure of structural stability which reaches down to the quantum level. That is, those objects with structural stability across universes (i.e., through possible linearly superposed states) are the most complex. This is another way of interpreting the difference between strings created by nondeterministic random noise and those deterministically created by chaotic dynamics: nondeterministic noise fluctuates without pattern across all universes, whereas chaotic patterns are more stable by virtue of their concealed order (i.e., the complexity which emerges even on Bennett's classical measure).

We are now in a position to evaluate whether Q-logical depth overcomes the problems which led us to seek an alternative to KCS complexity. Recall that one concern was that KCS does not discriminate between strings created by entirely disordered nondeterministic processes (with tossing a fair coin as the paradigm example) and those created by ordered deterministic chaotic processes. Of course if it is Smith's aim to equate chaos for practical purposes with noise, then it is not surprising that he should have chosen this definition of complexity! Our other concern was the arbitrary variation in the KCS measure according to whether a given string could be compressed just by giving the initial condition and governing equations of a chaotic system which generated it. It seemed counterintuitive that two strings should have different measures of complexity solely because one resulted from an initial condition specified to a greater number of places than the other.

As far as the first concern goes, the problem is solved by either logical depth or Q-logical depth. Both measures return a high complexity for strings which can be generated by succinctly specified ordered deterministic processes. Smith notes after his initial discussion of KCS complexity that deterministic chaos can in principle be distinguished from nondeterministic, disordered random noise by finding the right way to analyse it. Logical depth and Q-logical depth essentially incorporate this feature into the measure itself and offer us a better way of distinguishing chaos from 'mere noise'.

For the second concern, however, our excursion into the complexity measure shopping market has not been so fruitful. It is still the case that a sequence requiring an extremely precise description of the initial conditions of the dynamical system which created it could turn out as random as coin tosses on the logical depth measures. Thus there could still be arbitrary differences in the complexity of similar strings, where one could be generated from a very succinct initial condition and the other only from an initial condition specified with very many bits. (To reiterate the warning from above, this argument does not necessarily generalise over all possible compression algorithms; I am concerned just with the complexity of the string with reference to its own system's standard of compression in terms of initial condition specification.)

This second concern, then, appears to be a shortcoming of all the complexity measures we've tried, and it is likely to be a feature of all measures which don't stray too far from the common meaning of complexity. It would be helpful to have a measure which didn't care about the particulars of the starting conditions for a system, commenting only on the overall dynamics of the system itself (as opposed to individual strings produced by it), but I am unable to suggest one. It is possible that the line of argument I have suggested is actually irrelevant when the class of all possible compression methods is taken into consideration, but proving this would certainly be no trivial task. For the moment, we can be content with our success in exploring another measure of complexity and leave the more difficult problem for another day.


Let's turn now to the final conclusions Smith wants to draw from his application of the KCS measure of complexity to the output of chaotic systems. Smith notes, as I have said before, that it is possible in principle to distinguish deterministic chaos from nondeterministic random noise. But, he says, the randomness (by the KCS definition) we observe in chaotic systems is the kind that 'comes for free' as a result of the system's nonlinearity extracting detail from the real number specifications of the system's initial conditions. Yet this kind of detail, he argued previously, is incompatible with the kinds of abstractions to which we typically apply chaotic models. Thus physical phenomena shouldn't really be modelled with the infinitely detailed real number system. Instead, we need noisy models of limited precision which reflect Smith's belief that all physical systems are subject to low-level random noise (random on the KCS measure, we might wonder, or strictly nondeterministic noise?). Thus, he cheerfully finishes, what really matters is the qualitative behaviour of the limited precision models we run on digital computers, subject as they are to continual noise in the form of roundoff errors in limited precision computations.
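
For what it is worth, the 'noise' of limited precision is easy to exhibit. The sketch below (my own, using the logistic map rather than anything in Smith's text) iterates the same map from the same starting point, once in ordinary double precision and once rounded to single precision after every step; roundoff alone drives the two trajectories apart.

```python
# Roundoff as continual low-level perturbation: iterate the logistic map
# x -> r*x*(1-x) in double precision and in emulated single precision.
# The map, parameter r and step count are my own choices for illustration;
# struct is used only to round to IEEE single precision.
import struct

def round_f32(x):
    """Round a Python float to the nearest IEEE single-precision value."""
    return struct.unpack('f', struct.pack('f', x))[0]

r, x0, steps = 3.9, 0.2, 60
x_single = round_f32(x0)
x_double = x0

for _ in range(steps):
    x_single = round_f32(r * x_single * (1.0 - x_single))
    x_double = r * x_double * (1.0 - x_double)

print(f'after {steps} steps: single precision {x_single:.6f}, double precision {x_double:.6f}')
# The discrepancy grows roughly exponentially (at the rate set by the map's
# Lyapunov exponent), which is why limited-precision arithmetic behaves like
# a continual low-level noise source in chaotic models.
```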

The first observation we can make is that there is something curious about the assertion that all physical systems are subject to continual low-level random noise. If Smith means noise that is random on the KCS definition, then this could be either deterministically produced chaotic 'noise', or truly nondeterministic, altogether disordered noise. If it is the former, then Smith cannot appeal to this point as an argument against modelling in the real number system, nor can he appeal to it as an argument for the kind of 'noise' created by roundoff errors unless he is prepared to argue that roundoff errors are chaotic. In fairness, he does not make any direct appeal to this first argument against real numbers, although it would be easy to interpret Smith's final comments as mutually supporting assertions painting a particular picture of reality rather than as a well structured linear argument. If, on the other hand, Smith means noise that is random and nondeterministic, then he is taking a strong ontological position for which he offers no argumentative support, namely that there exists nondeterministic noise in the world which actively influences every possible real physical system.

Of course, as physicists or engineers, we find it useful to model systems with environmental noise rather than trying to track the evolution of every single particle which could have an influence on the system being modelled. But as philosophers, we are interested in how a system 'really' works, in the spirit of our earlier comments on realism. It is fine to assert that what really matters for grasping the qualitative behaviour of physical systems is the kind of simulation we can run on a digital computer. But it is philosophically unsatisfactory just to assume that truly nondeterministic noise exists everywhere and influences every conceivable physical system. This is particularly true considering the availability of coherent, entirely deterministic interpretations of quantum mechanics (Bohm 1952). The perturbations of quantum vacuum theory (see Podolny 1986 for a charming introduction and romantic history of science in the former Soviet Union; also Puthoff 1990, 1989) may be the best candidate for a real nondeterministic noisy background of the sort Smith may be proposing, but as far as I know there is no a priori reason why it, too, cannot be subsumed under a comprehensive deterministic but nonlocal interpretation of quantum physics. If truly nondeterministic noise does not necessarily come from quantum mechanics, then where does it come from?

There is an interesting parallel here with the discussion of interactive decoherence in complex quantum systems (Mulhauser forthcoming). Until they decided to analyse quantum systems at a more detailed level of complexity, physicists were saddled with the idea that consciousness was essential to quantum measurement. It was only when more and more of the dimensions actually relevant to a quantum system were considered that this problem disappeared. Likewise, it may be that when quasi-classical systems are modelled in greater detail, 'random noise' reveals itself as entirely deterministic chaotic influences. Although this point is clear within the logical depth framework, it makes little sense if we limit ourselves to KCS complexity: it is possible to have an entirely deterministic model of a physical system, or to posit the existence of an all-pervading nondeterministic noise, and KCS complexity makes it impossible to distinguish between the two.

Another observation about Smith's closing comments is that they preclude the kind of theoretical distinction offered by Stephen Wolfram (1985) between homoplectic and autoplectic processes. Homoplectic processes, Wolfram suggests, result from those dynamical systems which generate macroscopic (KCS) random behaviour by magnifying the significance of environmental noise (which, on our interpretation, is really the influence of chaotic dynamical systems otherwise external to the system being modelled). The admittedly speculative brand of autoplectic processes, on the other hand, would generate the same macroscopically pseudorandom behaviour (i.e., behaviour which is KCS random but complex on the logical depth measures) independently of the presence of noise. Such a robust autoplectic system could generate logical depth and maintain internal evidence of a long history, whereas homoplectic processes would remain comparatively shallow because of the randomising influence of noise.
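
A toy instance of the autoplectic idea can be given in a few lines (the choice of a one-dimensional cellular automaton, and of rule 30 in particular, is mine, made only because it is a standard simple example of intrinsic randomness generation): run from a single occupied cell with no noise injected at any point, the centre column that emerges already looks disordered, even though the rule and the initial condition are tiny.

```python
# Hedged illustration of an 'autoplectic' flavour of process: an elementary
# cellular automaton (rule 30) evolved from a single occupied cell, with no
# external noise at all. The disorder in the centre column is generated
# intrinsically by the dynamics, not imported from an environment.

def rule30_centre_column(steps=64, width=2 * 64 + 1):
    cells = [0] * width
    cells[width // 2] = 1  # single occupied cell; nothing random is injected
    column = []
    for _ in range(steps):
        column.append(cells[width // 2])
        nxt = [0] * width
        for i in range(1, width - 1):
            left, centre, right = cells[i - 1], cells[i], cells[i + 1]
            nxt[i] = left ^ (centre | right)  # rule 30 update
        cells = nxt
    return ''.join(str(b) for b in column)

print(rule30_centre_column())  # a disordered-looking bit string from a tiny deterministic rule
```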