It is widely accepted that quantum mechanics opened up an entirely new field of logic, which we know as **quantum logic**. Hence, new rules and postulates arose for which the classical logical definitions hitherto invoked are inadequate.

For example: two statements p and q are **contrary** if they cannot both hold, i.e. if:

~ (p & q)

Two statements p, q are **contradictory** if they can neither both hold nor both fail, i.e. if:

[p -> (~q)] , [(~p) -> q]

and so on.
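These definitions can be checked mechanically. A minimal sketch in Python (the helper names `contrary` and `contradictory` are my own, purely for illustration), testing each relation over all truth-value assignments:

```python
from itertools import product

def contrary(p, q):
    # contrary: p and q cannot both hold, i.e. ~(p & q)
    return not (p and q)

def contradictory(p, q):
    # contradictory: neither both hold nor both fail,
    # i.e. [p -> ~q] and [~p -> q] -- exactly one of p, q is true
    return ((not p) or (not q)) and (p or q)

# Tabulate both relations over every truth-value assignment
for p, q in product([True, False], repeat=2):
    print(p, q, contrary(p, q), contradictory(p, q))
```

Note that contradictory statements are always contrary, but not conversely: p = "it is red", q = "it is green" are contrary yet can both fail.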

However, the foundations of quantum mechanics (validated by experiment) diverge from this. Quantum mechanics can be regarded as a non-classical probability calculus resting upon a non-classical propositional logic. More specifically, in quantum mechanics each probability-bearing proposition of the form *"the value of physical quantity A lies in the range B"* is represented by a projection operator on a Hilbert space **H**.

Von Neumann [1932] showed that each physical system can be associated with a (separable) Hilbert space **H**, the unit vectors of which correspond to possible physical states of the system. Each "observable" real-valued random quantity is represented by a self-adjoint operator **A** on **H**, the spectrum of which is the set of possible values of **A**. If **e** is a unit vector in the domain of A, representing a state, then the expected value of the observable represented by A in this state is given by the inner product ⟨e, Ae⟩. The observables represented by two operators A and B are commensurable if and only if A and B commute, i.e., AB = BA.

However, non-commutation is also possible, and indeed even expected; hence the emergence of the Heisenberg Uncertainty Principle, which is really a statement regarding non-commutativity. This can be expressed in quantum mechanics, using the momentum (p) and position (x) observables, via the commutator (the quantum analogue of the classical Poisson bracket):

**[x, p] = i h/ 2 pi**

where h is the Planck constant of action.

If two operators A, B commute, then one has:

[A, B] = (A*B - B*A) = 0

if not, then:

[A, B] = (A*B - B*A) ≠ 0

and we say A and B are 'non-commuting'.

(You may observe one aspect at any one time, but not the other).

In terms of Bohr's (Complementarity) Principle, the variables x (position) and p (momentum) are regarded as "mutually interfering observables".

This is why only one can be obtained to arbitrary precision, while precision in the other is lost.
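Non-commutation is easy to exhibit with finite matrices. A minimal sketch using the Pauli spin matrices (a standard textbook example, not tied to anything specific in this article):

```python
import numpy as np

# Pauli matrices: self-adjoint operators representing spin observables
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def commutator(A, B):
    # [A, B] = AB - BA
    return A @ B - B @ A

# [sx, sy] = 2i*sz, which is nonzero: sx and sy are non-commuting,
# so spin along x and spin along y are not commensurable
print(commutator(sx, sy))

# Any operator commutes with itself: [sx, sx] = 0
print(commutator(sx, sx))
```
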

The binary {0,1}-valued observables may be regarded as encoding propositions about properties of the state of the system. Thus a self-adjoint operator **P** with spectrum contained in the two-point set {0,1} must be a projection; i.e., **P**^2 = **P**. Such operators are in one-to-one correspondence with the closed subspaces of H. Indeed, if P is a projection, its range is closed, and any closed subspace is the range of a unique projection. If e is any unit vector, then ⟨e, Pe⟩ = ||Pe||^2 is the expected value of the corresponding observable in the state represented by e. Since the observable is {0,1}-valued, we can interpret this expected value as the probability that a measurement of the observable will produce the "affirmative" answer 1. In particular, the affirmative answer will have probability 1 if and only if **Pe** = **e**; that is, if e lies in the range of P.
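A small numerical sketch of the projection calculus just described (the vectors `u` and `e` are arbitrary choices, purely for illustration):

```python
import numpy as np

# Projection onto the line spanned by the unit vector u in R^2
u = np.array([1.0, 0.0])
P = np.outer(u, u)                # P = |u><u|

# P is a projection: P^2 = P
assert np.allclose(P @ P, P)

# A unit vector e representing a state
theta = 0.3
e = np.array([np.cos(theta), np.sin(theta)])

# <e, Pe> = ||Pe||^2 is the probability of the "affirmative" answer 1
prob = np.linalg.norm(P @ e) ** 2
print(prob)                       # cos^2(theta)

# Probability 1 occurs exactly when Pe = e, i.e. e lies in the range of P
assert np.allclose(P @ u, u)
```
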

What all this means is that the “universality” of a concept is a moot issue. It has no meaning or significance in the setting of quantum mechanics and quantum logic. Since a typical closed subspace (say representing a quantum ideation in the brain – for which we already know quantum mechanics applies, cf. Stapp, *Mind, Matter and Quantum Mechanics*, 1993, p. 42) has infinitely many complementary closed subspaces, this lattice is not distributive.
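The failure of distributivity in the lattice of closed subspaces can already be seen in two dimensions. A sketch using three distinct lines through the origin of R^2 (a toy stand-in for one-dimensional subspaces of a Hilbert space), comparing the dimensions of the two sides of the distributive law; the helper names are mine, not a standard API:

```python
import numpy as np

def dim(M):
    # dimension of the column span of M
    return np.linalg.matrix_rank(M)

def join(A, B):
    # lattice join: span of the union = column space of [A | B]
    return np.hstack([A, B])

def meet_dim(A, B):
    # dim(A ∧ B) = dim A + dim B - dim(A ∨ B), valid for subspaces
    return dim(A) + dim(B) - dim(join(A, B))

# Three distinct lines through the origin in R^2
X = np.array([[1.0], [0.0]])
Y = np.array([[0.0], [1.0]])
Z = np.array([[1.0], [1.0]])

# Left side of the distributive law: X ∧ (Y ∨ Z).
# Y ∨ Z is all of R^2, so the meet is X itself -- dimension 1.
lhs_dim = meet_dim(X, join(Y, Z))

# Right side: (X ∧ Y) ∨ (X ∧ Z).
# Both meets are the zero subspace, so their join is {0} -- dimension 0.
rhs_dim = meet_dim(X, Y) + meet_dim(X, Z)

print(lhs_dim, rhs_dim)  # 1 0 -- the distributive law fails
```
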

What are we to make of this? Mainly that the empirical success of quantum mechanics calls for a revolution in logic itself. This view is associated with the demand for a realistic interpretation of quantum mechanics. Now, since philosophy has not progressed to a non-distributive, non-classical form – it therefore can have squat to say about ultimate reality. It is in effect a non-player. Or perhaps more accurately, an ersatz player.

To be more specific, the formal apparatus of quantum mechanics reduces to a generalization of classical probability in which the role played by a Boolean algebra of events in the latter is taken over by the "quantum logic" of projection operators on a Hilbert space. The usual statistical interpretation of quantum mechanics demands we take this generalized quantum probability theory quite literally -- that is, not as merely a formal analogue of its classical counterpart, but as a genuine doctrine of chances.

Let me give an example of how classical logic breaks down. Take the case of a single electron fired at a two-slit screen and ending up on the other side - impinging on a 2nd screen. (Figure 1 - at top of article)

Prior to reaching the screen the electron exists in a superposition of states or "wave packets". Is this description statistical, or individual? This depends. The wave function has a clear statistical meaning when applied to a vast number of electrons. But it can also describe a single electron as well. In the example just cited, all the energy states refer to the same electron. However, if all electrons are identical, the statistical and individual descriptions coincide.

Without making this post too long: it is found in numerous trials (in > 90% of cases) that the electron goes through both slits of screen (1) at the same time to reach the final screen. Thus, designating the slits in the intervening screen as A and B respectively, and the final screen as C, one is not able to say:

**If A then C**

OR

**If B then C**

But rather BOTH A and B, then C

A decidedly non-Boolean result.
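The non-Boolean character shows up quantitatively: "either A or C, or B then C" means adding probabilities, while "both A and B, then C" means adding amplitudes and then squaring. A toy sketch with hypothetical plane-wave amplitudes (the functional forms are illustrative, not a model of any real apparatus):

```python
import numpy as np

# Points x on the detection screen C
x = np.linspace(-5.0, 5.0, 1001)

# Hypothetical amplitudes for the two paths (illustrative plane waves)
psi_A = np.exp(1j * 2.0 * x)     # amplitude for the path through slit A
psi_B = np.exp(-1j * 2.0 * x)    # amplitude for the path through slit B

# Classical "either A or B" reasoning: add the probabilities
p_either = np.abs(psi_A) ** 2 + np.abs(psi_B) ** 2   # flat, = 2 everywhere

# Quantum "both A and B" superposition: add amplitudes, then square
p_both = np.abs(psi_A + psi_B) ** 2                  # fringes: 2 + 2 cos(4x)

print(p_either.min(), p_either.max())   # no structure
print(p_both.min(), p_both.max())       # interference pattern
```
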

The point emphasized here is that this deviation means that in specific spheres (mainly in science, specifically in modern physics) conventional logic and thinking are of little or no use. As you can see, it breaks down in this example. A number of researchers and authors, for example Hilary Putnam, have argued that the distributive law of classical logic is not universally valid. (Putnam, H.: '*Is Logic Empirical?*', in Boston Studies in the Philosophy of Science 5, Dordrecht-Holland, 1968.)

A more complex application might be to the premise or claim that "no primary cause can be physical". The trouble is that this assumption is based on a classical system of logic that is binary and uses binary {1,0} or (yes, no) operators. Thus a careless person - perhaps attempting to execute a "proof" of a creator - will assert that all physical entities **must be caused**, and will make the classical error of applying this to the cosmos' origin.

But what if instead of classical mechanics and its deterministic provisions, quantum mechanics is incorporated, say at the level of quantum gravity? Can the proposition still hold? T. Padmanabhan, in his '*Universe Before Planck Time – A Quantum Gravity Model*', in **Physical Review D** (Vol. 28, No. 4, p. 756), showed that it is irrelevant. As irrelevant as regarding the electrons in Fig. 1 as hard tiny "marbles".

Without going into all the complex mathematics entailed, Padmanabhan employed integrals related to the “action” (J) as a function of time. He proceeded by solving for the expansion factor S(t) using two separate energy equations, one of which (2.15 in his paper) bears a remarkable resemblance to the basic quantum wave potential equation. Moreover, his potential energy term is remarkably similar to that for the quantum harmonic oscillator - with some critical differences (substituting for the frequency f, and thence the angular frequency omega = 2 pi f, a conformal quantum variable, alpha).

The most masterful section in his paper is III, '*Geometry of the Quantum Universe*', wherein spacetime itself is taken to be in a particular quantum state U(q, t). He then assumes “stationary states” (given by the variable Q) for the early universe that are independent of time and for which all the dynamics “are contained in S(t)” (ibid., p. 28). The form of the expression for his “energy content” of the universe, **E**(t), also bears a remarkable structural similarity to the equation given earlier for total energy in an expanding cosmos (except that other variables, such as 'Q', appear).

The conformal factor Padmanabhan uses is alpha, which is a purely quantum mechanical parameter, defined from his equation (2.24):

alpha = S^6 (t) omega^2 (t)

thereby fixing the state of the universe to be compatible with a harmonic oscillator of frequency alpha (which we know has solutions in terms of Hermite polynomials H_n(q)). To make a long story short, and leave out lots of formalism, he then considers a series of different solutions for the respective energy equations, including for an “open”, “flat” and “closed” model cosmos. It is found that all the spacetimes are non-singular (i.e., they don't have an associated singularity or infinity) and start with some minimum value of the expansion factor. “Classical” (non-quantum) limits are achieved by setting alpha = 0, thus S(t) = 0.
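For reference, the Hermite polynomials H_n(q) mentioned above can be evaluated directly with NumPy's Hermite module (physicists' convention); a quick sketch, with q = 1.5 as an arbitrary sample point:

```python
import numpy as np
from numpy.polynomial.hermite import hermval

def H(n, q):
    # Evaluate the physicists' Hermite polynomial H_n at q:
    # hermval sums c[i] * H_i(q), so a unit coefficient at index n selects H_n
    return hermval(q, [0.0] * n + [1.0])

# First few: H_0 = 1, H_1 = 2q, H_2 = 4q^2 - 2, H_3 = 8q^3 - 12q
q = 1.5
for n in range(4):
    print(n, H(n, q))
```
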

Physically, it is found that the conformal factor (alpha) contributes a **negative energy density**. Negative energy density may be cast in a relatively simple form of an equation of state, viz.:

**w = (Pressure/ energy density) = -1**

This is consistent with Einstein's general theory of relativity - which one could say approaches the status of a 'basic law of physics'. In this case, the existence of a negative pressure is consistent with general relativity's allowance for a "repulsive gravity" - since any negative pressure has associated with it gravity that repels rather than attracts. (See, e.g., '*Supernovae, Dark Energy and the Accelerating Universe*', by Saul Perlmutter, in Physics Today, April 2003, p. 53.) Of course, simple algebra applied to the above also shows that the energy density would have to be negative, e.g. energy density = - (pressure).
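That "simple algebra" is a single step: with w defined as the ratio of pressure to energy density,

```latex
w = \frac{P}{\rho} = -1
\quad\Longrightarrow\quad
\rho = -P ,
```

so the pressure and the energy density must have opposite signs.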

The point is, the metric and treatment is feasible since non-trivial and matterless solutions exist. Thus, the cosmos can be incepted and proceed to expand because of the negative-energy density of the conformal factor.

It is also this basis that provides the dynamic scaffolding for the instantaneous formation of the universe by a possible quantum fluctuation that arises when a particular threshold is crossed near alpha = 0 (from the quantum to the classical domain). As Padmanabhan shows in his paper, such a cosmos from “nothing” is perfectly justified in the context of the model, and follows from the basis of the equations, the light cone, scale factor restrictions and so on.

This means the limits at spontaneous inception must definitely be those of acausal determinism and quantum cosmology, NOT classical physics – including classical causality.

What Padmanabhan has accomplished, in a way, is to use quantum logic under very specific conditions (defined by his conformal variable, alpha), subject to rigorous mathematical analysis, to establish that a physical entity (the cosmos) can arise "uncaused" in the normative, classical logical sense. By "uncaused", of course, I mean in (roughly) the same context as pair production incepted by an energy variation, subject to the energy-time uncertainty relation.

Thus, given a primordial vacuum state, with conformal quantum variable alpha, one can have an "explosion" arising from the negative energy density and inception of the universe.

Is it possible or feasible to prove Padmanabhan wrong in his context, and in a general way show quantum logic is subsumed by classical? (And hence that a spontaneously incepted cosmos is nonsense?) Sure, just crank your brain up and demonstrate for us that any non-Boolean, non-distributive, orthocomplemented lattice can be transformed into a Boolean (e.g. binary {1,0}) and distributive one.

Good luck on that, you'll need it!

## 1 comment:

You may know that Hans Reichenbach hypothesised a 3-valued logic of true, false and indeterminate. His 1944 book details how this logic suppresses "causal anomalies" such as complementarity and the EPR paradox. Reichenbach never had a foundation for his logic. Now, without prior knowledge of Reichenbach, I have independently found logic within the formalism of Quantum Mechanics that provides a foundation for Reichenbach. It derives from the logical juxtaposition between the Field Axioms and the normalisation of orthogonal vectors.

Steve Faulkner

email: steviefaulkner@googlemail.com

Blog: The Foundation of The Quantum Logic at http://steviefaulkner.wordpress.com/
