In a recent paper ('Causality in Quantum Mechanics', Physics Letters A), David T. Pegg attempts, like many others before him (e.g. Einstein, Podolsky and Rosen), to impose a faux "causality" on quantum mechanics without realizing what he is doing.
The claim is that it is "shown explicitly how the causal arrow of time that follows from quantum mechanics has already been inserted at a deeper level by the choice
of normalisation conditions". However, there is no "causal arrow of time" that follows from QM, nor does one appear with any assignment of normalization conditions.
The major disagreement I have is with Pegg's claim that:
"Causality results in the preparation device outcome being represented by a
positive operator with unit trace and the measurement device outcome being represented by a positive operator that is an element of a set whose elements sum to the unit operator"
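For readers unfamiliar with that formalism, here is a minimal numpy illustration (my own construction, not from Pegg's paper) of the two objects in the quoted sentence: a preparation represented by a positive operator with unit trace (a density matrix) and a measurement represented by a set of positive operators that sum to the unit operator (a POVM). The specific matrices are arbitrary choices.

    # A preparation: positive operator with unit trace (a density matrix).
    # A measurement: positive operators (POVM elements) that sum to the identity.
    import numpy as np

    rho = np.array([[0.6, 0.2],
                    [0.2, 0.4]])                 # Hermitian, positive, trace 1
    E0 = np.array([[0.7, 0.1],
                   [0.1, 0.3]])                  # a positive operator
    E1 = np.eye(2) - E0                          # completes the set: E0 + E1 = I

    print(np.trace(rho), np.linalg.eigvalsh(rho))    # 1.0, nonnegative eigenvalues
    print(np.allclose(E0 + E1, np.eye(2)))           # True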
However, consider the vacuum state for the spontaneous inception of the cosmos, and the conditions that bear on it.
First, the entropy of the universe is now defined in terms of a "holographic principle" that brings in the Planck length and quantum fluctuations. It is defined (Lawrence B. Crowell, 'Quantum Fluctuations of Spacetime', p. 125) as:
k_B [(K L_p^2/ 3)^-1 + (ln 2)/2]
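To give a feel for the size of this quantity, here is a rough numerical sketch of my own. The inputs are assumptions on my part: K is taken as the present-day cosmological constant (~1.1 x 10^-52 m^-2) and L_p as the Planck length; Crowell's symbols may be normalized differently.

    # Rough numerical sketch of the holographic entropy expression quoted above:
    #   S = k_B * [ (K * L_p**2 / 3)**-1 + ln(2)/2 ]
    # Assumed inputs (not Crowell's): K ~ present cosmological constant, L_p = Planck length.
    import math

    k_B = 1.380649e-23          # Boltzmann constant, J/K
    L_p = 1.616255e-35          # Planck length, m
    K   = 1.1e-52               # cosmological constant, m^-2 (assumed value)

    S = k_B * ((K * L_p**2 / 3)**-1 + math.log(2) / 2)
    print(f"S ~ {S:.3e} J/K")   # dominated by the (K L_p^2/3)^-1 term; ~1e99 J/K for these inputs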
Given this, and the condition of zero net mass energy, the spontaneous and acausal inception of the cosmos is actually the simplest formulation for its origin. Even the smallest fluctuation in the vacuum whereby
delta E ~ h/ delta t
leads to an instantaneous local deviation in mass-energy and the explosive origin of a cosmic expansion predicated on negative pressure. (See previous blog entry). As noted by Crowell (op. cit., p. 134):
"A net zero cosmology is the most economical one that can emerge from the vacuum state".
Even more important (ibid.):
"Since K is a measure of the number of Q-bits ....a cosmology with N x Q-bits will exhibit Poisson statistics."
As anyone who has used Poisson statistics knows, these diverge from the semi-classical probability equations employed by Pegg (Eqns. 1-10).
Returning to the trace, let the event horizon of the vacuum bubble be defined by
r S(g) = 2 ct
where r ~ L_p, the Planck length (L_p = {Gh/2 pi c^3}^1/2), and S(g) is the action. Then we will have the cosmological constant applicable to de Sitter space:
K = (n - 1)(n - 2)/ [2 q^2]
where q is a scale factor and n denotes the dimension (4) of the volume under consideration.
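A quick sanity check of this expression for n = 4 (a trivial sketch of my own; q = 1 is an arbitrary choice):

    # K = (n - 1)(n - 2) / (2 q^2); for n = 4 this reduces to K = 3 / q^2,
    # the familiar de Sitter form of the cosmological constant in terms of the scale q.
    def de_sitter_K(n: int, q: float) -> float:
        return (n - 1) * (n - 2) / (2 * q**2)

    print(de_sitter_K(4, 1.0))   # -> 3.0, i.e. K = 3/q^2 for q = 1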
Now, for S(g) ~ t^1/2, R (the scalar curvature of de Sitter space) = 0, so S(g) = 0.
However, the above happens because the stress-energy tensor (T_ik) has trace = 0 in the early universe. The 'trace' is the sum of the diagonal elements of a tensor, e.g.
Tr(M) = 0
where M =
[1 0 0 0]
[0 -1 0 0]
[0 0 1 0 ]
[0 0 0 -1 ]
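The same bookkeeping in code, a trivial numpy check of my own:

    # Trace = sum of diagonal elements; for the matrix M above the entries cancel pairwise.
    import numpy as np

    M = np.diag([1, -1, 1, -1])
    print(np.trace(M))   # -> 0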
Now, if the "preparation device outcome" is none other than the net zero mass vacuum state on quantum fluctuation, such that dx ~ dL/ L (where L is the length scale of the volume, viz. dL >= (L L_p^2)^1/2), then Tr(M) will apply to good approximation, and the causality assumption is finito. The question is WHY? A clue is provided by Seth Lloyd in his monograph 'Programming the Universe' (p. 118):
"What's going on is that quantum mechanics, unlike classical mechanics, can create information out of nothing"
Thus, what has transpired is that the holographic fluctuation has not only incepted an initial mass-energy dE ~ h/ dt (equivalently a mass ~ dE/c^2) but also information! The information inception occurred acausally precisely because the Hilbert space states are different for a vacuum fluctuation than for ordinary QM.
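To attach a number to the length-fluctuation bound dL >= (L L_p^2)^1/2 quoted above, here is a two-line evaluation of my own, with L taken to be roughly the Hubble radius purely as an assumption:

    # dL >= (L * L_p**2)**0.5, evaluated for L ~ Hubble radius (assumed choice of L).
    L_p = 1.616255e-35          # Planck length, m
    L   = 1.3e26                # ~ Hubble radius, m (assumed)

    dL = (L * L_p**2) ** 0.5
    print(f"dL ~ {dL:.2e} m")   # ~2e-22 m for these inputs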
To fix ideas: while the expectation value [E(Q,A)] of an observable represented by a bounded operator A on a separable Hilbert space H is given by
[E(Q,A)] = tr(QA)
where Q is a statistical operator, this presumes the ensemble representation is deterministic or 'causal', i.e. that for all g in G:
f_A(g) in S(A)
where S(A) is the spectrum of A with respect to the algebra L(H). But this condition does not apply in the case of quantum fluctuations on the scale of L_p.
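For concreteness, here is a minimal numerical sketch of my own (not Pegg's) of the expectation-value-as-trace rule for a single qubit, with Q a positive unit-trace statistical operator and A a bounded Hermitian observable:

    # Expectation value as a trace: E(Q, A) = tr(Q A),
    # with Q a density (statistical) operator and A a bounded Hermitian observable.
    import numpy as np

    Q = np.array([[0.75, 0.0],
                  [0.0, 0.25]])             # positive operator with unit trace
    A = np.array([[1.0, 0.0],
                  [0.0, -1.0]])             # e.g. a Pauli-Z-like observable

    expectation = float(np.trace(Q @ A))
    print(expectation)                      # -> 0.5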
For this reason other avenues are being pursued - for example, using Riemann's Zeta function Z(E) in terms of divergent, infinite Dirichlet series that can be transformed into finite sums such as:
Z(1/2 + iE) ~ 2 exp{-i pi N(E)} SIGMA_(m=1 to (E/2 pi)^1/2) m^-1/2 cos(E ln m - pi N(E))
in conjunction with Gutzwiller's trace equation, and a more germane representation for the trace:
Tr G(E) ~ Tr G_o(E) ~ d/dE {ln Z(E)}
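To show what such a finite sum looks like numerically, here is a sketch of my own. Treating pi N(E) as approximately the Riemann-Siegel theta function is an assumption about the intended counting function N(E), and the truncated main sum ignores all correction terms:

    # Truncated Riemann-Siegel-type sum for Z(1/2 + iE), as displayed above:
    #   Z ~ 2 * exp(-i*pi*N(E)) * sum_{m=1}^{sqrt(E/2pi)} m^-1/2 cos(E ln m - pi N(E))
    # pi*N(E) is approximated here by the Riemann-Siegel theta function (an assumption).
    import cmath, math

    def theta(E: float) -> float:
        # Leading asymptotic form of the Riemann-Siegel theta function.
        return E / 2 * math.log(E / (2 * math.pi)) - E / 2 - math.pi / 8

    def Z_approx(E: float) -> complex:
        piN = theta(E)
        M = int(math.sqrt(E / (2 * math.pi)))
        s = sum(m**-0.5 * math.cos(E * math.log(m) - piN) for m in range(1, M + 1))
        return 2 * cmath.exp(-1j * piN) * s

    print(abs(Z_approx(100.0)))   # rough value of the truncated sum at E = 100; not a precision evaluation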
This quantity, for example, will replace the Tr's in Pegg's equations (1)-(10) etc., and also take into account the Poisson statistical nature attached to the fluctuations, whereby:
delta N = [N]^1/2 and delta V ~ G[V]^1/2
where delta N is the fluctuation in the number of Q-bits arising, and delta V the corresponding fluctuation in volume.
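A quick sanity check of the delta N = [N]^1/2 scaling, using numpy's Poisson sampler (a sketch of my own; the mean counts are arbitrary):

    # For a Poisson-distributed count with mean N, the standard deviation is sqrt(N),
    # so the relative fluctuation delta N / N falls off as 1/sqrt(N).
    import numpy as np

    rng = np.random.default_rng(0)
    for mean_N in (10, 1_000, 100_000):
        samples = rng.poisson(mean_N, size=100_000)
        print(mean_N, samples.std(), np.sqrt(mean_N))   # measured spread ~ sqrt(N)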
But omitting all the recondite probability formulations such as those Pegg presents, the most compelling argument for a self-incepted cosmos may well be evidence we already behold in the Type Ia supernova data. As Crowell observes (ibid.):
"The recent discovery that the universe is accelerating outward is the latest of important results, which indicates the universe could well be a net zero (mass-energy). This then indicates that the observed universe is the result of a fluctuation in the quantum gravity vacuum"
It is important to note here that the preceding does not mean that the universe spontaneously creates mass-energy within its normal space-time. So, it does obey the first law of thermodynamics, dU = -dW + dQ, where U, W and Q denote the internal energy, the work done by the system, and the heat absorbed by it. Assuming a closed cosmology, dE = 0. The main point here is that the universe absorbs work by virtue of the negative pressure (p = -rho) acting on it, as opposed to a positive pressure that does work. Thus, for consistency one needs:
dW = - p = rho (energy density) and dQ = rho_vac c^2.
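One way to make the sign of the work term concrete is the following sketch, which is my own bookkeeping under assumed conventions (the vacuum treated as a fluid of constant energy density rho_vac with p = -rho_vac in energy-density units), not Pegg's or Crowell's:

    # Minimal sign check of the negative-pressure argument (my own assumptions and numbers):
    # treat the vacuum as a fluid with constant energy density rho_vac and pressure p = -rho_vac.
    rho_vac = 6.0e-10          # vacuum energy density, J/m^3 (assumed, roughly the observed order)
    p = -rho_vac               # negative pressure
    V1, V2 = 1.0, 2.0          # an expanding volume, m^3 (arbitrary)

    dV = V2 - V1
    work_done_by_region = p * dV            # negative: the region ABSORBS work as it expands
    dU = rho_vac * V2 - rho_vac * V1        # internal energy grows at constant density
    print(work_done_by_region, dU)          # -6e-10 J, +6e-10 J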
Again, the negative energy assures us that the zero point energy of the quantum vacuum will "tunnel through" to create an unstoppable expansion on fluctuation.
This is the best evidence yet for an acausal cosmos and an acausal origin.
More on this, as well as dismantling Pegg's other claims, in the next blog entry.