Philosopher Julian Baggini, in his WSJ review of Sabine Hossenfelder's book Existential Physics, starts out by overthinking her work, as well as the applications of modern quantum mechanics. This is ironic and strange, given that in his own earlier (2012) book, 'The Ego Trick', he definitely comes across as an emergent materialist. In other words, we can't describe ourselves fully with only the vocabulary of physics; one needs to invoke psychological concepts too, because of the immaterial nature of consciousness.
He writes at the outset:
"Quantum physics has long been the go-to discipline for anyone in need of a pseudo-scientific justification for a quack theory. The mysterious, indeterminate nature of quantum causation is said to solve puzzles such as the foundations of consciousness and the possibility of free will, as well as vindicate dubious practices such as telekinesis and homeopathy."
But this is mostly nonsense. In fact, the comment could have been longtime skeptic gadfly Michael Shermer's in his Scientific American piece, 'Quantum Quackery' (January 2005, p. 4). Therein Shermer criticized the collaborative work of physicist Roger Penrose and anesthesiologist Stuart Hameroff. Their theory (some would say “conjecture”) is that within neurons are tiny structures called microtubules “which initiate a wave function collapse that results in the quantum coherence of atoms in the brain." For some reason Shermer thought this to be mind-boggling, as Baggini doubtless would now.
Shermer claimed that: “there is too large a gap between subatomic quantum systems and large scale macro systems” for any quantum processes to bridge. However, in my first book, The Atheist’s Handbook To Modern Materialism, I showed why this interpretation is inaccurate. In particular, as physicist Henry Stapp first noted, the synaptic cleft dimension (of about 200-300 nm[1]) is exactly the scale at which the Heisenberg Uncertainty Principle [2] would be expected to operate. That necessarily implies quantum mechanics! What is so $#&@! difficult to grasp?
The action, then, is centered on the brain's synapses and their dimensions, though not exclusively. Stapp, for his part, also pointedly noted (Mind, Matter and Quantum Mechanics) that uncertainty principle limitations apply to calcium ion capture near synapses. Stapp cited a reference model for which a calcium ion travels about 50 nm (i.e. 50 billionths of a meter) in about 200 μs (i.e. 200 millionths of a second) en route from channel exit to release site. He then elaborates by noting that:[3]
Uncertainty principle limitations on body-temperature ions diffusing this way shows the wave packet of the calcium ion must grow to a size many orders of magnitude larger than the calcium ion itself.
Hence, the idea of a single classical trajectory becomes inappropriate.[4]
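To get a feel for why the packet balloons like that, here is a rough back-of-the-envelope check in Python. It is only a sketch under assumed values: the calcium-ion mass is the standard ~40 atomic mass units, but the initial packet width (taken as roughly the ion's own diameter) and the use of the simple free-particle spreading relation Δx(t) ≈ ħt/(mΔx₀) are my illustrative assumptions, not figures Stapp gives.

# Rough estimate of quantum wave-packet spreading for a Ca2+ ion
# crossing the synaptic region (assumed round-number inputs).
hbar = 1.055e-34      # reduced Planck constant, J*s
m_ca = 6.64e-26       # calcium ion mass, kg (~40 atomic mass units)
dx0  = 2e-10          # assumed initial packet width ~ ionic diameter, m
t    = 200e-6         # transit time from the text, s (200 microseconds)
# Free-particle spreading: when the growth term dominates, the packet
# width goes roughly as dx(t) ~ hbar * t / (m * dx0).
dx_t = hbar * t / (m_ca * dx0)
print(f"Initial width : {dx0:.1e} m")
print(f"Width after t : {dx_t:.1e} m")      # about 1.6e-3 m, i.e. millimetres
print(f"Growth factor : {dx_t / dx0:.1e}")  # millions of times the ion's size

Even with such crude inputs the packet ends up millimetres across, millions of times larger than the ion itself, which is the 'many orders of magnitude' Stapp is pointing to.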
It's critical to point out that the ion's location can't be computed from classical Newtonian mechanics but rather is based on the probability density, viz.
P_ab = ∫_a^b |ψ(Ca²⁺)|² dx
Where the vertical bars denote what's called an absolute value. So the location is given by what we call the "expectation value" of where it most probably is, or:
⟨x⟩ = ∫_{−∞}^{+∞} x |ψ(Ca²⁺)|² dx
The collective state over the synaptic sites A, B, …, E is then the sum over all possible configurations:

ψ(n ∈ A, B, …, E) = Σ_{ijklm} { ψ(Ca²⁺)[n_i(A), n_j(B), …, n_m(E)] }
wherein all possible states are taken into account. The total of these, taken in concert, enables a quantum-computer modality to be adopted for a von Neumann-style consciousness. In quantum neural networks it is accepted that the real-world brain generation of consciousness is more along the lines of a quantum computer-transducer than a simple collection of switches. As S. Auyang has observed, consciousness is "more than a binary relation between a Cartesian subject and object."[5]
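As a concrete, purely illustrative sketch of the probability density P_ab and expectation value ⟨x⟩ written above, the short Python script below assumes a one-dimensional Gaussian wave packet for the Ca²⁺ ion and evaluates both quantities numerically; the 25 nm packet width and the 50-150 nm interval are invented for illustration and are not Stapp's numbers.

import numpy as np
# Assumed Gaussian wave packet for the Ca2+ ion, centred at x0 with width sigma.
x0, sigma = 100e-9, 25e-9                       # centre and width, metres (illustrative)
x = np.linspace(0.0, 300e-9, 20001)             # grid spanning a 300 nm region
psi = np.exp(-(x - x0)**2 / (4 * sigma**2))     # unnormalized Gaussian amplitude
density = np.abs(psi)**2                        # |psi|^2, the probability density
density /= np.trapz(density, x)                 # normalize so the total probability is 1
a, b = 50e-9, 150e-9                            # arbitrary interval [a, b]
mask = (x >= a) & (x <= b)
P_ab = np.trapz(density[mask], x[mask])         # probability of finding the ion in [a, b]
x_mean = np.trapz(x * density, x)               # expectation value <x>
print(f"P_ab over [50 nm, 150 nm] = {P_ab:.3f}")
print(f"<x> = {x_mean * 1e9:.1f} nm")

The point is simply that quantum mechanics hands back probabilities and expectation values for the ion's whereabouts, not a single classical trajectory.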
Stapp carries this deficiency of the Cartesian-reductionist dichotomy further, exposing it as a prime defect of classical mechanics when it comes to accounting for consciousness[6]:
That classical mechanics is not capable of integrating consciousness into science is manifest. Classical physics is an expression of Descartes' idea that nature is divided into two logically unrelated and non-interacting parts: mind and matter. However, the integration of consciousness into science requires instead, a logical framework in which these two aspects of nature are linked in ways that account for both the observed influence of brain processes on mental processes and the apparent influence of mental processes on brain processes.
Further, the incorporation of quantum mechanics into brain function enables a much more effective basis for processing, namely quantum computing[6]. That means that rather than limiting storage to bits, one can work with qubits (short for quantum bits), where the superposition of a combined data element (1 + 0) applies, i.e.:
ψ = ψ(1) + ψ(0)
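For readers who want to see that superposition written out, here is a minimal sketch in Python using a bare two-component state vector (no quantum-computing library assumed); the 1/√2 factor is the standard normalization convention rather than anything stated above.

import numpy as np
# Basis states |0> and |1> as simple vectors
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
# Equal superposition psi = (|1> + |0>) / sqrt(2), normalized so the probabilities sum to 1
psi = (ket1 + ket0) / np.sqrt(2)
# Born-rule probabilities of reading out 0 or 1
p0 = abs(np.dot(ket0, psi))**2
p1 = abs(np.dot(ket1, psi))**2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")   # 0.50 each
print(f"Norm = {np.dot(psi, psi):.2f}")      # 1.00

A classical bit has to be either 0 or 1; the qubit carries both amplitudes at once, which is the storage advantage being alluded to.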
Shermer next claimed that Victor Stenger, in his book The Unconscious Quantum, showed that in order to describe a system quantum mechanically, the product of its mass m, speed v, and displacement d must be on the order of Planck's constant, h (where h is 6.6 x 10⁻³⁴ J·s). But this is exactly the error Alan Chalmers exposed: making one's falsification criterion too strong. Chalmers observed:[7]
...if we make our falsificationist criteria too strong then many of our most admired theories within physics fail to qualify as good science while if we make them too weak few areas fail to qualify.
One example of too strong a criterion would be requiring that any even marginally falsified theory (e.g. Newtonian mechanics, for failing to accurately describe Mercury's perihelion advance) be rejected outright. But if that were accepted, all of Newtonian physics would be sacrificed merely because it cannot compete with Einsteinian general relativity at a given prescribed scale. Lost forever would be its powerful utility for all Earth-based applications, or even the basic (celestial) mechanics needed to get to the Moon or Mars!
Conversely, too weak a criterion would require that any non-falsifiable theory be accepted. For example, the theory of "cold fusion," as advanced in the experiments of Pons and Fleischmann, would have been accepted. Their experiments did reveal an enhancement in temperature, but it was found not to be significantly greater than their measured statistical uncertainty.
The problem, then, evidently lies in how particular criteria for testing are identified and articulated. Who actually sets or defines the thresholds of acceptance or rejection? To the extent that this is a subjective operation, those whose theories inevitably fall out of favor will complain about 'subjectivity' and 'bias'.
Shermer in his SciAM piece writes:
"If mvd is much greater than h then the system can probably be treated classically."
The problem is that if the dimension of the synapse itself is 200-300 nm, i.e. within Heisenberg Uncertainty Principle dimensions, the computation is meaningless, for the reason that Stenger would not be able to calculate accurate values of v and of x, or d (the position in relation to v), at the same time.
The form for the Heisenberg Uncertainty Principle, say in one dimension is:
Δx · Δp_x ≈ h
Where Δx is the uncertainty in position in the x-direction and Δp_x is the uncertainty in momentum. But what do we see with Stenger's "mvd" formula? Well, we see immediately that mv = p_x, the momentum in the x-direction, and Stenger's d variable is just x, the synaptic gap distance, with Δx the associated uncertainty.
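To put rough numbers on this, the Python sketch below treats the synaptic distance d ≈ 200 nm as the position uncertainty Δx and asks what momentum and velocity uncertainty the relation Δx·Δp ≈ h then forces on a calcium ion. The ion mass is a standard value and the drift speed comes from the 50 nm in 200 μs figure quoted earlier, so treat it as an order-of-magnitude illustration only.

# Order-of-magnitude look at the uncertainty trade-off behind Stenger's mvd criterion
h    = 6.6e-34         # Planck's constant, J*s
m_ca = 6.64e-26        # calcium ion mass, kg
dx   = 200e-9          # take d (~synaptic dimension) as the position uncertainty, m
dp = h / dx            # minimum momentum uncertainty from dx * dp ~ h
dv = dp / m_ca         # corresponding velocity uncertainty
v_drift = 50e-9 / 200e-6   # drift speed from the text: 50 nm in 200 microseconds
print(f"dp ~ {dp:.1e} kg m/s")
print(f"dv ~ {dv:.2e} m/s")
print(f"drift speed ~ {v_drift:.2e} m/s")
print(f"dv / drift speed ~ {dv / v_drift:.0f}")   # velocity uncertainty dwarfs the drift speed

On these numbers the velocity uncertainty is roughly two hundred times the ion's drift speed, so there is no single well-defined v to feed into mvd in the first place.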
The stark reality is that, as intent as Stenger is on proving there is no quantum threshold, his calculation is amiss, because under Heisenberg's Uncertainty Principle he can't obtain simultaneously accurate values of both p and x (or in this case, mv and d): if he estimates or knows one to perfection, he loses all information about the other. Many of the same criticisms can be leveled at Baggini in his WSJ review. He simply, perhaps for the sake of simplicity in communicating in a newspaper review, cuts corners and refuses to give equally competent physicists their due.
As for the author of the book he's reviewing, he has this to say (ibid.):
"The German theoretical physicist Sabine Hossenfelder understandably has no time for any such nonsense, policing the strict boundaries of science with the zeal of a North Korean border guard."
Ironic, given that's exactly how Baggini himself comes across in his dismissal of stochastic and non-orthodox quantum mechanics. He also writes:
"The most surprising and interesting feature of the book is the claim that many of her physicist peers are as guilty of bringing speculation and belief into their scientific thinking as theologians and New Age mystics. She argues that all current theories offered by physicists about what made the big bang possible are “pure speculation . . . modern creation myths written in the language of mathematics."
Which is an odd take as well for Hossenfelder, given that one of the foremost discoveries in physics, antimatter, came straight out of the mathematics. Physicist Paul Dirac recognized that the relativistic version of the Schrödinger wave equation predicted negative-energy electron states, i.e. anti-electrons (positrons). These denizens were then discovered, from that 'beautiful mathematics', in 1932. And no physicist with historical memory can forget or omit the power of celestial mechanics, which enables us to confidently calculate the positions of planets 50-100 years into the future.
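As a small illustration of that confidence, the Python sketch below propagates a planet along an idealized two-body Kepler orbit 50 and 100 years ahead by solving Kepler's equation with Newton's method. The orbital elements are rough textbook values for Mars and the model ignores planetary perturbations, so it is a toy version of what real ephemeris codes do, not their actual method.

import math
# Rough two-body orbital elements for Mars (approximate values, arbitrary epoch)
a_au = 1.5237          # semi-major axis, astronomical units
e    = 0.0934          # eccentricity
T_yr = 1.8808          # orbital period, years
M0   = 0.0             # mean anomaly at the reference epoch, radians
def position_after(years):
    """Heliocentric position in the orbital plane 'years' after the epoch."""
    M = (M0 + 2 * math.pi * years / T_yr) % (2 * math.pi)   # mean anomaly grows linearly
    E = M
    for _ in range(50):                                     # Newton's method for E - e*sin(E) = M
        E -= (E - e * math.sin(E) - M) / (1 - e * math.cos(E))
    x = a_au * (math.cos(E) - e)                            # position relative to the Sun, AU
    y = a_au * math.sqrt(1 - e**2) * math.sin(E)
    return x, y
for yrs in (50, 100):
    x, y = position_after(yrs)
    print(f"t = {yrs:3d} yr:  x = {x:+.3f} AU,  y = {y:+.3f} AU,  r = {math.hypot(x, y):.3f} AU")

Real ephemerides add planetary perturbations and relativistic corrections, but the deterministic core is exactly this kind of calculation.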
See Also:
https://www.youtube.com/watch?v=RT50FTICrxI
And:
Ed Kelly interviewed by John Cleese @ Science, Skeptics and the Study of Consciousness - YouTube
[1] 1 nm = 10⁻⁹ meter, or one billionth of a meter.
[2] This states that one cannot know both the position (x) and momentum (p) of an electron, for example, to arbitrary precision. If you know position exactly you know nothing of the other. In one dimension: Δp · Δx ≥ h.
[3] Stapp: Mind, Matter and Quantum Mechanics, 152.
[5] Auyang: How is Quantum Field Theory Possible?, 112.
[6] Stapp, op. cit., 132.
[7] Chalmers: Science and its Fabrication, 16.