Friday, January 17, 2025

Medical Industrial Complex Finally Coming To Its Senses on BMI Index? Maybe

 


Dr. Céline Gounder's appearance on CBS Mornings three days ago was long overdue in terms of clarifying the limits of the body-mass index (BMI). She admitted that, as we've long known, it can give skewed results for athletes (like NFL players) who have massive, muscle-bound arms and legs. These guys often register as "obese" according to the BMI - which is nonsense.

Thus, she admitted that while BMI may be useful as a "first pass," the physician then needs to dig deeper, and that means going to other metrics. She listed waist size at the top: no more than 40 inches for a male and 38 inches for a female. She also noted the preference for a "pear-shaped" body over an "apple-shaped" one. It's thought that having a pear-shaped body - that is, carrying more of your weight around your hips and having a narrower waist - doesn't increase your risk of diabetes, heart disease and other complications of metabolic syndrome.

Back to the cockeyed BMI as an obesity measure. Ten years ago, in one of the Sunday magazines (Parade?), I beheld a brief article (thankfully): 'What's Your BMI and Why Should You Care?' In the lead paragraph 'the Doctors' wrote:

The BMI (body mass index) is a good indicator of how much body fat you have. Health professionals use it to screen for weight problems in adults.

 

They did add that "it doesn't paint a full picture of your health" and that's an understatement. As noted in a Penn & Teller 'Bullshit' episode at the time, lampooning BMI and the whole "obesity is an epidemic" baloney, both Michael Jordan and Brad Pitt would be "overweight," and Russell Crowe and George Clooney would be "obese" on the BMI index scale.

Apart from such whacked-out nonsense, as one Univ. of Virginia prof quoted in the segment observed:

"Another problem with the government using BMI is that it says everyone needs to be a certain weight within a certain height range in order to be healthy."

But this disdains the range of variations for most humans pertaining to a host of attributes. It mandates that only a certain human height-weight body profile is acceptable while labeling the outliers “unhealthy” or “obese” or “overweight”. Using this bogus index we’ve actually come to believe “one third of Americans are obese” – based on having a BMI of 30 or higher.

 

But this is nonsense!

 

As Penn & Teller observed, it was a Belgian polymath, Adolphe Quetelet, who devised the BMI formula in 1832 in his quest to define the "normal man" in terms of everything from his average arm strength to the age at which he marries. Obviously and clearly, his numerical basis would be irrelevant today, given the "normal man" of ca. 1830s Belgium would not be in any way comparable to the normal man today - especially in the US of A. His diet would be more frugal - less protein, for one thing, as well as fewer nutrients - and hence he'd naturally bear more of a resemblance to reed-thin Stan Laurel than to George Clooney or Russell Crowe.

 

So his project had nothing to do with obesity-related diseases, nor even with obesity itself. Rather, Quetelet used the formula to attempt to describe the standard proportions of the human build - the ratio of weight to height in the average adult - in that reduced-nutrition era. Using data collected from several hundred countrymen, he found that weight varied not in direct proportion to height (such that, say, people 10 percent taller than average were 10 percent heavier, too) but in proportion to the square of height. (People 10 percent taller than average tended to be about 21 percent heavier.)
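To see where that 21 percent comes from: if weight scales as the square of height, a person 10 percent taller has weight multiplied by (1.10)² = 1.21, i.e. about 21 percent heavier.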

The new formula had little impact among the medical community until long after Quetelet's death. While doctors had suspected the ill effects of obesity as far back as the 18th century, their evidence was purely anecdotal. The first large-scale studies of obesity and health were conducted in the early 20th century, when insurance companies began using comparisons of height and weight among their policyholders to show that "overweight" people died earlier than those of "ideal" weight. Subsequent actuarial and medical studies found that obese people were also more likely to get diabetes, hypertension, and heart disease. (Of course, this later allowed the medical insurers to either invoke "pre-existing conditions" to bar people from coverage or, more often, have an excuse to increase their premiums.)

By the early 1900s, it was fairly well-established that these ailments were the result of having too much adipose tissue - so the studies used functions of height and weight as little more than a proxy for determining how much excess body fat people had. The problem with proxies, of course, is that they are not direct quantifiers or indicators and are only as good as the physical basis really allows.

 

It would actually have been more accurate for the actuaries to compare longevity data with more direct assessments of body fat—such as caliper-measured skinfold thickness or hydrostatic weighing. But these data were much harder for them to obtain than standard information on height, weight, and sex.  So they punted!

Medical researchers, meanwhile, needed a standard measure of fatness so they could look at the health outcomes of varying degrees of obesity across an entire population. For decades doctors couldn't agree on the best formula for combining height and weight into a single number - some used weight divided by height; others used weight divided by height cubed.

That standard arrived in 1972, when physiology professor and obesity researcher Ancel Keys published his "Indices of Relative Weight and Obesity," a statistical study of more than 7,400 men in five countries. Keys examined which of the height-weight formulas matched up best with each subject's body-fat percentage, as measured more directly. He concluded that the best predictor was Quetelet's formula: weight divided by height squared. Keys renamed this number the body mass index.


But this was decidedly premature.


A critique (in PDF) of the body mass index in the journal Circulation suggests that BMI's imprecision and publicity-friendly cutoffs distort even the large epidemiological studies. (For example, there's no definitive count of how many people are misclassified by BMI, but several studies have suggested that the error rate is significant for people of certain ages and ethnicities. That old natural variation bugbear again!) It's impossible to know which studies have been affected and in what direction they might have been skewed.

Further, the BMI is actually a solid example of the “proofiness” that Charles Seife referenced in his book, Proofiness: How You're Being Fooled by the Numbers.


Seife decries the tactic of using numbers not just to lie but to baffle the susceptible with bullshit. He refers to a common failing of people unversed in math: being hoodwinked merely because some form of math or numbers is interjected into an argument - not just using numbers to bolster one's case. In his words, to use fake numbers to prove falsehoods, and to seek to prove something is true even when it's not, is one of the most egregious forms of intellectual fraud.

In this regard, one of the surest signs of proofiness is the failure to attach uncertainties to the measurements - any measurements! Since BMI is always recorded as an absolute single number, say 29, and never as 29 ± 2 or whatever, it is inherently proofy - a bogus quantity. Seife emphasizes there can NEVER be a 100 percent accurate number if it's based on physical measurements, and he's right. Maybe the scale used is off by a pound or two, and maybe the height isn't evaluated for the associated probable error - based on the instrument used to measure it. Or maybe, just maybe, the presumed cutoffs along the BMI chart indices have been majorly distorted by earlier misclassifications in large epidemiological studies.
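To make Seife's point concrete, here's a minimal Python sketch (my own illustration, not from Seife) that propagates plausible measurement uncertainties - assumed values of ±1 lb on the scale and ±0.5 inch on the height, not clinical standards - into the BMI itself via standard first-order error propagation:

def bmi_with_uncertainty(weight_lb, height_in, d_weight=1.0, d_height=0.5):
    """Return (BMI, dBMI) using first-order error propagation."""
    bmi = 703.0 * weight_lb / height_in**2
    # BMI = 703 w / h^2, so the relative error is sqrt((dw/w)^2 + (2 dh/h)^2)
    rel_err = ((d_weight / weight_lb)**2 + (2.0 * d_height / height_in)**2) ** 0.5
    return bmi, bmi * rel_err

bmi, dbmi = bmi_with_uncertainty(195.0, 68.0)   # e.g. 195 lb at 5 ft 8 in
print(f"BMI = {bmi:.1f} +/- {dbmi:.1f}")        # ~29.6 +/- 0.5, not a bare "29.6"

Even with those modest instrument errors, the BMI only deserves to be quoted to about half a unit - which matters when the "overweight" cutoff is a hard 25.0.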

The BMI also takes this to a new level because the combination of the two quantifiers makes no sense. I mean, the ratio of weight in pounds to height in inches squared? And then multiplying by 703? That's pure baloney and in no way even comparable to, say, obtaining metric mass by dividing the weight (in newtons) by the acceleration of gravity in N/kg.

Where does 703 come from anyway? Well, it's the correction factor introduced if one uses Imperial units (pounds, inches) instead of the metric system. In the metric system the BMI is simply:

BMI = mass (kg) / [height (m)]²
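For the record, here's a minimal Python sketch of both forms of the formula (illustrative numbers only), which also shows that the "magic" 703 is nothing but the unit-conversion factor (39.3701 in/m)² / (2.20462 lb/kg):

def bmi_metric(mass_kg, height_m):
    return mass_kg / height_m**2              # kg/m^2

def bmi_imperial(weight_lb, height_in):
    return 703.0 * weight_lb / height_in**2   # same number, converted units

print(bmi_metric(88.5, 1.727))     # ~29.7
print(bmi_imperial(195.0, 68.0))   # ~29.6 - same person, same index
print(39.3701**2 / 2.20462)        # ~703.07: the sole origin of the 703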

Again, this is bollocks, since the result (mass per unit area) yields no conceptually consistent physical quantity as applied to human biology! It's fully an example of more proofiness: in this case, putting two unrelated units together in a ratio and making people believe the result (in kg/m²) has some innate core physical meaning. It doesn't. It's bullshit. (As Penn and Teller also pointed out in their show on "Obesity.")

The medical-industrial-insurance-PhRMA whackos will try to tell you the ratio is valid because height and weight "are related," but this is a presumption unwarranted by the total constellation of data - especially applied to distinct ethnic groups. Also, if one investigates the fundamental units of physics that comprise it, s/he will find no such equivalent anywhere. (Which can also be deduced by using the basic SI units in various combinations.)

The closest one can come is the combination of units:

kg m⁻³, which is mass divided by length cubed, or M L⁻³

This yields what we call "density." And at least the use of density would make some physical sense, but the ratio for BMI makes zero sense, because no comparable physical quantity in terms of mass per unit area exists for human bodies. It makes no difference how many idiotic trials were used to attempt to validate it in the health sense.

I could as well take the Martian mass in kg and divide it by its assorted Earth opposition distances in meters to show that UFO sightings increase whenever the ratio approaches a certain value (say 10¹³ kg m⁻¹). It is pure nonsense, and any "findings" add up to little more than lucky coincidence.

The use of this dumb obesity quantifier is even more enraging given there's at least one more rational alternative. It turns out that the circumference around a person's waist provides a much more accurate reading of his or her abdominal fat and risk for disease than BMI. One unit, no hocus pocus. Simple. Besides, wrapping a tape measure around your belly is no more expensive than hopping on a scale and standing in front of a ruler. That's why the American Society for Nutrition, the American Diabetes Association, and other prominent medical groups have lately promoted waist circumference as a replacement for the body mass index. (Some have indicated it as a "supplement," but why waste time with proofy, contrived numbers at all?)

Alas, few doctors - including our own - have made the switch. This is probably because waist measurements require slightly more time and training to interpret than it takes to record a BMI reading off some fake-out chart, and they don't come with any "official cutoffs." (Right now, my BMI is 29.5, but I laugh when anyone says I am "overweight," for the reasons given above - especially the proofiness of the index and its nonsensical units.) The sensitivity of doctors to these slight inconveniences signals just how difficult it will be to unseat Quetelet's antiquated, irrational, proofy formula. See, the body mass index is cheap and easy to get (never mind the absence of uncertainty), and it has the incumbent advantage in that the Lords from On High in Health Central have conferred their benediction - along with the political-Pharma lobby enclave - so who's going to argue with them? Well, I am!

Sadly, just like tea leaves, natal horoscopes and palm reading, BMI is here to stay - despite its flaws, the chief of which is that it's irrational and bears no relation to any real physical quantity (as the examination of its units discloses).

But that doesn’t mean I have to treat it any more seriously than other monkey fool bollocks, including horoscopes, palm reading and tarot cards.

As for rationality in terms of obesity, these are the key obesity-BMI myths you need to know, summarized from Prof. Paul Campos in his book, 'The Obesity Myth':

1)  Weight is a good proxy for health ("97 percent false" according to Campos)

2) Health improvement comes via transition from being fat to thin. (Hardly ever, for most people - especially the elderly who are more at risk if they become frail.)

3) We know how to produce long term weight loss.

In respect of the last, Prof. Campos makes it clear that despite the bloviations of the medical-industrial-insurance complex and the government health brigade, as well as the health diet faddists, no one really has a clue how to sustain long-term weight loss. Yes, they say, "balance intake of food with exercise" - but if a lot of weight gain is traced to gut bacteria, this is a non-starter.

Then there is the leptin factor: levels of this hormone decrease when people don't get enough sleep, which causes them to eat inordinately. As explained on one medical site (WebMD):

"When you don't get enough sleep, it drives leptin levels down, which means you don't feel as satisfied after you eat. Lack of sleep also causes ghrelin levels to rise, which means your appetite is stimulated, so you want more food,"

The sad fact is too many overworked and tech-overconnected Americans are in this latter category. Dieting won't help them, but getting on a regular, decent sleep schedule might!

Stay tuned, and in the meantime don't get hysterical over your BMI!



Wednesday, January 15, 2025

A Secondary Atrocity: Senate Goops Ready To Confirm Unqualified Former FOX Host & Extremist

 "A gross dereliction of duty on the part of the Republican-controlled Senate and the Trump-directed FBI. That is a harsh but unavoidable assessment of the confirmation hearing for Pete Hegseth to serve as Donald Trump’s defense secretary. Both institutions should be ashamed of their performance — Republican senators most of all, as, bullied by the president-elect.."  - Ruth Marcus, 'At Hegseth Hearing GOP Senators Covered Themselves In Shame', Washington Post yesterday

"Since Donald Trump’s election victory, we have witnessed striking accommodations to his narrow win and mandate, what has been called “anticipatory obedience.” Are we sleepwalking into an autocracy? We hope not, and would be glad if the threat does not materialize. But as close observers of people and places where democracy has come under pressure and occasionally buckled, we see creeping autocracy as a distinct and under-discussed possibility.- "Are we sleepwalking into an autocracy?", NY Times guest column today


"Trump basically wants them (GOP Senate) to crawl through the Capitol in their underwear and eat dog food out of a can for him, like in some fraternity hazing ritual.  This is a humiliation exercise, meant to either break the system or allow him to rule unfettered." - Chris Hayes on ALL In Nov. 14th after Trump Cabinet Picks are announced.

"These picks are an attempt to make the American Government dysfunctional, to do the opposite of what their agencies are designed to do under the Constitution.  In that regard, their purpose is to sow chaos and break the systemBreak the government.Prof. Timothy Snyder, Yale University historian

Our Government class at Monsignor Edward Pace High (1964) repeatedly featured one theme known as "advise and consent." Broken down, it means the power of the U.S. Senate to be consulted whenever appointments are made by a president to public positions - especially critical (cabinet) positions, such as Attorney General and Secretary of Defense, given the expansive inherent powers of these.

Ignoring or dismissing the importance of advise and consent was gobsmackingly reckless, since it meant not only Senators foregoing their roles but also paving the way for degenerate or unqualified nominees to gain power. This threatened the very exercise of power, and specifically the separation of powers embodied in the Constitution by the Founders - thereby undermining the Constitution itself.

For if we are not a Constitutional Republic, what are we? Well, mainly a pretender or Potemkin government predicated on the whims of an autocrat given license to pull any strings he wishes. This was indeed a primal fear of the Founders soon after the Revolution. Benjamin Franklin even replied to one questioner, when asked what type of government we had:


"A Republic, if you can keep it."

But with the dumbfounding election of Trump - 34-times-convicted felon, whom former Special Prosecutor Jack Smith, in his just-released report on Jan. 6th, found should have been convicted given the evidence - we are teetering on the edge. That election, then, was the first atrocity.

The second has been this criminal traitor's naming of absolutely derelict, unqualified picks to lead his cabinet and federal agencies. Yeah, I've heard the old saw that when you win an election you get to make the picks. But that doesn't include disabling and kneecapping the whole apparatus of government. Oh no! Sorry, your freedom doesn't extend that far - especially when you didn't even secure a majority (>50 percent).

And none of this would have been an issue had voters last November executed their own responsibility to properly inform themselves. They didn't. They stupidly fell for a whole constellation of lies and misinformation - much of it pumped out by Trump's billionaire bud Musk.

So that leaves us now facing the secondary atrocity: not only seeing a formally convicted felon  (and insurrectionist traitor) ascend to the presidency - desecrating the office, e.g.

But now also the ascendance of totally unqualified and woeful nominees to critical agencies, to wreck them and the country. At least the first of these (Matt Gaetz) was stopped, but not because of any judicious or courageous decisions of the GOP Senate. No, it was the finally released House Ethics Report that did their jobs for them, exposing Gaetz as a trafficker of young (underage) girls for sex and a user of drugs. Just imagine if that report had not been released and he had been confirmed as the nation's Attorney General. Think about that!

But now we are faced with the potential confirmation of Pete Hegseth, arguably only slightly worse, after exposure of his own sexual dereliction (sexual assault) as well as open drunkenness - which GOOPers are now trying to sweep under the rug.

                                                A maggot by any other name: Pete Hegseth

Sherrilyn Ifill on Nov. 14's  All In described this forlorn mutt as: "a white supremacist and extremist.  Brags of himself as a former veteran which is insufficient to be qualified for such a key position"

 Chris Hayes seconded that, adding (about the DoD position): 

"Take ideology aside. It is the most complex, powerful and lethal bureaucracy in history or human civilization on the  planet. Which is an accurate description of the entity for which Hegseth has been nominated."

Indeed. So this fool and half-assed asswit will be in position to carry out any derelict, half-cocked plan Capt. Bonespurs wants - whether using the military to try to forcibly deport millions of immigrants, or to try and take back the Panama Canal, e.g.

Trump's expansionist plans spark worry - The Washington Post

Or invade Greenland to take it for Amerikkan purposes. In yesterday's confirmation hearing, Democratic Sen. Tammy Duckworth - using a series of serious questions - disposed of this extremist kook and any reasons to confirm him. However, the majority GOP Senators (53 to the Dems' 47) seem intent on "crawling through the Capitol in their underwear and eating dog food out of a can" to get this asswipe confirmed, in Chris Hayes' parlance.

In my own, more direct parlance: they are ready - from AL Sen. Tommy Tuberville on down - to let Trump piss down their throats and laugh at how he has them all by the balls.

See Also:

At the Hegseth hearing, GOP senators covered themselves in shame

And:

Hegseth’s hearing: What did we learn?


And:
by Amanda Marcotte | January 16, 2025 - 6:17am | permalink

— from Salon

Pete Hegseth and his fellow Republicans can't decide if the abuse allegations against him are real or "fake news."

At the very top of the Fox News host's hearing to be Donald Trump's defense secretary, both Hegseth and his GOP defenders spun out two competing narratives: The stories about him aren't true, but if the evidence makes the stories undeniable, it doesn't matter, because he's a changed man. Armed Services Committee chair Sen. Roger Wicker, R-Miss., kicked off the have-it-both-ways strategy in his introductory remarks to Hegseth's confirmation hearing Tuesday by acknowledging that "Mr. Hegseth has admitted to falling short" while insisting "the accusations leveled at Mr. Hegseth have come from anonymous sources."

Hegseth himself picked up the ball with a conspiracy theory that he was the victim of a "coordinated smear campaign orchestrated in the media," adding, "Our leftwing media in America today sadly doesn't care about the truth." Throughout the hearing, he kept harping on the word "anonymous," implying that the press made up the stories.

» article continues...

And:

by Gregory D. Foster | January 15, 2025 - 6:16am | permalink

— from Foreign Policy In Focus

Reflect for a moment, that you are the U.S. military: perhaps the chairman of the Joint Chiefs of Staff; perhaps the combatant commander of the U.S. Southern Command or the U.S. European Command; perhaps the commander of a high-priority, deployable unit in the field.

You are aware that the incoming president has expressed an intention—perhaps real, perhaps just bluster—to use the military, if necessary, to retake the Panama Canal and take control of Greenland.

You know that, by virtue of the 1977 Panama Canal Treaties, the United States relinquished control over the canal in 2000 and thereby guaranteed its permanent neutrality. You know that Panama, a founding member of the United Nations whose sovereignty, territorial integrity, and political independence are therefore to be respected by all other UN member states, maintains full jurisdiction and operational control over the canal.

» article continues...

Tuesday, January 14, 2025

David Bohm's Stochastic Interpretation Of QM - The Theory That Placed Him In the Sights Of Critics Like Oppenheimer

We continue, in this celebration of the International Year of Quantum Science & Technology, by examining David Bohm's contribution - now known as the Stochastic Interpretation of Quantum Mechanics. This is also Part 2, dealing more specifically with Bohm's work and with what arguably provided the motivation to attack Bohm in the middle years of the 20th century.

To refresh memories, Louis de Broglie, in his Ph.D. thesis (1924), had postulated the existence of what he called "matter waves." In effect, he postulated that particles of mass have wave properties and can actually be deemed waves. Hence, every material particle (electron, proton, etc.) has associated with it a de Broglie wave with a wavelength defined by a simple mathematical equation:

λ_D = h/mv

where h is Planck's constant, m the mass, and v the particle velocity.

    Only later (1927) did we learn that de Broglie waves (also called ‘B-waves’) had actually been proven to exist, from the results of the now famous Davisson-Germer experiment (setup below):

[Figure: schematic setup of the Davisson-Germer experiment]
      The experiment consisted of firing an electron beam from an electron gun directed to a piece of nickel crystal at normal incidence. An accident occurred in which air entered the chamber, producing an oxide film on the nickel surface.

To remove the oxide, Davisson and Germer heated the specimen in a high-temperature oven, not knowing that this affected the formerly polycrystalline structure of the nickel. The effect of the accidental heating was to form large single-crystal areas with crystal planes continuous over the width of the electron beam. To make a long story short, when the experiment re-commenced the electrons were scattered by atoms in crystal planes inside the nickel crystal, leaving patterns from which the de Broglie wavelength (λ) could be calculated according to:

λ = 2d sin(90° − θ/2)
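As a quick numerical cross-check (a sketch using standard constants, not figures from the original paper), one can compute the de Broglie wavelength λ = h/p = h/√(2mE) for an electron accelerated through 54 V, the beam energy used in the Davisson-Germer runs:

import math

h  = 6.62607015e-34     # Planck's constant (J s)
me = 9.1093837015e-31   # electron rest mass (kg)
eV = 1.602176634e-19    # joules per electron volt

E   = 54.0 * eV                      # Davisson-Germer beam energy
lam = h / math.sqrt(2.0 * me * E)    # de Broglie wavelength, h/p
print(f"lambda = {lam * 1e9:.3f} nm")   # ~0.167 nm, on the scale of the
                                        # atomic spacings in the nickel crystal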

The point is that de Broglie waves represent the wave mirror image of matter. David Bohm then incorporated this result into his own theory of quantum mechanics. Bohm and his Birkbeck College colleague Basil J. Hiley[1] not only concurred with the physical reality of de Broglie waves (or B-waves) but also put forward that they were guided by a clock synchronism mechanism. This was set at a certain rest frequency, f₀, and also for frequencies in non-rest frames. This mechanism would provide a "phase locking," as if guided via "synchronous motors." Hence the genesis of the term "pilot waves."

In the relativistic limit for photons:

f₀ = m₀c²/h

Changing to angular frequency (ω₀), to make the mechanism consistent with that proposed by Bohm and Hiley, and replacing the Planck constant (h) by ħ = h/2π, the Planck constant of action:

ω₀ = 2π f₀ = m₀c²/ħ

which is the "clock frequency" in the photon rest frame.
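To get a feel for the scale of this clock frequency, here's a quick sketch (my illustration) evaluating ω₀ = m₀c²/ħ for a massive particle - the electron, with rest energy 511 keV - since the formula is easiest to read in the massive case:

import math

hbar = 1.054571817e-34    # reduced Planck constant (J s)
c    = 2.99792458e8       # speed of light (m/s)
m0   = 9.1093837015e-31   # electron rest mass (kg)

omega0 = m0 * c**2 / hbar            # "clock" angular frequency
f0     = omega0 / (2.0 * math.pi)
print(f"omega_0 ~ {omega0:.2e} rad/s")   # ~7.8e20 rad/s
print(f"f_0     ~ {f0:.2e} Hz")          # ~1.2e20 Hz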

There is also an additional condition, known as the Bohr-Sommerfeld condition, for the clock to remain in phase with the pilot wave:

∮ p dx = nh

Now, the momentum p = m₀c, so that the integral becomes:

2πx (m₀c) = nh

And the emergence of the de Broglie wavelength (λ_D = h/p) is evident in the equation. In this sense, we have: 2πx = n(h/m₀c) = n λ_D

Or the same expression (2πr = n λ_D) as for the standing waves in an atom. In Bohm's own development, the procession of B-waves is actually enfolded within a "packet" of P-waves:

[Figure: B-waves enfolded within a P-wave packet]
The axis labeled E is actually the real part of the electric field component, E_z. The width of the P-wave packet is denoted by the spread:

Δk = π/(x − x₀)

where x₀ denotes the center point of the wave packet. In other words, if the center point is at x₀ = 0, the packet width is just Δk = π/x.

The wavelength, λ = 2π/Δk, is then much less than the width of the packet. E.g. if Δk = π/x then λ = 2π/(π/x) = 2x, so if x = 1 nm, then λ = 2 nm and Δk = π/x = π/(1 nm) = π nm⁻¹, where π nm > 2 nm.

The maximum of the wave packet is approximated closely by the square of the amplitude: [E_z]² = 4 sin²[Δk(x − x₀)] / (x − x₀)²

We can check the limits of the preceding. Let x₀ = 0; then:

[E_z]² = 4 sin²(Δk x) / x²

Conversely, let x₀ = x; then: [E_z]² = 4 sin²[Δk(x − x)] / (x − x)² = 0

   Thus, the p-wave packet ceases to exist as a discrete or localized entity and thereby loses its particle properties. This is undoubtedly what drove most of Bohm's opponents bonkers: "Wait!!  He just disposed of particles!"
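As a concrete check on the envelope formula above, here's a short sketch (with illustrative values Δk = π and x₀ = 0, in arbitrary units - my choices, not Bohm's) sampling [E_z]² and showing how sharply the packet is localized, falling from its central maximum to zero by x = 1:

import math

def ez_squared(x, x0=0.0, dk=math.pi):
    """[E_z]^2 = 4 sin^2[dk (x - x0)] / (x - x0)^2, the packet envelope."""
    u = x - x0
    if abs(u) < 1e-12:
        return 4.0 * dk**2        # small-argument limit: sin(z) ~ z
    return 4.0 * math.sin(dk * u)**2 / u**2

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:4.2f}   [Ez]^2 = {ez_squared(x):6.2f}")
# prints 39.48, 32.00, 16.00, 3.56, 0.00 - a sharply localized packet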

Having shown the dominant wave aspect of matter, the next battle became that of "locality" vs. "nonlocality." This conflict in turn set up the battle between accepting quantum mechanics as a complete theory and rejecting it as incomplete. Enter now (in 1935) Albert Einstein, along with two colleagues, Boris Podolsky and Nathan Rosen. The trio devised a thought experiment to try to show quantum mechanics was incomplete. This has since been called "the EPR experiment," based on the first initials of their names: Einstein, Podolsky and Rosen (E-P-R).

 They imagined a quantum system (helium atom at A) which could be ruptured such that two electrons were dispatched to two differing measurement devices, X1 and X2. 

X1  (+ ½ ) <-----(A)------>(- ½ ) X2


Each electron would carry a property called 'spin'. Since the helium atom itself had zero spin (the two electrons canceling each other out), this meant one would have spin (+½), the other (−½). Thus, we manage to skirt the Heisenberg Principle and obtain both spins simultaneously, without one measurement disturbing the other. We gain completeness, but at a staggering cost: this simultaneous knowledge of the spins implies that information would have had to propagate from one spin-measuring device (on the left side) to the other (on the right side) instantaneously! This was interpreted to mean faster-than-light communication, which violates special relativity.

In effect, a paradox ensues: quantum theory attains completeness only at the expense of another fundamental physical theory - relativity. By this point, Einstein believed he finally had Bohr by the throat. Figuring Bohr might have some trick or sly explanation up his sleeve, Einstein had gone one better at the 6th Solvay Conference held in 1930, actually designing a thought experiment device that he was convinced would have Bohr in tears trying to find a solution.

[Figure: Einstein's photon-box thought experiment device]


Whenever the box's door flaps open, even for a split second, one photon escapes, and the weight difference (between the original box and the box afterward) can be computed using Einstein's mass-energy equation, e.g. m = E/c². Thus, the difference is taken as follows:

Weight (before door opens) - weight (after)

(E.g. with 1 photon of mass m = E/c² gone)
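For scale (my illustrative number, not one from the original debate): even a fairly energetic 1 MeV photon carries off, via m = E/c², only about (1.602 × 10⁻¹³ J)/(9 × 10¹⁶ m²/s²) ≈ 1.8 × 10⁻³⁰ kg - which shows how extraordinarily sensitive the weighing in Einstein's scheme would have to be.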

Since the time for the brief opening is known (Δt) and the photon's mass can be deduced from the above weight difference, Einstein argued that one can in principle find both the photon's energy and time of passage to any level of accuracy, without any need for the energy-time uncertainty principle.

In other words, the result could be found on a totally deterministic basis! Bohr, for his part, nearly went crazy when he studied the device, and for hours worried there was no solution and that maybe the wily determinist was correct after all. When Bohr did finally come upon the solution, he realized he'd hoisted the master on his own petard.

The thing Einstein overlooked was that his very act of weighing the box translated to observing its displacement (say, δr = r₂ − r₁) within the Earth's gravitational field. But according to Einstein's general theory of relativity, clocks actually do run slower in gravitational fields (a phenomenon called 'gravitational time dilation'). In this case, for the Earth, one would have a fractional difference in proper time, as a fraction of time passage t. But this meant the Uncertainty Principle had to be used with Δt factored in.

(I.e. ΔE Δt ≥ h/2π)

Years later, physicist John S. Bell asked the question: 'What if the E-P-R experiment could actually be carried out? What sort of mathematical results would be achieved?' In a work referred to as "the most profound discovery in the history of science," Bell proceeded to place the E-P-R experiment in a rigorous and quantifiable context, which could be checked by actual measurements.

    Bell formulated a thought experiment based on a design similar to that shown in the earlier EPR sketch. Again we have two electrons of differing spin flying off to two separate detectors  D1 and  D2:

D1 (+ ½ )<--*---[ o  ]----*-->(- ½ ) D2 


Bell made the basic assumption of locality (i.e. that no communication could occur between detector D1 and detector D2 faster than light speed). In what is now widely recognized as a seminal work of mathematical physics, he set out to test whether a theory which upheld locality could reproduce the predictions of quantum mechanics. His result showed that for any such local theory, a particular sum of correlations, S (defined below), had to be less than or equal to 2 (S ≤ 2). This result, so pedestrian on the surface, became known as the 'Bell Inequality'. Little known then, it would propel three quantum physicists (Alain Aspect, John F. Clauser and Anton Zeilinger) to the Nobel Prize five decades later.

By 1982 Alain Aspect and his colleagues at the University of Paris were determined to actually test Bell’s Inequality and the original E-P-R quantum system (used in the EPR thought experiment).  To that end the team set up an arrangement as sketched below:

[Figure: sketch of the Aspect experimental arrangement]
Rather than electron spins, photon polarizations (P1 and P2) had to be detected and determined. These were observed with the photons emanating from a krypton-dye laser and arriving at two different analyzers, A1 and A2. The results of these remarkable experiments disclosed apparent and instantaneous connections between the photons at A1 and A2. Say twenty successive detections are made; then one obtains at the respective analyzers (where a '1' denotes polarization detection with E vector up and '0' denotes polarization detection with E vector down):

A1:   1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0

A2:   0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1
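As a quick check, here's a short sketch computing the Pearson correlation coefficient of those two detection strings, confirming the perfect anti-correlation:

# Sketch: Pearson correlation of the two detection sequences shown above.
a1 = [1,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0]
a2 = [0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1]

n = len(a1)
m1, m2 = sum(a1) / n, sum(a2) / n
cov = sum((x - m1) * (y - m2) for x, y in zip(a1, a2)) / n
s1  = (sum((x - m1)**2 for x in a1) / n) ** 0.5
s2  = (sum((y - m2)**2 for y in a2) / n) ** 0.5
print(cov / (s1 * s2))    # -1.0: 100% anti-correlation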

On inspection, there was found to be a 100% anti-correlation (i.e. 100% negative correlation) between the two, and an apparent nonlocal connection. In practice, the experiment was set out so that four (not two, as shown) different orientation 'sets' were obtained for the analyzers. Each result is expressed as a mathematical (statistical) quantity known as a 'correlation coefficient'. The results from each of the four orientations (I, II, III, IV) were then added to yield a sum S:

S = (A1,A2)I + (A1,A2)II + (A1,A2)III + (A1,A2)IV

In his experiments, Aspect determined the sum, with its attendant indeterminacy, to be: S = 2.70 ± 0.05. Since this exceeds 2, it violates Bell's Inequality - exactly as quantum mechanics predicts - and in the process reduced the EPR Paradox to a simple misunderstanding of quantum mechanics in the authors' minds.
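For readers who want to see where numbers like 2.70 come from, here is a hedged sketch of the quantum prediction for polarization-entangled photons, E(a,b) = cos[2(a − b)], combined in the standard CHSH form (which, note, enters one of the four terms with a minus sign - a refinement of the simple sum written above). The analyzer angles are the textbook optimal choices, not necessarily Aspect's exact settings:

import math

def E(a, b):
    """Quantum correlation for polarization-entangled photon pairs."""
    return math.cos(2.0 * (a - b))

deg = math.pi / 180.0
a, ap, b, bp = 0.0, 45.0 * deg, 22.5 * deg, 67.5 * deg

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(S)    # ~2.828 = 2*sqrt(2) > 2: the Bell/CHSH bound is violated

The ideal quantum value is 2√2 ≈ 2.83; Aspect's measured 2.70 ± 0.05 falls a bit short of that (real detectors, finite efficiencies) but decisively above the local-realist bound of 2.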

Regarding the various violations of the Bell Inequality, David Bohm considered an alternative quantum world, based on different orders of manifestation, which he called explicate and implicate. To Bohm, the readily observable order of the macrocosm had unfolded, or explicated. That is, its host of apparently diverse objects and processes constituted a divergence from unified order. This is the order at which Einsteinian locality and determinism would have some relevance. (After all, Newtonian mechanics can also be used to make predictions about the motions of bodies - such as pool balls and artificial satellites.)

However, this plurality of objects (subatomic particles, planets, stars, galaxies) is ultimately enfolded in a higher-dimensional implicate order. This order is hidden or unseen (hence 'implicate') and not perceived by lower-dimensional beings. To render this more concrete, Bohm devised a number of excellent analogies. For example, you walk into a room and see two television monitors, A and B. Each shows the image of a fish, one in lateral view, the other face-on. A sketch of the presentation is shown in the diagram below:

David Bohm's 'fish' experiment to portray multi-dimensional reality


The casual observer may simply deduce that he's seeing two different fish, one on each screen, each in a separate fish tank. The observer then walks into an adjoining room, where he confirms that each monitor is receiving input from one of two distinct camcorders trained on a single aquarium, with only one fish inside it. One camcorder is aimed at the front of the aquarium, the other at the side. The fish is resting with its face to the front.

At that point, the observer has recognized that at the higher (three-dimensional) level of reality there is one fish, but viewed in two different (two-dimensional) perspectives. By analogy, Bohm suggested a similar error of perception applies to how we perceive the particulate (unfolded) world around us. As Bohm describes his result[2]:

 In the implicate order we have to say the mind enfolds matter in general and therefore the body in particular. Similarly, the body enfolds not only the mind but also in some sense, the entire material universe.

 Bohm offered this in the hope of showing how we can be deceived into thinking the explicate, particle-dominated order of separation is the valid one. But it is actually only a virtual display, an artificial reference field for 3-dimensional brains.

Such an error is costly - in terms of confining our attention to a limited realm of fragmentary illusion, instead of seeing beyond it. For example, like the casual observer of the two TV-monitor fish images, a casual observer of the Aspect experiment might conclude that the two photons (registered at separate detectors) are themselves separate. Bohm's implicate order prevails upon the observer to think of the photons instead as having always been connected as one whole - but perhaps in a higher dimension.

In the historic sense, David Bohm's work provided a useful and verifiable perspective on what like-minded colleagues have said is also the basis of a higher-dimensional holographic reality - an elevation above purely reductionist conceptions. Here, physicist Bernard d'Espagnat's words (In Search of Reality) are certainly worthy of consideration[3]:

The experimental corroboration of nonseparability (i.e. nonlocality) quite obviously constitutes a strong argument against the hypothesis of objective realism as applied to microscopic objects, and even….against that of objectivist realism applied to macroscopic objects only.

Readers who wish to read his landmark book Wholeness and the Implicate Order can access it at the link below:

DavidBohm-WholenessAndTheImplicateOrder.pdf (gci.org.uk)

To see more contributions from the Physics Today Archives go to the link at:

https://physicstoday.org/quantum.

----------------------------------------------------------

Addendum: Bohm's Uncertainty Principle

Bohm is primarily concerned with the canonically conjugate field momentum, for which the associated coordinates, i.e. Δt, Δφ_k, fluctuate at random. Thus we have, according to Bohm:

p_k = a (Δφ_k / Δt)

where a is a constant of proportionality and Δφ_k is the fluctuation of the field coordinate. If the field fluctuates in a random way, the region over which it fluctuates is:

(δ Δφ_k)² = b (Δt)

Taking the square root of both sides yields:

δ Δφ_k = b^½ (Δt)^½

Bohm notes that p_k also fluctuates at random over the given range, so:

δp_k = a b^½ / (Δt)^½

Combining the preceding results, the (Δt)^½ factors cancel in the product, and one finally gets a relation reflective of the Heisenberg principle, but time-independent:

δp_k (δ Δφ_k) = [a b^½ / (Δt)^½] × [b^½ (Δt)^½] = ab

This is analogous to Heisenberg's principle, cf.

δp δx ≥ ħ

where the product ab plays the same role as ħ.


[1] Bohm, D. and Hiley, B. J.: 1980, Foundations of Physics, 10, 1001-1016.

[2] Bohm, op. cit., 209

[3] d’Espagnat. p. 143.