As I noted in an earlier (Aug. 29) post on the role of consciousness in physics, at least some connections vis-a-vis quantum entanglement needed to be made before a serious empirical discussion could even commence. This has now evidently occurred: physicists have finally measured connections between pairs of photons within a macroscopic beam of light. These connections have been said to qualify as quantum entanglement and represent a step toward understanding how the rules of QM scale up to phenomena involving large numbers of particles, such as superconductivity.
Physicist Henry Stapp has noted that the neural dynamics of the human brain in the vicinity of synapses involve large numbers of particles, namely Ca++ ions, which - on account of synaptic cleft dimensions (200-300 nm) - can be treated in wave form. The brain then, like a full quantum computer, can be regarded as subject to the effects of quantum entanglement, and exploring such manifestations could lead to insights into consciousness.
But first things first. The experiment - to be published in Physical Review Letters - describes a specially prepared light beam that enables the observation of individual photons in addition to charting the quantum links between them. Basically then, the team from the Institute of Photonic Sciences in Barcelona, under Morgan Mitchell, confirmed the theoretical prediction that all the photons involved would exhibit some degree of entanglement and that the most strongly "entangled" would be pairs of photons striking the detectors at the same time.
According to Mitchell, quoted in Science News, "entanglement should be present in pretty much any situation with a lot of particles interacting with each other." Most physicists' arguments, however, take the view that quantum-level entanglement is a bad thing for quantum computing. After all, if the quantum particles that form the basis of one's Q-computer become "entangled" with quantum particles outside it, then information leakage and lost security become possible.
In the case of Mitchell's team, while the study of entangled particles in superconductivity would have been ideal, the problems involved would have been formidable. (SCs are so densely packed with electrons that measuring even a small subset would have been extremely difficult. Imagine then the tandem problem of measuring entanglement to do with brain neurons.)
So, as a result, the Barcelona team confined attention to the much simpler macroscopic system of a "squeezed" beam of light. Not physically squeezed, obviously, but rather transmitted through a crystal enabling the measurement of a particular property, in this case polarization.
Polarization refers to that property whereby EM radiation, as it propagates, can be confined say to one plane, or one rotation plane. If we say "circularly polarized" then the E-vector rotates through a full 360 degrees. If we say "linearly polarized" then it vibrates in one plane, e.g. vertically:

^
|
|
|

Or horizontally:

<--------------------------------->
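As a numerical illustration (my own sketch, nothing from the experiment itself), the two cases can be written out as E-vector components:

```python
import numpy as np

t = np.linspace(0, 2 * np.pi, 100)  # one full optical cycle (phase)

# Linearly polarized: E oscillates along a single axis (here, x)
E_linear = np.column_stack([np.cos(t), np.zeros_like(t)])

# Circularly polarized: equal x and y components, 90 degrees out of phase,
# so the tip of the E-vector traces a full circle each cycle
E_circular = np.column_stack([np.cos(t), np.sin(t)])

# The circular case keeps constant magnitude while its direction rotates
magnitudes = np.linalg.norm(E_circular, axis=1)
```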
Thus, Mitchell's Barcelona team filtered the beam of light and probed it with polarization detectors. A click at a detector location indicated the arrival of a photon with a particular polarization. Any pairs traveling in tandem (and arriving simultaneously) had corresponding polarizations, i.e. horizontally polarized as indicated in the lower sketch above.
Clearly, much more work has to be done, especially by way of confirmation, as well as extending the conclusions to other systems for greater generality. A case in point would be to convince skeptics like Prof. Timothy Ralph of the University of Queensland, who doesn't accept that the Mitchell team's results are applicable to any phenomena other than "squeezed" light.
We shall see!
Wednesday, August 31, 2016
Did The Calvinist Reformation Pave The Way For More Militant Atheism?
Johann Calvin- A "tyrant" to many but a "holy man" to his followers. He aspired to make Geneva the "holy city"
Travel to Geneva, as we did in August, 1997, and you can't help but be thoroughly impressed by this cultural, world class capital city in the southwest of Switzerland. Just sitting on the lakeside and watching the Jet d'Eau, you have to marvel. Also, if you knew your Swiss history, you'd recall how this location was a most uninviting place for freethinkers back in the 1530s and 1540s. After all, it was the zealous firebrand Johann Calvin (founder of Calvinism) who was called in 1536 to make Geneva the "Protestant Rome".
Did he succeed? Up to a point. But through his zealous program, and the reaction to it, he may have unwittingly unleashed more militant, aggressive forms of atheism. Consider that even for lax Protestants - far less any Catholics - Geneva from 1541 to 1564 was not a very hospitable place. Committed to his program of rendering Geneva the "holy city", Calvin and his highly motivated minions went all out to do it. Their program included: routine and harsh censorship (i.e. no lewd, nude works of art), regular home visits by the clergy to ensure the faithful were toeing the line, and a network of informers who would have done the Nazis proud.
The regulation included every aspect of life, from the style of women's shoes to the names allowed for children as well as pets. Those who flouted the proscriptions or regulations, such as naming a dog after Calvin, faced more severe punishments including banishment and even execution. Yale prof Carlos Eire, author of the book 'Reformations: The Early Modern World 1450-1650', notes that over the time period indicated no fewer than 58 people were put to death and 76 were banished. This was out of a town with a (then) population of roughly 10,000.
Eire in his monograph is clear that the most potent force in the Reformation period was Calvin. Too many readers not au fait with the man (beyond the American work ethic stereotype, i.e. forged by strict Calvinism) are unaware that while Martin Luther spread his influence mainly through Germany, Calvin's spanned all of Europe.
Interestingly, coincident with Calvin's reign, the Council of Trent (reactionary in itself) assembled over several periods from 1545 to 1563. It is no exaggeration to assert this Council set the program and course for Roman Catholicism for the next 400 years. One of its offshoots, in fact, was the Society of Jesus - otherwise known as the Jesuits. While the Jesuits have often been portrayed as the "shock troops" of Catholicism, it is truer to say that the Jesuits proudly reinforced the use of reason and pragmatism that - in Eire's words - "had always been an integral part of the Catholic tradition."
To substantiate this, attend theology (or philosophy) classes at any Jesuit-run university - as I did from 1964-67 at Loyola University, New Orleans. Discussions were wide ranging and critical thinking a required commodity. These attributes served me well even after I'd transferred to the University of South Florida to pursue my degree in astronomy (which Loyola did not offer).
In fact, as I observed in my June 29, 2010 post, Loyola's Jesuits were in many ways directly responsible for my atheist path. As I noted:
"my questioning and suspicious mind and approach was only confirmed and augmented with each step I took, including a series of theology courses at Loyola. I'd also gone to hear one of the foremost Existentialist philosophers in the world, Jean-Paul Sartre, who came to the Loyola University Fieldhouse in 1964. And, of course, we learned at that lecture one of the cornerstone avoidances of the Existentialist, is "bad faith".
This was the cardinal sin, if you will. The most serious transgression an authentic being or person can make. By "bad faith", Sartre meant going against your own interior barometer, to "go along to get along". It made life relatively easy (few conflicts) but ultimately led to despair since an artificial life was substituted for an authentic one."
Given this, it doesn't appear to be a stretch that Calvinism's excesses spawned a vigorous reaction of questioning and criticism from the Jesuits, themselves also products of the Reformation era. And just as my continued critical thinking led me to atheism, that Reformationist reaction might have led to a more ambitious, aggressive form of atheism - as later vindicated by the onslaught of modern science. Who needs popes, Canon law or '95 Theses' when Darwinian evolution and the Big Bang render supernatural agents redundant?
Eire himself doesn't shy from the negative side of the Council of Trent including the Inquisition, the suppression of deviance, and the supremacy of value attached to celibacy and virginity. But looking at these in retrospect, they would have only incited an even more critical, hostile and militant atheism- of the form that ultimately took hold in the early 20th century, i.e. with philosophers such as Bertrand Russell and others.
Ultimately, I have to agree with Eire that the Calvinist legacy of the Reformation period set Christianity's fragmentation in motion. With it, the epistemological space thereby created paved the way for secularism, skepticism, and a hardy, no-fools-tolerated atheism. It is hardly astonishing that since the Enlightenment one finds most physical scientists are atheists. This is not merely a reflection of their ideological choices but also of the recognition that there is no place for supernaturalism in modern physics, chemistry or biology.
"Thousand -Year Summer Storms" - Expect More And More
The headline on p. 6 in the latest TIME blared: "A Thousand Year Storm Hits Louisiana", barely weeks after we read of another such event bashing Maryland, and earlier West Virginia. Are these just "coincidental" meteorological events, or might they be somehow tied to the advent of even more ferocious climate change and global warming?
In the case of all the previous events, and especially in LA, we are informed "the air had been full of moisture - meteorologists call it 'precipitable water' - so all it took was a modest front to make the skies burst."
The piece then went on to point out "Epic floods have hit the South with alarming frequency in recent months."
Indeed, and make no mistake at root global warming IS the culprit.
We now know from research published by Dim Coumou and colleagues at the Potsdam Institute for Climate Impact Research that rapid Arctic warming has bolstered summer heat waves. These heat waves have engendered warmer, moister atmosphere of the type most likely to trigger massive floods, including the so-called "thousand year events".
At the same time, Coumou et al have found a decrease in continent-cooling summer storms as a result of a weakened jet stream, owing to an Arctic warming twice as fast as lower latitudes in the Northern hemisphere. This rapid Arctic warming has in effect reduced the temperature differential that drives the polar winds, thereby weakening the jet stream.
Previous studies investigating the impact of the dwindling jet stream on lower latitudes focused only on fall and winter effects. During these intervals the Arctic Ocean warms the overlying air, obscuring the jet stream effects. Coumou et al instead focused on summers, using meteorological data from 1979-2013.
The work, published in Science (3/12/15) showed that the summer jet stream slowed by 5 percent during that interval. That slowing in turn caused a 10 percent decrease in the energy available to power cooler summer storms.
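The link between a 5 percent slowdown and a roughly 10 percent energy loss follows from kinetic energy scaling as the square of wind speed; a one-line check:

```python
# Eddy kinetic energy scales as v^2, so a 5% slowdown in jet stream
# wind speed translates to roughly a 10% drop in available storm energy.
slowdown = 0.05
energy_fraction_remaining = (1.0 - slowdown) ** 2   # about 0.9025
energy_decrease = 1.0 - energy_fraction_remaining   # about 0.0975, i.e. ~10%
```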
The result? Without the relief afforded by the cooler summer storms, the N. Hemisphere faces longer and longer bouts of intense summer heat. This in turn will make the atmosphere more "precipitable", i.e. laden with water, which upon being triggered will lead to more "thousand year flood events" such as we have seen.
The takeaway is that rapid climate change is already ravaging large swaths of the country, as well as the planet. It is not some distant future as the climate deniers make out, but upon us right now. If the deniers - mainly driven by capitalist Neoliberal economics - have their way, nothing will be done. They accuse others of raising misplaced alarm, but what else to call an event that (according to TIME) "has left more than 30,000 people evacuated from their homes" with "30 of Louisiana's 64 parishes declared disaster areas."
If the climate change deniers don't recognize such events as alarming and worthy of action, then one can only conclude they are either blind, brainwashed by the fossil fuelers, or simply don't care what happens to their fellow humans - particularly those in harm's way because of the adversely changing conditions.
Or, maybe the deniers are fortunate to live in such rarefied digs or gated communities they feel they'll be spared whatever happens to the rest of us who aren't so lucky. Hence, they can afford to put capital markets and economics above all other priorities.
A serious error, if ever there was one.
Tuesday, August 30, 2016
String Theorist Ed Witten Offers A Take On Consciousness - It Will Never Be Solved
Ed Witten - examining an aspect of String Theory. He isn't convinced that consciousness can ever be solved by physics.
It was interesting pulling up a blog post by chemist Ashutosh Jogalekar where he quoted physicist Edward Witten as writing:
"I think consciousness will remain a mystery. Yes, that's what I tend to believe. I tend to think that the workings of the conscious brain will be elucidated to a large extent. Biologists and perhaps physicists will understand much better how the brain works. But why something that we call consciousness goes with those workings, I think that will remain mysterious. I have a much easier time imagining how we understand the Big Bang than I have imagining how we can understand consciousness...
Understanding the function of the brain is a very exciting problem in which probably there will be a lot of progress during the next few decades. That's not out of reach. But I think there probably will remain a level of mystery regarding why the brain is functioning in the ways that we can see it, why it creates consciousness or whatever you want to call it. How it functions in the way a conscious human being functions will become clear. But what it is we are experiencing when we are experiencing consciousness, I see as remaining a mystery...
Perhaps it won't remain a mystery if there is a modification in the laws of physics as they apply to the brain. I think that's very unlikely. I am skeptical that it's going to be a part of physics."
This may be so, but I don't believe the problem is as insuperable as he and many other physicists believe. But it will require patience and getting our theories tested properly, in terms of what could be considered an environment conducive to human consciousness. Jogalekar himself adds:
"what Witten is saying here is in some sense quite simple: even if we understand the how of consciousness, we still won't understand the why. This kind of ignorance of whys is not limited to consciousness, however."
Which is true! I have no idea WHY solar flares and coronal mass ejections are often linked together and erupt where they do but I have a detailed idea of how they do. In science this is the most one can really aspire to: answering the 'how' as opposed to the 'why'.
In the case of consciousness it would seem that a logical starting point is the physical scale of the synaptic cleft. The scale is on the order of 200-300 nm and hence subject to the Heisenberg Uncertainty Principle. Once so subject, it embodies the laws of quantum mechanics. Wave forms, as opposed to simple classical trajectories of particles (e.g. electrons), are enabled, because uncertainty principle limitations applied to calcium ion (Ca++) capture near synapses show the calcium ions must be represented by a probability wave function. (Cf. Henry Stapp, Mind, Matter And Quantum Mechanics, Springer-Verlag, 1983.)
Consider the 4D wave function U(X,Y,Z, t) . Then brain dynamics and function at a time t is contingent upon the neuron and its connections to synapses at the same time t. We therefore want networks that invoke the above function and Pauli spin operators as effective gates.
Application of the Heisenberg Uncertainty Principle to Ca+2 ions (at body temperature) discloses the associated wave packet dimension increases to many times the size of the ion itself. Thus we can represent the ion uptake superposition as a separate contributor to the aggregate (sub-complex) or neuronal assembly:
U(A1 ... An) + U(Ca+2)n
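To get a feel for the scales involved, here is a back-of-envelope sketch (my own illustrative numbers, not Stapp's) of how far a free Ca++ wave packet spreads while crossing the cleft:

```python
# Rough scales for the wave-form argument: how much does a Ca++ wave
# packet spread while crossing a ~200-300 nm synaptic cleft at body
# temperature? All inputs are back-of-envelope assumptions.
import math

hbar = 1.055e-34          # J*s
k_B = 1.381e-23           # J/K
m = 40 * 1.661e-27        # kg, mass of a Ca++ ion (A ~ 40)
T = 310.0                 # K, body temperature
cleft = 250e-9            # m, mid-range synaptic cleft width
dx0 = 1e-10               # m, assumed initial packet width (~ionic size)

v_thermal = math.sqrt(k_B * T / m)     # thermal speed, ~250 m/s
t_transit = cleft / v_thermal          # time to cross the cleft
dx_t = hbar * t_transit / (m * dx0)    # free wave-packet spreading estimate

# dx_t comes out on the order of 10 nm: many times the ~0.1 nm ion size,
# which is the scale argument behind treating Ca++ uptake in wave form.
```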
It was physicist David Bohm who first pointed out ('Quantum Theory', p. 169) a very precise analogy of quantum processes to thought. In particular, the quantum "wave packet collapse" (e.g. to a single eigenstate, from a superposition of eigenstates) is exactly analogous to the phenomenon of trying to pinpoint what one is thinking about at the very instant one is doing such thinking. More often than not, when one does this, as Bohm notes, "uncontrollable and unpredictable changes" are introduced into the thought process or train of thought.
Often, people are heard to say: "Sorry, I've lost my train of thought".
What they really mean is the thought coherence they had enjoyed has since been obliterated, so that they have to commence the thought process anew. The coherent state has "collapsed" into a single state which they now no longer recognize. In this way, as Bohm pointed out, the "instantaneous state of a thought" can be compared to the instantaneous position of a particle (say associated with a de Broglie wave or "B-wave" in a brain neuron). Similarly, the general direction of change of a thought is analogous to the general direction of change in time for the particle's momentum (or by extension, its phase function).
Now, let's get into more details.
From the foregoing remarks (on thought), Bohm - in another work ('Wholeness and the Implicate Order', 1980) could also regard meditation as a possible "channel" by which the individual mind can access the Dirac Ether. I have dealt with similar conjectures before, in terms of the 'quantum potential".
In general, the quantum potential defined by Bohm (ibid.) is:
VQ = -(ħ²/2m) [∇R]²/R

where ħ is the Planck constant of action h divided by 2π, m is the mass, and R a phase amplitude. The quantum potential computed for a pair of Gaussian slits is given in Bohm, D. and Hiley, B.J.: Foundations of Physics, Vol. 12, No. 10, p. 1001.
Now assume the total set of one's thoughts contains waves of frequencies ranging from f' (highest) to f (lowest); then the empirical quantum potential V'Q can be expressed:

V'Q = h(f' - f)

where h is Planck's constant.
Thus, V'Q has units of energy, as do the other potential functions in physics, e.g. gravitational and electrostatic. On average, the greater the number of possible states, the greater the difference (f' - f) and the greater the empirical quantum potential.
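A trivial computational sketch of this relation (the frequencies below are placeholders of my own choosing, not measured brain values):

```python
# Empirical "quantum potential" V'Q = h (f' - f) over a spread of
# thought frequencies; frequencies here are purely illustrative.
h = 6.626e-34  # J*s, Planck's constant

def empirical_quantum_potential(f_high, f_low):
    """Return V'Q = h (f' - f), in joules."""
    return h * (f_high - f_low)

# A wider spread of possible states (larger f' - f) gives a larger V'Q:
narrow = empirical_quantum_potential(40.0, 30.0)
wide = empirical_quantum_potential(100.0, 1.0)
```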
In a real human brain, of course, we have a "many-particle" field (especially since we're looking at neuronal complexes), so the quantum potential must be taken over a sum such that:

VQ = -(ħ²/2m) Σ_i [∇R_i]²/R
The velocity of an individual B-wave is expressed by:

v(B) = ∇S/m

where m is the mass of the particle associated with the B-wave, and S is a phase function obtained by using:

U = R exp(iS/ħ)
A neuron in sub-complex 'A' either fires or not. The 'firing' and 'not firing' can be designated as two different quantum states identified by the numbers 1 and 2. When we combine them together in a vector sum diagram, we obtain the superposition.
Ψ(n | A) = Ψ(n1 | A1) + Ψ(n2 | A2)
where the wave function (left side) applies to the collective of neurons in 'A', and takes into account all the calcium wavepackets that factored into the process. What if one has 1,000 neurons, each to be described by the states 1 and 2? In principle, one can obtain the vector sum as shown in the above equation for all of the neuronal sub-complex A, and combine it with all the other vector sums for the sub-complexes B, C, D and E in an optimized path. The resulting aggregate vector sum represents the superposition of all subsidiary wave states and possibilities in a single probability density function. Configure the action of Pauli spin gates as well, and radical emergence is allowed, of the type that could even account for the effects reported by Robert Jahn (from his students) on computer random number generators.
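A toy numerical sketch of this vector-sum bookkeeping (the amplitudes below are random placeholders, not neural data; this illustrates the normalization arithmetic only, not actual brain dynamics):

```python
import numpy as np

# Each neuron contributes a ("fires", "does not fire") amplitude pair
# (states 1 and 2). A sub-complex state is the normalized vector sum of
# its neurons' states; sub-complexes A..E then combine the same way.
rng = np.random.default_rng(0)

def subcomplex_state(n_neurons):
    """Sum n two-state neuron vectors and normalize the result."""
    neurons = rng.normal(size=(n_neurons, 2))  # random (state1, state2) amplitudes
    total = neurons.sum(axis=0)                # vector sum over the sub-complex
    return total / np.linalg.norm(total)       # normalize to a unit state

# Combine sub-complexes A through E into one aggregate superposition
aggregate = sum(subcomplex_state(1000) for _ in "ABCDE")
aggregate = aggregate / np.linalg.norm(aggregate)

probabilities = aggregate ** 2  # probability density over the two states
```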
The Pauli spin matrix-operator σx = (0, 1 ¦ 1, 0), where the left pair is the matrix 'top' row and the right pair the matrix 'bottom' row, since these operators are usually written in rectangular array form.

Similarly, the other Pauli gates are defined by σy = (0, -i ¦ i, 0) and σz = (1, 0 ¦ 0, -1), where i denotes the square root of -1. Incorporation of such Pauli (quantum) gates meets a primary application requirement for feed-forward networks in describing synapse function. (See e.g. Yaneer Bar-Yam, 'Dynamics of Complex Systems', Addison-Wesley, pp. 298-99.)
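The gate properties are easy to verify numerically; a minimal check (my own sketch) writing the pairs out as 2x2 arrays:

```python
import numpy as np

# The three Pauli matrices, rows corresponding to the "top" and "bottom"
# pairs in the inline notation above
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

I2 = np.eye(2, dtype=complex)

# Each squares to the identity: a quantum gate that is its own inverse
checks = [np.allclose(s @ s, I2) for s in (sigma_x, sigma_y, sigma_z)]

# Acting as a gate: sigma_x flips the 'not firing' state into 'firing'
not_firing = np.array([0, 1], dtype=complex)
flipped = sigma_x @ not_firing
```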
The advantage is that consciousness is elevated out of the strict machine-like model of an ordinary computer to one that can explain more features of the human experience.
This post is intended to show just how complex the integration of consciousness into an existing, accepted theory of physics can be. Empirically, I don't believe anyone can move forward on this topic until the Pauli spin gates' actions are actually tested via neural networks in the brain. When or how this can happen given our current technology is anyone's guess - but it will likely require actual quantum computers. A first marker may well be the extent to which quantum entanglement might emerge for discrete quantum systems.
See also:
http://brane-space.blogspot.com/2010/07/numerical-testing-of-mlp-network.html
Monday, August 29, 2016
U. Of Chicago Welcome Letter Warns Incoming Frosh Against Political Correctness
Amidst
all manner of reactions on the web and social media, we learn that
the University
of Chicago has
sent new students a blunt statement clearly opposing any potential
instigations of campus political correctness. The letter, diverging
from the usual anodyne 'welcome' , has incited thousands of
passionate responses, for and against.
The
letter from John Ellison, dean of students, reads in part:
“Our
commitment to academic freedom means that we do not support so-called
trigger warnings, we do not cancel invited speakers because their
topics might prove controversial, and we do not condone the creation
of intellectual ‘safe spaces’ where individuals can retreat from
ideas and perspectives at odds with their own,”
The
preceding was a not-so-veiled rebuke to any
potential protesters from the class of 2020 who might be
tempted to howl over the speech to be
condoned on campus. Also, who should be allowed
to speak,
issues that have rocked
Yale, Wesleyan, Oberlin and
many other colleges and universities in recent years. Some alumni,
dismayed by the trend, have
withheld donations from
their alma maters.
We already saw the case of early education expert
Erika Christakis at the center of a
Halloween brouhaha at Yale last year. It began when Yale's
Intercultural Affairs Committee advised students they ought
not present themselves wearing feathered headdresses, turbans or war
paint - or modifying skin tones (to appear as a minstrel performer) .
The aim was to try to steer students into being more sensitive in
their choice of costume or apparel.
In
response, Ms. Christakis dispatched her own email wondering whether
such oversight and advice was really needed. She wrote:
"Whose
business is it to control the forms of costumes of young people to
get them to act responsibly?"
Adding:
"Free
speech and ability to tolerate offense are the hallmarks of a free
and open society".
Many
Yalies became enraged and called for Christakis and her husband
to be removed from their positions as heads of undergraduate
residence at Yale. Ms. Christakis then resigned from her teaching
position. In an early April WSJ piece, she admitted
she stepped down not only because of the email kerfuffle but also she
felt more broadly that "the
campus climate didn't allow open dialogue".
In
other words, it more or less treated staff and students as impudent
and out of control barbarians who had to be directed toward more
judicious actions and couldn't be depended on to act responsibly on
their own.
In
the end, this is basically what the Univ. of Chicago letter is all
about apart from vindicating policies that were already in
place there as well as at a number of other universities
calling for “the freedom to espouse and explore a wide range of
ideas.”
Interestingly,
last
year, a faculty Committee on Freedom of Expression, appointed
by university president, Robert R. Zimmer produced
a report stating
that: “it
is not the proper role of the university to attempt to shield
individuals from ideas and opinions they find unwelcome,
disagreeable, or even deeply offensive.”
Unfortunately, the basis of the Chicago letter appears to have been
misunderstood by many students as suggesting that slurs and racial,
sexual or other putdowns are now to be tolerated. Not so!
Only that vehemently expressed ideas are not snuffed out a priori
before speakers or writers are heard, seen.
For
example, while the opinions of a hard core atheist against Mother
Theresa, i.e. as a phony plaster "saint", might rile some
students, it is not in their purview to stop the speech.
They do have the choice to attend or not.
In
like manner, the Chicago letter makes clear Students "are
encouraged to speak, write, listen, challenge and learn without fear
of censorship." In addition, that means the school "does
not support so-called 'trigger warnings' " to alert students of
upcoming discussions or speakers that they might find offensive.
Why
should the university do that? Its role is not to be a nurse maid,
or acting therp for student piques, neuroses and sensitivities. In
this regard, the grown up makes his or her own choices and knows what
stimuli to avoid and doesn't have to be overtly protected from
speeches, ideas or controversial writings.
Thus,
The University of Chicago letter is saying it won't cancel
controversial speakers, and it "does not condone the creation of
intellectual 'safe spaces' where individuals can retreat from ideas
and perspectives at odds with their own."
Which is good! I think back to my freshman year at Loyola (1964-65) and how impoverished I'd have been if Loyola's Jesuits had been so twitchy about the French atheist Jean-Paul Sartre that they hadn't let him on campus to debate the Christian existentialist Gabriel Marcel. I also likely never would have learned about "good faith" and "bad faith" and been motivated to get Sartre's book (from the Loyola bookstore): Being and Nothingness.
Political science professor Charles Lipson, quoted on the NPR.org site, said: "I think it's an excellent thing," adding that too many campuses are shutting down discussions or speeches that some might find uncomfortable or offensive.
In the 60s, I don't recall any such barricading of ideas occurring even at Catholic Loyola. Indeed, we welcomed the parry and thrust of vehement debate, especially in dorm rooms after classes. We regarded it as part of our education, an extension if you will. This included themes such as the morality (or not) of the Vietnam War, racial relations, and relations between the sexes.
Even when speakers vocalized topics that were absolute poppycock, i.e. "mind rape" from one feminist, we came to hear her out - or not. We didn't whine to the administration at South Florida about "no trigger warnings" or "micro aggressions".
My friend, Dr. Pat Bannister - Bajan psychiatrist - would have been appalled at the very idea of enlisting such verbal jiu-jitsu to prevent speech. Like Prof. Christakis, she feared the mass regression of adults to the state of de facto children, especially with the oncoming emphasis on the visual by way of TV. Like Christakis, she believed true adults needed to be able to make their own decisions and also have the maturity to live with them, come what may.
This is the message, I believe, that is sent by the University of Chicago letter to the incoming frosh. Maybe they will take heed, but they may also use rationalizations to dismiss it, as one professor at the University of Iowa attempted. In that case, one can reckon they will be the poorer for their college experience.
Sunday, August 28, 2016
Looking At Simple 2D Ising Models
The "Ising model" first came to the fore in the study of ferromagnetic systems. It was found that the use of such simplified models paved the way for greater understanding via modeling of more complex systems. A fairly mundane example is a two-dimensional Ising model for ferromagnetic matter. It contains magnetic domains for which the individual spin magnets can be subject to sudden reversals. For a simple example, think of the 2D model of 4 x 4 elementary spin magnets as shown below:
Here the Ising model system, by virtue of undergoing spontaneous magnetization (say from a state S(1) with spin excess 0 to a state S(2) with spin excess 12), discloses an evolution to a higher degree of order at the later time t(0) + t, where t could be in billions of years or nanoseconds.
The elementary magnets may exist temporarily in the state S(1) as shown (i.e. each arrow denotes the net spin of the atom based on the sum of electron orientations within it). We then may want to find the degree of order applicable to the system, say at time t(0), and do the appropriate counting of "spin ups" and "spin downs" as shown in the left side of the model. We find on doing so (which the reader can verify) that we get 8 spin ups - 8 spin downs = 0 net spin, or in other words the system is at equilibrium.
Consider then the same system but at a later time t(0) + t, for which we behold the right side orientations of the elementary spin magnets. Here we get: 14 spin ups - 2 spin downs = 12 spin ups, or in other words the spin excess = 12. This system, call it S(2), has a much higher degree of order (less entropy) than the system S(1). (We should add here that higher entropy - as in S(1) - corresponds to the most probable state, defined by the minimal spin excess of zero.)
The degree of order, as well as information, for the simple spin system shown is determined from what is called the "spin excess", or the net spin difference (up minus down or vice versa). The larger this number, the greater the degree of order, and the lower the entropy of the system. Since S(1) has the minimum possible spin excess of 0, we can deduce it has the largest entropy.
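The counting described above is easy to automate for any small grid. A minimal sketch in Python, where the two 4 x 4 grids are illustrative stand-ins for the S(1) and S(2) states discussed above (+1 for spin up, -1 for spin down):

```python
# Spin excess = (number of up spins) - (number of down spins).
# Encoding up = +1 and down = -1 means the grid total IS the spin excess.

def spin_excess(grid):
    """Sum all spins; with +1/-1 encoding the sum equals the spin excess."""
    return sum(sum(row) for row in grid)

# S(1): 8 up, 8 down -> spin excess 0 (equilibrium, maximal entropy)
s1 = [[+1, -1, +1, -1],
      [-1, +1, -1, +1],
      [+1, -1, +1, -1],
      [-1, +1, -1, +1]]

# S(2): 14 up, 2 down -> spin excess 12 (more ordered, lower entropy)
s2 = [[+1, +1, +1, +1],
      [+1, +1, +1, +1],
      [+1, -1, +1, +1],
      [+1, +1, -1, +1]]

print(spin_excess(s1))  # 0
print(spin_excess(s2))  # 12
```

The particular arrow patterns are hypothetical; only the up/down counts matter for the spin excess.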
Accessing such simple systems allows us to infer fundamental measures applicable to the systems, for example the "magnetic moment" of a state, as well as the "degeneracy function". Consider an N = 2 model system with either 2 ups (two up arrows) or 2 downs. Then, if m denotes the magnetic moment of a single spin, we can have:
M = +2m or M = -2m
where the first is the magnetic moment for two spin-up particles, and the second for two spin-down. One can also, of course, have the mixed state inclusive of one spin up plus one spin down, in which case:
M = 0 m = 0
Meantime, the degeneracy function computes the number of states having the same value of the total magnetic moment M (equivalently, the same spin excess), such that:
g(N, m) = N!/[(½N + m)! (½N - m)!]

where 2m is the spin excess, i.e. the number of up spins minus the number of down spins.
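For small N the degeneracy function can be tabulated directly; summed over all allowed values it must give 2^N, the total number of spin configurations. A quick sketch, using the convention that 2m is the spin excess (so m runs from -N/2 to N/2 for even N):

```python
from math import factorial

def g(N, m):
    """Degeneracy: number of N-spin states with spin excess 2m,
    i.e. with (N/2 + m) up spins and (N/2 - m) down spins."""
    n_up = N // 2 + m
    return factorial(N) // (factorial(n_up) * factorial(N - n_up))

N = 4
for m in range(-N // 2, N // 2 + 1):
    print(m, g(N, m))   # -2:1, -1:4, 0:6, 1:4, 2:1

# Sanity check: the degeneracies count every configuration exactly once.
assert sum(g(N, m) for m in range(-N // 2, N // 2 + 1)) == 2 ** N
```

Note that the zero-excess value m = 0 has the largest degeneracy, which is why it is the most probable (highest entropy) state.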
The power of the Ising model, however, doesn't end with ferromagnetic systems. We can also use it to examine ice crystal configurations as has been shown in a recent paper by Andrei Okounkov (Bulletin of the American Mathematical Society, Vol. 53, No. 2, p. 187). In this paper the author presents us with the 2D ice crystal Ising model shown below:
Each white square denotes an ice crystal and the blue areas represent separating media. Certain model stipulations apply as given in the paper: 1) the total number of white squares is fixed, just as the total number of elementary magnets was in the earlier system; 2) all squares along the boundary are deemed blue in order to prevent crystals sticking to the sides of the container; and 3) it must be possible to assign probabilities to the configurations in the same way we might assign "order" or entropy to the ferromagnetic system.
The most basic probability for any such system is that of "thermal equilibrium". Thus, at some temperature T, if the system attains thermal equilibrium then the probability of any particular configuration C decays exponentially with its energy, Energy(C), which is analogous to E = ±mB in the ferromagnetic case. The probability of any particular configuration dependent on T is then:
Prob(C) = [1/Z(T)] exp(-Energy(C)/kT)
where k is Boltzmann's constant, 1.38 x 10^-23 J/K.
One will also make use of the "partition function":
Z(T) = Σ_C exp(-Energy(C)/kT)
which as Okounkov notes, really functions as a "normalization factor" given that it "makes the probabilities sum to 1". In this Ising ice crystal model, then, the energy is "the sum of interactions of all adjacent squares." Since the total number of squares is fixed (see stipulation (1)) then the energy must be proportional to the total length of the contours separating white from blue.
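Okounkov's point that Z(T) is just a normalization factor is easy to verify numerically. A toy sketch, with made-up energies for a handful of hypothetical configurations (the values are purely illustrative, not taken from the paper; energies are expressed in units of kT):

```python
from math import exp

# Energy(C)/kT for four hypothetical configurations (illustrative only;
# in the ice-crystal model Energy = 2 x total contour length).
energies_over_kT = [2.0, 4.0, 4.0, 6.0]

Z = sum(exp(-e) for e in energies_over_kT)        # partition function Z(T)
probs = [exp(-e) / Z for e in energies_over_kT]   # Prob(C) = exp(-E/kT)/Z

# Lower-energy (shorter-contour) configurations are more probable,
# and dividing by Z makes the probabilities sum to 1.
assert probs[0] == max(probs)
assert abs(sum(probs) - 1.0) < 1e-12
print(probs)
```

The same normalization works for any finite list of configuration energies; only the relative energies matter for the resulting probabilities.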
To identify the contours is easy. If the energetic reader will run off a copy of the image of the 2D rectangle, then take a black magic marker and trace around each ice crystal region as it appears, he will have generated the contours. The normalization for energy is then (op. cit.):
Energy = 2 x Length of contours
As in the case of the ferromagnetic system, entropy competes with order (energy). In the Ising ice crystal model, energy is saved via clumping. If we designate an "order parameter" β = 1/(kT), then in the ice crystal case the larger β, the stronger the tendency toward order. Interestingly, as Okounkov notes, there is a critical temperature Tc > 0 above which entropy wins; the corresponding critical value is:
β_c = 0.5 ln(√2 + 1)
Below Tc and for ice crystal concentrations above a certain threshold a crystal will form as the size of the container goes to infinity.
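Numerically, the critical value quoted (which matches the standard Onsager critical point of the 2D Ising model) works out to about 0.44:

```python
from math import log, sqrt

# Critical inverse temperature of the 2D Ising model (Onsager):
# beta_c = (1/2) ln(sqrt(2) + 1)
beta_c = 0.5 * log(sqrt(2) + 1)
print(beta_c)  # ~0.4407; for beta above this, order wins, below it entropy wins
```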
From this brief foray we can see that the Ising model shows the great generality of physics, in being applicable to vastly dissimilar physical entities.
Saturday, August 27, 2016
One Of Top Ten Books On The JFK Assassination - From Former British Intelligence Officer
What I found most compelling was Col. Hughes-Wilson's devastating exposure (Chapter 29) of the Warren Commission Report as a blatant, outright fraud perpetrated on the American people more than 50 years ago. We are informed, for example, that Earl Warren (at 72) wanted no part of this "presidential blue ribbon" dog and pony show. Warren believed a Justice Department investigation was the proper approach. But then Robert Kennedy was the Attorney General, and LBJ and cohort couldn't afford to have Bobby snooping around their nefarious business contacts, leaving no stone unturned.
Twice Warren refused LBJ's entreaties until finally LBJ threatened the Chief Justice with exposure of his secret Hoover files disclosing "a little incident in Mexico City". As he related the story next day to his pal Richard B. Russell, LBJ couldn't help rubbing it in, as he chortled at how the elder man "cried like a baby" and said "he would do anything, just ask".
At that point, LBJ had his extorted name cachet for his Potemkin commission. All he needed were Hoover's and Dulles cooperation to keep the fraud rolling, including deciding which witnesses to ignore and which to call, what evidence to ignore and what to inflate. In the latter, one must include the phony autopsy images.
Col. Hughes-Wilson astutely brings up Hoover's remark about the "real assassin" (p. 271), which effectively betrays his and LBJ's intentions. Hoover writes of "being so concerned about Oswald having something issued so we can convince the public that Oswald is the REAL assassin".
But, as Hughes-Wilson logically observes, Hoover "unwittingly gave the game away". Why indeed would one need to even interject the adjective real - unless it was actually the case Oswald WASN'T the real assassin? (He was in fact a CIA-confected "legend".) This would also explain why Parkland surgeon Charles Crenshaw grew increasingly concerned at LBJ's effort to try to get him to wring a "deathbed confession" from Oswald even as he worked on him. See the sections in Crenshaw's book at the link below:
http://www.krusch.com/books/kennedy/Conspiracy_Of_Silence.pdf
Col. Hughes-Wilson asserts (p. 256) that Crenshaw clearly had contradicted the findings of the Warren Commission, confirming that Kennedy had been shot twice from the front. Also, that Crenshaw confirmed LBJ telephoned the hospital while Dr. Crenshaw was working on Oswald and demanded a deathbed confession. Well, of course! Since as Hoover intimated they needed to show Lee Oswald was the REAL assassin! I.e. to protect the real assassins still alive and likely in CIA-NSA sponsored safe houses!
Hughes-Wilson is no less sparing in his devastating criticism (pp. 262-63) of how the coffins were switched. Thus, the body "that arrived at Bethesda was in a different coffin from the one that left Dallas". Then we learn that during the flight to D.C. a "senior military aide" had radioed in advance for an ambulance to meet them to take the body to Walter Reed Hospital. But the presidential party was greeted by two ambulances: a grey Navy ambulance accompanied Jackie and the bronze casket to the front door of Bethesda while a "black, civilian hearse-ambulance" had unloaded the actual casket containing the dead body of JFK "20 minutes earlier". All this was planned, of course, because it had to be to facilitate the manipulation of the head wounds, which Charles Crenshaw knew had occurred once he was confronted with the phony Bethesda autopsy photos by Gary Shaw of the Sixth Floor Museum.
As I noted in my Aug. 18th post on the NOVA JFK brainwash special, Dr. Crenshaw was incredulous, immediately spotting the fake and ascertaining that the head had been "manipulated". It was immediately evident to him that a conspiracy was at work, else why alter the massive frontal head wound to a rear one? As to why Dr. Crenshaw kept silent for so many years, he makes clear in his book (p. 6):
"We doctors who had worked on President Kennedy, whether out of respect or out of fear, had agreed not to publish what we had seen, heard, and felt. It was as if we were above that, as if what we knew was sacred, as if to come forward with our account would in some way desecrate our profession. To a degree, I think we were afraid of criticism."
Adding later:
"I was as afraid of the men in suits as I was of the men who had assassinated the President... I reasoned that anyone who would go so far as to kill a President of the United States wouldn't hesitate to kill a doctor."
As well he should have been given the mathematical-statistical evidence that's been provided by Richard Charnin, e.g.
http://richardcharnin.wordpress.com/2013/10/14/jfk-witness-deaths-graphical-proof-of-a-conspiracy
Including in his book, Reclaiming Science- The JFK Conspiracy.
While the heads of some slower-grasping readers may "pop" at Hughes-Wilson's dual ambulances (one bogus, for show), it is eminently plausible, as much as using Lee Oswald as a decoy. This is apart from the fact that witnesses at Bethesda came forward to Gary Shaw, as noted by Dr. Crenshaw (pp. 11-12).
Then there is Hughes-Wilson's startling disclosure (to some) that more than one type of Mannlicher-Carcano rifle exists, as revealed on p. 187. We learn, for example, that the weapon supposedly used by Oswald was "40 inches long, weighed 8 pounds and purchased from Klein's Sporting Goods in Chicago". But the Mannlicher-Carcano actually advertised in the magazine (from which Oswald allegedly ordered) was "a different model completely, a 36-inch, 5.5 pound carbine model". The latter, in a more modern cast, was also likely similar to the version used by Luke Haag and son to perform the ballistics tests on the NOVA PBS special. As I said, there is no way that they could have used the actual weapon housed in the National Archives and hit the side of a barn.
Even Hughes-Wilson is suspicious of the one attributed to Oswald, noting (p. 188): "that Mannlicher-Carcano is supposedly the only one fired that day." (Referring to the clunky 8 lb. version 40 inches long, which Patricia Dumais wasn't allowed to touch at the National Archives.)
Hughes-Wilson then concludes from his own analysis (pp. 188-89) a total of TEN shots fired on Nov. 22, 1963 - contrary to the Warren fiction of three. These are:
Shot 1: Struck sparks behind JFK's limo, observed by Royce Skelton and Austin Miller
Shot 2: Struck curb on the north side of Elm and a fragment struck bystander James Tague
Shot 3: Struck a manhole cover and embedded in the grass; DPD officers were later photographed removing it. It was never assayed in official documents and supports Carl Oglesby's argument that the Dallas police were conscripted to remove evidence.
Shot 4: Frontal shot struck JFK in the throat, and he's visible clutching his throat
Shot 5: Struck Kennedy in the upper back (in line with fifth thoracic vertebra) and came from a "relatively flat trajectory"
Shot 6: Misfire but struck limo - causing indentation in limo's windscreen
Shot 7: Bullet punched hole through Stemmons Freeway sign.("The bullet had punched through from the direction of the grassy knoll and blown the rim backwards")
Shots 8, 9: Two bullets struck Gov. John Connally - a shot from the back that smashed into his chest, and a shot that shattered his wrist.
Shot 10: The fatal head shot from direction of grassy knoll.
Col. John Hughes-Wilson, a trained intel operative like my Swiss friend Rolf, never bought the fiction that one magic bullet created all 7 wounds in JFK and Connally and emerged pristine. Both, again, could easily spot the hands of the spook propaganda teams. Maybe it takes a spook to see the handiwork of spooks.
Hughes-Wilson also concurs with former Justice Dept. agent Walt Brown ('Treachery in Dallas', chapter 'Blue Death') and author James Douglass ('JFK and the Unspeakable') that the original plan was to knock off Oswald at the Texas Theater so there was no chance of his ever getting his say at a public trial. Douglass (p. 292) makes it known that the Dallas cops approached Oswald (in his seat) "almost as if they were provoking the suspected police killer to break away from his seat ... which would have given Tippit's enraged fellow officers an excuse to kill him".
But Oswald did no such thing. Nor did he attempt to fire any shots. Lee clearly and obviously knew by now he'd been set up as the patsy and the last thing he wanted to do was make these Dallas cops his judge, jury and executioners. No, he wanted to have his trial and his say, and how and why he'd been set up. So, rather than mindlessly react he expressly said: "I am not resisting arrest! Police brutality!" He never said "It's all over now" - those words were put into his mouth by the WC's cavalcade of faux witness puppets and liars.
Hughes-Wilson cites his own source (p. 176) who overheard two Dallas cops talking about how Oswald was to have been killed before he ever arrived at the station. One, in a snarling voice, said to the other (ibid.): "You were supposed to kill Lee ... you stupid son of a bitch, then you go and kill a cop" - referring to the shooting of officer J.D. Tippit.
So there were snafus along the way; even the best planned conspiracies can go awry, but in the JFK case the architects ensured there was always a backup plan. In this case, to recruit former Chicago mobster Jack Rubinstein, aka Ruby, to snuff Oswald. Mark North, using actual, released FBI files, documents many of Ruby’s Mob connections in his book, Act of Treason - including his reported “gangster connections in Dallas”, especially to Joseph Civello, the Mafia boss in Dallas. The same files disclose that Ruby, on October 26, 1963, “placed a 12 minute person to person call to Irwin S. Weiner at Weiner’s Chicago home”.
It is further noted that Weiner was:
“a prominent Chicago Mafia associate” and “instrumental in coordinating the flow of cash between the Teamsters and Las Vegas casinos." (North, op. cit., pp. 333-34).
All of this Col. Hughes-Wilson agrees with.
Note that a mistake (oversight?) made by some reviewers is the claim that Col. Hughes-Wilson's book is simply a "regurgitation" of old facts. It is not, and this merely discloses that the complainants didn't read it carefully enough. Hughes-Wilson notes that the then Dallas PD, as part of a national security pact with elements of the CIA and NSA, orchestrated the strategy for compromising Kennedy. Hughes-Wilson cites the background details from a source (p. 142): "In the first week of November, three Corsican gunmen slipped across the Mexican border using Italian passports. They were ensconced in a CIA house by Dallas policeman Roscoe White, acting for the CIA".
The point is, if elements of the then Dallas PD were charged with hiding the JFK mechanics, they wouldn't also set up blockades after the events to apprehend them. This distrust of the then Dallas cops is also a theme that runs through Carl Oglesby's book ('JFK - The Facts and the Theories', p. 94). Oglesby points out that there was NO chain of credible evidence connecting Oswald to the casings or the rifle found at the Book Depository. Neither proves Oswald fired from there - or fired at all - a point with which Col. Hughes-Wilson concurs. Further, Oglesby himself suspects they were planted, compliments of the Dallas PD - a point that conforms with Walt Brown's take in his section 'Blue Death'. E.g. p. 125 (Brown):
"No crime scene involved in the assassination was ever truly sealed. They (Dallas police) rushed to the grassy knoll, stayed only long enough to take a cursory look and sniff gunpowder, to which they attached no significance. ...There was a pervasive pattern of not taking names and addresses as if they did not want to know. Certain witnesses were totally ignored. No impedimenta were placed in the way of potential fleeing assassins."
There are multiple points of consistency between other long-time researchers (e.g. James Douglass, Walt Brown, Carl Oglesby) and Col. Hughes-Wilson's analysis. Those who pass this work up on the basis of its being 'recycled fluff' merely do themselves a great disservice - especially as it comes from a person with a bona fide intelligence background.
Lastly, if you really want to know why so much noise and endless ambiguity keeps being generated in many forums where the assassination is discussed, go directly to Hughes-Wilson's Appendix, 'CIA Instruction to Media Assets, 1 April 1967', headed 'Concerning Criticism of the Warren Report', and CIA doc. 1035-960. You will likely get more information than you want on why your paranoid fears concerning JFK media whitewashing (including using books like Gerald Posner's and Vince Bugliosi's) are very well founded. Especially as the CIA has itself driven the harassment and obfuscation, particularly with "Operation Mockingbird".
Transcending The Hype Over The Proxima B Planet
Artist's conception of the view from the new planet circling the star Proxima Centauri
It appears every time a new exoplanet is discovered, the hype of astronomers (and some physicists) approaches that of the media commentariat. Three days ago on CBS Early Show, for example, physicist Michio Kaku was heard to blab that "the holy grail of astronomy is to find an earth like planet that supports life." Not really. The true "holy grail", if such a term can be applied to astronomy, would be to forge a consistent theory of stellar evolution by which each stage of a star's life cycle can be matched to an exact sequence of nuclear fusion reactions and reaction cross sections. But, of course, this would likely be too complex for wide consumption.
Enter now a newly discovered exoplanet orbiting the star Proxima Centauri in the triple-star Alpha Centauri system, roughly 4.3 light years distant. Its discovery began with an observing program called Pale Red Dot, in early 2016, under the auspices of the European Southern Observatory (ESO). (Not all exoplanet finds have been via the Kepler space observatory.) In the case of Pale Red Dot, the goal was, specifically, to find a planet for this star. This sounds somewhat like putting "the cart before the horse" and in many ways it was. But the ESO team had good reason: namely, why shouldn't Earth's nearest neighbor star have a planet or even a planetary system?
Now ESO and several other institutions have released statements on their new discovery that yes, there is a new planet. Even better it's only slightly more massive than Earth. Yes, it’s in Proxima Centauri’s habitable zone, meaning there’s a potential for liquid water to exist on its surface.
The journal Nature is due to publish a paper describing the new planet - which is called Proxima b - on August 25, 2016. Guillem Anglada-Escudé, an astrophysicist at Queen Mary University of London, said:
"The long-sought world … orbits its cool red parent star every 11 days and has a temperature suitable for liquid water to exist on its surface. This rocky world is a little more massive than the Earth and is the closest exoplanet to us — and it may also be the closest possible abode for life outside the solar system."
Of the three stars in the system, Proxima - a small red dwarf - is the closest to us. Using Kepler's harmonic law, or 3rd law of planetary motion, e.g.
http://brane-space.blogspot.com/2011/08/solution-to-simple-astronomy-problems-6.html
we can compute that the distance of Proxima b from its parent star is just over 4.6 million miles, or about a factor of twenty less than the Earth-Sun distance. But bear in mind the red dwarf sun is a much cooler star. This means Proxima b is plausibly in the not too hot, not too cold "Goldilocks zone". If that is so, it is possible the planet also has an atmosphere.
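That distance can be sketched with Newton's form of Kepler's third law, a³ = GMP²/4π². A back-of-envelope check in Python, assuming a stellar mass of about 0.12 solar masses for Proxima Centauri (a typical literature value for this red dwarf, not given in the post):

```python
from math import pi

G = 6.674e-11            # Newtonian gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30         # solar mass, kg
M_star = 0.12 * M_sun    # assumed mass of Proxima Centauri (red dwarf)
P = 11.2 * 86400.0       # orbital period of Proxima b, seconds

# Kepler's third law (Newtonian form): a^3 = G * M * P^2 / (4 * pi^2)
a = (G * M_star * P**2 / (4 * pi**2)) ** (1.0 / 3.0)

print(a / 1000.0)          # semi-major axis in km: roughly 7 million
print(a / 1609.0 / 1e6)    # in millions of miles: roughly 4.5
```

The result lands close to the "about 4 million miles (7 million km)" quoted below, with the spread reflecting the uncertainty in the assumed stellar mass.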
The radial velocity (RV) plot shown above shows the variations in motion of Proxima Centauri over the first part of 2016. The negative values denote approaching velocities in km/hr and the positive values denote receding velocities, i.e. when the component is moving away from Earth. Thus, at some times it is approaching at ~ 5 km/hr and at other times receding at the same speed. As Anglada Escude explained:
"At times Proxima Centauri is approaching Earth at about 3 miles (5 km) per hour — normal human walking pace — and at times receding at the same speed. This regular pattern of changing radial velocities repeats with a period of 11.2 days. Careful analysis of the resulting tiny Doppler shifts showed that they indicated the presence of a planet with a mass at least 1.3 times that of the Earth, orbiting about 4 million miles (7 million km) from Proxima Centauri — only 5% of the Earth-sun distance."
The greater mass of the planet, of course, means a higher g-value, i.e. for gravitational acceleration. So any Earthman arriving there would weigh more than on Earth. Thus:
g(Proxima b) = GM/r^2

where G is the Newtonian gravitational constant, M is then 1.3 x Earth's mass (6.0 x 10^24 kg), and r is the radius of the planet.
And the weight would be m g(Proxima b) where m is the Earther's mass in kilograms. (Note: Mass is an inherent property of matter, denoting the number of particles possessed, and hence doesn't change assuming a non-relativistic environment)
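Since only the planet's minimum mass is known, the radius r has to be assumed; if Proxima b has Earth-like density, its radius scales as the cube root of its mass. A rough sketch under that assumption (the radius value is hypothetical, not a measured quantity):

```python
G = 6.674e-11          # Newtonian gravitational constant, m^3 kg^-1 s^-2
M_earth = 6.0e24       # Earth's mass in kg (round value used above)
R_earth = 6.371e6      # Earth's mean radius, m

M = 1.3 * M_earth                   # Proxima b's (minimum) mass
R = R_earth * 1.3 ** (1.0 / 3.0)    # assumed radius for Earth-like density

g = G * M / R**2                    # surface gravitational acceleration
g_earth = G * M_earth / R_earth**2

print(g, g / g_earth)  # ~10.8 m/s^2, about 9% stronger than Earth's gravity
```

So a 70 kg Earther would weigh proportionally more there, though a larger or smaller assumed radius would shift the figure noticeably.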
According to investigator Anglada-Escudé:
"The first hints of a possible planet were spotted back in 2013, but the detection was not convincing. Since then we have worked hard to get further observations off the ground with help from ESO and others. The recent Pale Red Dot campaign has been about two years in the planning."
When combined with earlier observations then, the Pale Red Dot data revealed the new planet, Proxima b.
Obviously, further investigations are still needed to confirm Proxima b's other apparent peculiarities, including that the planet is likely tidally locked - rotating synchronously so that one side always faces its star while the other is darker and colder. Add the fact that it is also bombarded with ultraviolet light and X-rays and one can understand why it may be best at this point to contain the hype.
Friday, August 26, 2016
Internet Hate Culture - Why So Many "Comments" Sections Have Been Ditched
After writing his recent article for TIME ('Tyranny of the Mob', Aug. 29, p. 27) on the culture of internet hate gone wild, Joel Stein informed the chief editor he was "going off Twitter" and also "hopes there will be no comments in the online edition", i.e. for the net trolls to come after him. In his 4-page piece he cites several examples of being net-stalked and receiving hate missives, including from losers like Andrew Auernheimer, who barked in one tweet: "You people (meaning Jews) belong in a fucking oven". Such is what the web, with all its initial promise, has descended into: a huge dog pit, with snarls, howls, bites and free-floating excrement.
But what about the comments sections that typically accompany the major media sources, and which are usually allocated for reader inputs or exchanges? In respect of these, Alicia Shepard recently wrote on smirkingchimp.com:
"NPR is joining a growing list of media organizations that have said “finito” to comments including, ‘This American Life,’ Reuters, Recode, Mic, the Chicago Sun-Times, Popular Science, CNN, The Toronto Star and The Week."
This should not be surprising, as we read in the recent TIME article of the rise of hate on the Internet in media and sites as diverse as Reddit, Twitter and even Facebook. Sadly, hate speech and vitriol have become the new coin of the realm thanks to mentally deranged trolls who get off on pissing in public - and crapping too.
In the lengthy TIME article, Joel Stein gives a number of other examples (including the attacks on black actress Leslie Jones) that show how far down into the toilet our net culture has deteriorated. It wasn't always thus. I can recall from as early as 1994 having very heated discussions on topics in the old AOL forums. But though these debates (especially on the atheist boards) were animated, they never descended into the sort of invective with racial and other hate so abundant today.
The reach of hate culture was eventually bound to extend to the comments section of the big corporate news media. By 2004, when comments sections were initiated on news sites like The Miami Herald, Milwaukee Journal-Sentinel, Baltimore Sun and WaPo, they were hailed as a means to "democratize the media". The original template also allowed a two-way conversation between readers and the journalists who served them. But that didn't last long, and the journalists themselves soon found out they were unable to enter into debates without being tracked and excoriated by assorted nuts with chips on their shoulders.
What one also finds, as I have in the comments sections of The Financial Times, is that readers are often talking to and past each other because most of the resident journalists don't engage. Alicia Shepherd noted there's a reason. She refers to how Chris Cillizza enthusiastically embraced his audience when he started his political blog, The Fix, at The Washington Post in 2006. According to Cillizza:
“I would regularly go into the comments to interact (or try to interact) with readers. I incentivized and deputized regular commenters to keep order. Then I gave up. Because none of the tactics or strategies we tried ever had any real impact on the quality of the dialogue happening on The Fix. No matter what the original post was about, a handful of the loudest — or most committed — voices in the room hijacked the comments thread to push their own agendas.”
Which is much the same take that Joel Stein had in his TIME article, noting that trolls ("most likely to be sociopaths with Asperger's") have now taken over most discussion venues and filled them with their hot air and venom. Stein also cites studies by psychologists who refer to the "disinhibition effect" by which contributing factors such as anonymity, invisibility and not communicating face to face in real time, encourage brash behavior. Stein insists this has led to "the stripping away of mores that society spent millennia building".
The point is that the journalists, as well as many of their major media, have finally tired of sponsoring enclaves for unhinged simpletons to vent and hate, and of having to constantly monitor and censor. Hence, removing comments sections. (This blog also has a comments section, but it is rigorously moderated. I welcome challenges to any arguments or posts, but they must have a logical and coherent basis in making specific points. They can't be just shooting from the hip in the equivalent of a verbal 'drive-by'.)
Shepherd herself, who served as NPR ombudsman from 2007 to 2011, knows firsthand "how futile and frustrating comments sections are." She further points out that even though NPR had a sign-up system, and hired an outside moderator to check comments before posting, a listener could still create an alias and write whatever he (usually men) liked. The comments "were often mean-spirited and did little to foster civil conversation."
Shepherd, in a 2011 essay on comment sections for the Nieman Reports, wrote: “The goal is dialogue, but it’s pretty clear that the debate between dialogue and diatribe is still being waged. From the view I’ve had for the last three years as NPR’s ombudsman I’d say diatribe is winning—hands down.” It’s still true today.
Why is this? Why does diatribe trump dialogue? My own theory is that too few netizens know how to conduct a civil dialogue. They are affected by the "disinhibition" effect and also lack the patience to develop the rigorous and consistent arguments that build strong dialogue. It's much easier just to deliver drive-by shots with little or no information for support. But this isn't just in comments sections; it abounds in other places on the net. It's also one reason I stopped frequenting the Deja News online discussion groups, as on the JFK assassination. There was too much noise, not enough signal.
Shepherd writes (ibid.):
"The trolls who rule the comment seas may actually have won because they often scare away people with their vicious attacks. An infinitesimal number of NPR’s 25 to 35 million unique monthly users bothered to join story-page conversations."
Moreover: “Far less than one percent of that audience is commenting, and the number of regular comment participants is even smaller,” wrote Scott Montgomery, NPR’s managing editor for digital news, in announcing the shutdown. He adds:
“Only 2,600 people have posted at least one comment in each of the last three months –– 0.003 percent of the 79.8 million NPR.org users who visited the site during that period.”
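Montgomery's quoted figure holds up to a quick back-of-envelope check on the numbers he cites:

```python
# Sanity check on the NPR figures quoted above:
# 2,600 regular commenters out of 79.8 million NPR.org users.
commenters = 2_600
users = 79.8e6

share = commenters / users * 100   # commenter share, as a percentage
print(f"{share:.4f}%")             # roughly 0.003 percent, as stated
```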
NPR’s current ombudsman, Elizabeth Jensen, noted that maintaining NPR’s commenting system grows more expensive as the number of comments increases, sometimes costing twice what was budgeted. So basically, NPR decided it’s not worth the money to engage only a sliver of its audience. While Jensen notes cost is certainly a critical factor for any media company, the more valid question remains: what is the value of commenting unless it’s tightly moderated and journalists engage? Well, on the evidence, not much!
That is a question more and more websites and blogs will have to address in coming years, given that troll culture and its dregs are only likely to grow, like a cancer. You can maybe treat and eliminate a biological cancer with surgery or radiation, but I don't know exactly what the corresponding treatment would be to tame the hate-troll social cancer metastasizing across the net.
See also:
http://brane-space.blogspot.com/2015/10/ditching-online-discussion-groups.html