Friday, October 31, 2014

A Scary Fact: Earth On Track For The Hottest Year On Record

Receding glacier seen in Switzerland on our trip there in September.

Perhaps the scariest story reported so far this year is that Earth is on track for 2014 to break the record for the hottest year ever recorded. The National Oceanic and Atmospheric Administration reported on Monday, Oct. 20, that the previous month averaged 60.3 F - the hottest September in 135 years of record keeping. More sobering yet, May, June and August also set records.

Meanwhile, the first nine months of 2014 have registered a global average temperature of 58.72 F - tying 1998 for the warmest first nine months on record, according to NOAA's National Climatic Data Center. According to NOAA climate scientist Jessica Blunden, quoted in the Denver Post (Oct. 21, p. 14A, 'Earth Headed for Hottest Year Yet'):

"It's pretty likely 2014 will break the record for the hottest year."

This may well come as a major surprise to those backward folks still spreading the misbegotten trope that warming has "halted since 1998". I refer to those like a character named Theo Vermaelen, who recently wrote a letter to The Financial Times (Sept. 12, p. 10) commenting on a graph in an earlier (Sept. 10) article by Pilita Clark:

"It would have been enlightening to also plot a graph of average temperatures since 1984 so that all can see that since 1998 there has been no meaningful global warming.".

Actually, Theo, you're just not reading the graphs correctly, as I also advised columnist George Will in a previous post, e.g.:
http://brane-space.blogspot.com/2013/06/george-will-no-warming-for-last-16.html

As noted therein, the error made by Will (and likely Mr. Vermaelen as well) is a serious misreading of the graphs published in a Nature paper back in 2008 by Noel Keenlyside et al. The misreading was rendered more plausible by the authors' tentative claim of "monotonic global cooling" since ca. 1998. That claim 'jumped the shark', became embedded in the warming skeptics' arsenal of disinfo, and set real global warming science education back at least a decade in my estimation.

The problem? Overlooking that for the key graph each data point really represented a ten-year centered mean. That is, each point represented the average temperature of the decade starting 5 years before that point and ending 5 years after it. Thus, the statistics for potential "cooling" could not possibly have been justifiably extrapolated beyond 1998 + 5 = 2003 - much less to Will's "16 years of no warming" from 1998.

In addition, the mystery could have resolved itself had Will - and his equally unread mate, Vermaelen - spotted the red line in the graphic of the Nature publication and seen that it was the actual global temperature data from the U.K.'s Hadley Centre for Climate Prediction and Research. They then could have asked themselves: "Why does the red line stop in 1998 and not 2007?" Again, it's a running 10-year centered mean, and the authors use Hadley data that ends around 2003. In effect, they can't compute a ten-year centered mean for any point after 1998.
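
To see concretely how a ten-year centered mean truncates at the end of a record, here is a minimal Python sketch using a made-up annual anomaly series (purely illustrative, not the actual Hadley data) that ends in 2003:

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2004)                                   # pretend the record ends in 2003
anoms = 0.02*(years - 1980) + rng.normal(0, 0.1, years.size)    # fake anomalies, deg C

half = 5                                                        # five years on either side of the center
centers, means = [], []
for i in range(half, years.size - half):
    centers.append(years[i])
    means.append(anoms[i - half:i + half + 1].mean())

print(centers[-1])   # 1998 - the last year for which a centered mean can even be formed

Any point plotted after 1998 would need data from beyond the end of the record, which is exactly why the smoothed curve in the Nature figure stops where it does.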

Lazy deniers like Will and Vermaelen, however, have parlayed their own perceptual deficiency and this simple statistical peculiarity of the data into believing that global warming factually STOPPED in 1998! But the stunning fact is that the evidence all around belies this canard! Hell, one only needs to remove the blinkers and behold the scores of receding, melting glaciers - images and videos of which were made available not long ago in this PBS program, see:

http://video.pbs.org/video/1108763899


Not to mention the scenes we saw in Switzerland, including the once proud Eiger gletscher. Can these knuckleheads actually deny what they're seeing? How do they think the recession of those glaciers occurred since 1998? Fairy dust? Sprinkled by invisible sprites?

Climate scientist Blunden, in response to these dimwits who insist the world has not warmed since 1998, has this rejoinder:

"Well no one has told the planet that!"

She added (ibid.) that NOAA records show no pause in warming, period. But of course, this information issues from scientific sources, not from 'Fox n' Friends'! Yes, 1998 broke the record for hottest year, but so have 2005 and 2010, and likely 2014 as well.

If any excuse might be made for the likes of Will and Vermaelen it's that they don't regularly read Nature, so they might have missed the clarification letter Dr. Noel Keenlyside subsequently provided to the publication. As he explained, the authors were predicting no increase in the average temperature of the "next decade" (2005 to 2015, relative to their data timeline) over the "previous decade", which, for them, is 2000 to 2010. And that is, in fact, precisely what their figures show - that the 10-year mean global temperature centered around 2010 is roughly the same as the mean centered around 2005. And both years are relatively HOT.

I guess the moral of the story is that people who pen newspaper columns like Will, or write letters to editors like Vermaelen, ought to first ensure they aren't adding to the already large cache of agnotology.

As we know, and as I've stated before, agnotology (derived from the Greek 'agnosis') is the study of culturally constructed ignorance. That ignorance is achieved primarily by sowing the teeniest nugget of doubt in whatever claim is made (and as we know, NO scientific theory is free of uncertainty).

For those who want the real lowdown, without the bull crap, you can read the American Geophysical Union policy position statement on global warming here:

http://news.agu.org/press-release/american-geophysical-union-releases-revised-position-statement-on-climate-change/

See also:
http://www.smirkingchimp.com/thread/thom-hartmann/59320/green-world-rising-a-call-to-save-the-earth-from-climate-change

Selected Solutions to Nuclear Physics Problems (Part 3)

1) Calculate the wavelength of the gamma ray photon (in nm) which would be needed to balance the endothermic part of the triple-alpha fusion equation. (Recall here that 1 eV = 1.6 x 10^-19 J)

Solution:

From the information given, we have:  E(γ) = hc/λ = 6.7 keV

So: 

6.7 keV = (6.62 x 10^-34 J·s)(3 x 10^8 m/s)/λ

Therefore:  λ = (6.62 x 10^-34 J·s)(3 x 10^8 m/s)/(6.7 keV)

Converting to consistent energy units, using 1 eV = 1.6 x 10^-19 J, we have 6.7 keV = 1.07 x 10^-15 J, so:

λ = (1.99 x 10^-25 J·m)/(1.07 x 10^-15 J) ≈ 1.85 x 10^-10 m = 0.185 nm
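
As a quick numerical cross-check of that arithmetic (a throwaway Python sketch, not part of the original solution):

h, c = 6.62e-34, 3.0e8            # Planck constant (J·s) and speed of light (m/s)
E = 6.7e3 * 1.6e-19               # 6.7 keV expressed in joules, ~1.07 x 10^-15 J
lam = h*c/E                       # wavelength in meters
print(lam, lam*1e9)               # ~1.85e-10 m, i.e. ~0.185 nm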

2) Verify the second part of the triple-alpha fusion reaction, especially the Q-value. Account for any differences in energy released by reference to the gamma ray photon coming off and specifically, give the wavelength of this photon required to validate the Q.

Solution: The 2nd part of the triple alpha fusion reaction is:
 
⁸Be + ⁴He → ¹²C + γ + 7.4 MeV
 
Q = [(8.00531 u + 4.00260 u) - 12.0000 u] c²

Q = [12.00791 u - 12.0000 u] c² = 0.00791 u (931.5 MeV/u) ≈ 7.4 MeV

The role (and value in energy) of the gamma ray photon can be obtained by instead using the value for carbon of 12.011 u and following the procedure shown in (1).
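
Again, a quick Python check of the Q-value, using the masses quoted above:

u_to_MeV = 931.5                                  # energy equivalent of 1 u
m_Be8, m_He4, m_C12 = 8.00531, 4.00260, 12.0000   # atomic masses in u
Q = (m_Be8 + m_He4 - m_C12) * u_to_MeV
print(Q)                                          # ~7.4 MeV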
 
3) The luminosity or power of the Sun is measured to be L = 3.9 x 10^26 watts. Use this to estimate the mass (in kilograms) of the Sun that is converted into energy every second. State any assumptions made and reasoning.

Solution: 
 
The luminosity is the same as the power or energy generated per unit time, thus:
 
L = E/t = 3.9 x 10^26 J/s

The energy delivered per second then is:

3.9 x 10^26 J = E = mc², so the mass converted to energy each second is:

m = E/c² = (3.9 x 10^26 J)/(3 x 10^8 m/s)²

= 4.3 x 10^9 kg

We have to assume the luminosity represents the actual macroscopic mass converted into energy and is a faithful reflection of all the fusion reactions underlying the conversion.
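
The same one-line computation in Python, with a per-year figure thrown in as an extra illustration (not part of the original problem):

L_sun = 3.9e26                   # solar luminosity in watts (J/s)
c = 3.0e8                        # speed of light, m/s
m_per_s = L_sun / c**2           # ~4.3e9 kg of mass converted to energy every second
m_per_yr = m_per_s * 3.156e7     # ~1.4e17 kg over a year
print(m_per_s, m_per_yr)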

4) In a diffusion cloud chamber experiment, it is found that alpha particles issuing from decay of U-238 ionize the gas inside the chamber such that 5 x 10^3 ion pairs are produced per millimeter and on average each alpha particle traverses 25 mm. Estimate the energy associated with each detected vapor trail in the chamber if each ion pair generates 5.2 x 10^-18 J:

Solution:

Let d be the avg. length of each vapor trail, so:  d = 25 mm.

Each ion pair accounts for 5.2 x 10^-18 J, and 5 x 10^3 ion pairs are produced per millimeter of track, so the total energy per trail is:

E = (5 x 10^3 pairs/mm)(25 mm)(5.2 x 10^-18 J/pair) = 6.5 x 10^-13 J
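
A quick check of that product, along with a conversion to MeV (note that roughly 4 MeV is the right ballpark for an alpha particle from U-238 decay):

pairs_per_mm = 5e3               # ion pairs produced per millimeter of track
track_mm = 25                    # average trail length in mm
e_per_pair = 5.2e-18             # energy per ion pair, in joules
E = pairs_per_mm * track_mm * e_per_pair   # total energy per trail
print(E, E/1.6e-13)              # ~6.5e-13 J, i.e. ~4 MeV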

 
5) When ¹¹⁸Sn (Z = 50) is bombarded with a proton the main fission fragments are:

²⁴Na (Z = 11) and ⁹⁴Zr (Z = 40)
 
The excitation energy necessary for passage over the potential barrier is:

ε ≥ 3ke²Z²/5

where the right hand side denotes the height of the Coulomb barrier.

a) What must this value be?
(Take ke = 1.44 MeV/fm)

Solution:
We have Z = 50, so that Z² = 2500, then:

ε = 3(1.44 MeV/fm)(2500)/5 = 2160 MeV
 
b) Find the energy difference between the reactants and the products. (Take c² = 931.5 MeV/u)
 
Solution: The fission reaction can be written:

¹¹⁸Sn + ¹H → ²⁴Na + ⁹⁴Zr
 
Therefore (Using atomic masses given in Wikipedia):

Q = [(117.901606 u + 1.007825 u) - 24.99096 u - 93.907303 u] (931.5 MeV/u)

= [118.909431 u - 118.898263 u] (931.5 MeV/u)

= [0.011168 u] (931.5 MeV/u) = 10.4 MeV
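
The same bookkeeping in a few lines of Python, with the masses as quoted above:

u_to_MeV = 931.5
m_Sn118, m_H1 = 117.901606, 1.007825        # reactant masses in u
m_Na24, m_Zr94 = 24.99096, 93.907303        # product masses in u (as quoted above)
Q = (m_Sn118 + m_H1 - m_Na24 - m_Zr94) * u_to_MeV
print(Q)                                    # ~10.4 MeV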

6) a) Show that  u_I = A cos(ar) + B sin(ar)

is a solution of the reduced Schrödinger equation for the deuteron:

d²u_I/dr² + a²u_I = 0

Solution: We are given:

u_I = A cos(ar) + B sin(ar)

Then:  du_I/dr = -aA sin(ar) + aB cos(ar)

d²u_I/dr² = -a²A cos(ar) - a²B sin(ar) =

-a²[A cos(ar) + B sin(ar)]

Or:  d²u_I/dr² = -a²u_I

Transposing:

d²u_I/dr² + a²u_I = 0

Since R = u/r, the cosine solution must be discarded, lest we get an unwanted infinity at r = 0. This leaves:

u_I = B sin(ar)
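
A symbolic spot-check of both the differential equation and the r → 0 behavior (a small sympy sketch):

import sympy as sp

r = sp.symbols('r', positive=True)
a, A, B = sp.symbols('a A B', positive=True)
u = A*sp.cos(a*r) + B*sp.sin(a*r)

print(sp.simplify(sp.diff(u, r, 2) + a**2*u))   # 0: u satisfies u'' + a^2 u = 0
print(sp.limit(sp.cos(a*r)/r, r, 0, '+'))       # oo: the cosine piece blows up in R = u/r
print(sp.limit(sp.sin(a*r)/r, r, 0, '+'))       # a: the sine piece stays finite at the origin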

Thursday, October 30, 2014

Is Pope Francis Correct? Can God "Co-Exist" With the Big Bang?



Attention has once more turned to the words of Pope Francis who, in a recent address to the Pontifical Academy of Sciences, appeared to endorse the Big Bang theory of cosmology. In his speech he raised eyebrows when he opined that there was no contradiction between having a belief in God and acceptance of cosmic expansion.  He said:

"When we read about creation in Genesis we run the risk of imagining God was a magician, with a magic wand able to do everything, but that is not so." 

He went on to add:

"Evolution in nature is not inconsistent with the notion of creation, because evolution requires the creation of beings that evolve"

Well, apart from some circular reasoning, that is still not a totally valid statement. My major quibble is with the use of the words "creation" and "evolution" in the same sentence. Generally, in terms of definitions, they mean two different things and we acknowledge them to be mutually exclusive: if a being has been 'created' its genome is already complete and there is no need to evolve, while if a being is evolving then it hasn't been created. "Creation" then denotes a completeness not present with evolution.

The problem with theistic evolution in any form, of course, is that it also pre-supposes a governing purpose and envisions God as some kind of divine watch maker.

From earliest times both philosophers and theologians have debated the question of whether the universe has a purpose.

Those who saw some divine purpose invariably believed the cosmos had to have been “created”. Most of these creationists appealed to a subjectively perceived evidence of “design” in the universe as an argument for the existence of a special divine creator. William Paley (1743- 1805), for example, drew attention to the complexity of structures occurring in both astronomy and biology, arguing that they could not possibly be a product of blind chance. (The Vatican, to its credit, has taken care to reject intelligent design).

In this respect, he may be said to be the father of intelligent design (ID) – now making the rounds as the latest manifestation of the belief that some kind of “irreducible complexity” is embedded in physical- biological reality that dictates one must invoke an external, “intelligent designer”.

The viewpoint of Science in general, and modern physics in particular, is totally opposed to this. This opposition has arisen not merely from logical arguments, but from experiments and observations in quantum mechanics, statistical mechanics and cosmology. In the light of these advances, Paley’s (not to mention ID’s) deficiencies are now evident.

Both physicists and biologists, for example, now recognize many systems in which order and complex activity can emerge spontaneously. In this article, I show how such recognition leads the dispassionate observer to dispense with any notion of cosmic purpose that transcends mere existence in its own right.

A biological example, based on in-vitro experimental studies of cancer tumors, is the individual tumor cell.[1] The cell appears as a fluctuation, able to develop by replication. A cosmological example is the instantaneous formation of the universe by a possible quantum fluctuation[2] that arises when one treats the conformal part of space-time as a quantum variable.

A more prosaic example is the aurora, such as I observed near Chena Hot Springs, Alaska in March of 2005. This particular aurora displayed two perfectly symmetrical parallel green “tubes”, arcing from north to south horizon. Did an intelligent designer craft two natural fluorescent tubes in the sky? Not at all. The inimitable procession to order (observed over two hours) was dictated by the (pre-existing) presence of the auroral oval around the pole and the polar electro-jet, after impinging electrons from the solar wind began to decelerate into the oval and form currents in sheets. These were then shaped by the ambient magnetic field of Earth into the two parallel tubes visible near Chena.

What do the above examples disclose? Basically, that William Paley’s famously naïve argument: “A watch must always have a Watchmaker, so also the universe must have a Maker or Creator.” is flawed and outdated.

The analogy is flawed, first, because the universe is not a mechanical contrivance like a watch. For the most part (certain limited domains in celestial mechanics excepted), the 'clockwork universe' was dispelled when quantum theory emerged. Unfortunately, while the practicing physicist has long since had to adopt an indeterminate, non-mechanistic world view (e.g. guided by the experimental results from quantum physics), the same cannot be said for non-physicists, including theologians, philosophers and multitudes of laypersons.

These groups continue to labor under erroneous assumptions of causality and “order” generated almost exclusively by an ignorance of modern physics. For example, an ignorance of the fact that simultaneous measurements at the atomic level are fundamentally indeterminate. Technically, for one of the most common forms of the Heisenberg Uncertainty Principle, this may be expressed (in terms of position x, the Planck constant h and momentum p = mv):

[x, p] = iħ = ih/2π

In term's of Bohr's (Complementarity) Principle, the variables x (position) and p(momentum) are regarded as "mutually interfering observables". This is why only one can be obtained to precision, while you lose the other. In another sense, one can think of approaching a particle in such way (or with such apparatus) that it suddenly gets 'wavy'. At a particular stage of resolution, as the late David Bohm noted, the particle aspect vanishes and you apprehend a wave. But during some interim threshold one can regard it as a wavicle. Of course, if Heisenberg's principle didn't apply - meaning we could know both the position and momentum to the same degree of accuracy, then: [x, p] = 0 such that x*p – p*x = 0 spells out non-interference.
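
For readers who want to see that non-commutation explicitly, here is a small symbolic sketch (my own illustration, not from the original text) in which the momentum operator acts as -iħ d/dx on an arbitrary test wave function:

import sympy as sp

x, hbar = sp.symbols('x hbar', real=True, positive=True)
f = sp.Function('f')(x)                       # arbitrary test wave function

X = lambda g: x*g                             # position operator: multiply by x
P = lambda g: -sp.I*hbar*sp.diff(g, x)        # momentum operator: -i*hbar d/dx

commutator = sp.simplify(X(P(f)) - P(X(f)))   # [x, p] acting on f
print(commutator)                             # I*hbar*f(x), i.e. [x, p] = i*hbar, never zero

If x and p did commute, that printout would be 0 - the "non-interference" case mentioned above.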

In cosmological terms, the whole concept of "order" has been relegated to a minor and tiny niche of the extant cosmos. For example, the balloon-borne Boomerang and MAXIMA measurements of the cosmic microwave background, together with observations of Type Ia supernovae, have disclosed a cosmic content:[3]

7% - ordinary visible matter

93% - dark component, of which:

- 70% is DARK (vacuum) energy and

- 23% is dark matter

In effect, 93% of the universe can’t even be assessed for “order” since it can’t be seen. In the case of dark matter, one can only discern its presence indirectly by the visible effects on neighboring matter. In the case of dark energy, the underlying physical basis isn’t even known – though we know the result is an increase in the acceleration of the universe (arising from a cosmic repulsion attributed to dark energy).

This is all critical, since in the past apologists of teleologism (the belief that purpose and design are part of nature) have cited a perceived "orderliness" as a revelation of the "handiwork" of an intelligent Mind, or Creator. Alas, this falls through the cracks if most of the universe is disorderly, or made of dark energy and dark matter. Indeed, by current assessment - and discounting plasma abundance - one may reckon that even rudimentary order is evident in barely 0.00001% of the cosmos. And this can all be explained or accounted for by appeal to scientific reasoning or hypotheses. For example, the nebular hypothesis, whereby the original solar nebula progressively collapsed under the force of gravitational attraction, can account for the formation of the solar system.

Another point missed by these apologists is that there has always been a profound confusion between the principles of sufficient reason and causation. According to the former: "Nothing happens without a sufficient reason". As Mario Bunge has observed[4]:

“Giving reasons is no longer regarded as assigning causes. In Science, it means to combine particular propositions about facts with hypotheses, laws, axioms and definitions. In general, there is no correspondence between sufficient reason and causation.”

As an example, let's say I fire electrons from a special "electron gun" at a screen bearing two holes some distance away.

At first glance, one might reasonably conclude that the electron motion is singular and follows one unique path. That is, that each fired electron traverses a single, predictable path, following stages 1, 2, 3 and so on, toward the screen. This is a reasonable, common-sense sort of expectation but alas, all wrong! The problem is that common sense is useless in the domain of quantum mechanics.

According to the most widely accepted interpretation of quantum theory[5], the instant the electron leaves the "gun" it takes a large number of differing paths to reach the screen. Each path differs only in phase, and has the same amplitude as each of its counterparts, so there is no preference. How does the electron differ from a classical object, say an apple tossed at a wall? The electron takes all paths to the screen; the apple takes only one (at a time) to the wall. And the electron exhibits phases (as a wave) while the apple doesn't. The electron's wave function can be expressed:

U = U(1) + U(2) + U(3) + . . . + U(N)

Here the total wave function for the electron is on the left hand side of the equation, while its resolved wave amplitude states (a superposition of states) are on the right-hand side. If they are energy states, they would correspond to all possible electron energies from the lowest (1) to the highest or Nth state (N). There is absolutely no way of knowing which single state the electron has until it reaches the screen and an observation is made, say with one or other special detector (D). Each number denotes a given electron state and path.
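
To make the "superposition of states" idea concrete, here is a small illustrative sketch (hypothetical numbers of my own) with N equal-amplitude states that differ only in phase, and the Born rule deciding which single state is actually registered at the detector:

import numpy as np

rng = np.random.default_rng(1)
N = 8                                          # number of resolved states U(1) ... U(N)
phases = rng.uniform(0, 2*np.pi, N)            # each path/state differs only in phase
amps = np.exp(1j*phases) / np.sqrt(N)          # equal amplitudes, normalized so sum |c|^2 = 1

probs = np.abs(amps)**2                        # Born rule: each state equally likely (1/N)
detected = rng.choice(np.arange(1, N + 1), p=probs)   # the one state the detector registers
print(round(probs.sum(), 6), detected)         # 1.0, and a single state chosen at random

Until that final "measurement" line runs, nothing in the program singles out any one state - which is the sense in which the electron has no definite state before detection.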

Prior to reaching the screen the electron exists in a superposition of states or "wave packets". Is this description statistical, or individual? This depends. The wave function has a clear statistical meaning when applied to a vast number of electrons. But it can describe a single electron as well.[5] In the example just cited, all the energy states refer to the same electron. However, if all electrons are identical, the statistical and individual descriptions coincide.

Germane to the point made earlier, i.e. that there is no correspondence between sufficient reason and causation, one finds that in a large number of cases, the approaching electron goes through both holes in the screen – not just one. This is totally counterintuitive to one steeped in the traditions of Newtonian or classical mechanics. For example, if a baseball were hurled at a wall with two six inch diameter apertures near to each other – it would go through one or the other – but not both! Of course, the electron deviates from such classical behavior precisely because of its wave nature – as demonstrated in the famous Davisson-Germer experiment that verified that particles exhibit wave properties.

The point emphasized here is that this deviation means that in specific spheres (mainly in science, specifically in modern physics) conventional logic and thinking are of little or no use. A number of researchers and authors, for example Hilary Putnam, have argued that the distributive law of classical logic is not universally valid.[6] Much of his reasoning (which is beyond the scope of this article) has to do with the peculiar nature of Hilbert spaces that are part and parcel of the underpinning of quantum mechanics.

Interestingly, it is this very indeterminacy that also resides at the core of many quantum bootstrap models, allowing for the spontaneous inception of the cosmos. People have serious problems with such models and ways of thinking because: a) they fail to appreciate the lack of correspondence between sufficient reason and causation, and b) they fail to understand that causality predicated on classical logic is no longer applicable to many areas of modern physics.

While further conceptual development remains (the work of science is never final), it is clear that any postulated purpose in the cosmos can already be regarded as a redundant anachronism. If the cosmos can "bootstrap" itself into existence via quantum fluctuation, and acquire "order" (even in highly limited domains) via the implicit laws of statistical and thermal-quantum physics - then it has no need of a "creator" (or "designer") and no purpose other than to exist. No extraneous being is necessary to ensure its continued stability or existence. More bluntly, the addition of such a being doesn't advance the quality of our research, or improve our predictions by the most remote decimal place. Hence, to all accounts such a being (or purpose) is totally superfluous.

Nor do humans, as generic offshoots of the cosmos, have any purpose other than to be. If they seek an additional purpose, they must craft and forge it subjectively of their own accord - rather than looking for it on high.
 
Does this imply that the concept of “God” is outright useless, null and void? Not at all. It merely requires that we re-think the concept so that it is consistent with the absence of higher or extraneous purpose. As Bernard d’Espagnat notes [7]:


“The archaic notion that is conveyed by the words ‘Lord’ and ‘Almighty’ will presumably never recover its full efficiency for lulling the ontological qualms of mankind. For a religious mind, turning towards being should therefore become a subtler endeavor than the mere acceptance of the heavenly will stated in the Bible, formulated by the priests, and exhibited by miracles.”

Finally, the abolition of extraneous higher purpose should not incur any psychic loss for humanity. As Marilyn French has aptly observed:

“It is a loss of dignity to define humanity as a race defined to please a higher Being, rather than as a race whose only end is to please itself. The ‘gift’ of purpose to the human race is thus very expensive: one can fulfill one’s God-given purpose only by sacrificing felicity while one is alive.”

In the end, as I have noted before, it all hinges on one's definition of "God".  Francis then is definitely correct that there is no inconsistency of God belief with the Big Bang - but that is only applicable if we refrain from invoking a personal deity and opt for an impersonal but transcendent Being instead. More on this in a future post.


[1] Garay, R.P. and Lefever, R.: 1978, J. Theoretical Biology, 73, p. 417.

[2] Padmanabhan, T.: 1983, 'Universe Before Planck Time - A Quantum Gravity Model', in Physical Review D, Vol. 28, No. 4, p. 756.

[3] See: Physics Today, July, 2000, page 17.

[4] Bunge, Mario: 1979, Causality and Modern Science, Dover Publications, p. 231.

[5] Due to Richard Feynman. See, Herbert, N.: (1985), Quantum Reality - Beyond the New Physics, Doubleday, New York, pp. 115-117.

[6] Putnam, H., ‘Is Logic Empirical?’ in R. Cohen and M. P. Wartofski (eds.), Boston Studies in The Philosophy of Science 5 (Dordrecht, Holland: D. Reidel, 1968). Reprinted as ‘The Logic of Quantum Mechanics’ in H. Putnam, Mathematics, Matter and Method, Cambridge University Press (1976).

Bochy Unleashes the "Kraken" and Giants Take Series - And Some Takeaways!

In many respects one could think of Madison Bumgarner as the human equivalent of the mythological Kraken - certainly to Kansas City fans at Kauffman Stadium last night. By the bottom of the fifth, after reliever Jeremy Affeldt had given his best, Giants' manager Bruce Bochy let the bullpen gates open and MadBum emerge. Thousands of boos greeted him, along with hisses, curses and grimaces, as the fans realized that with his appearance the KC run could well be at an end. "When Bumgarner strolled in to start the 5th it was over," Ann Killion observed in her SF Chronicle piece this morning. As Killion put it:

"The Royals fans — all on their feet, losing their minds and their voices trying to will the team to a win — booed their displeasure into the black night. This was the man they didn’t ever want to see again. The player they hoped was too spent to be available for Game 7.

But there he was, walking alone through right field to the mound. If there had been a soundtrack, it would have been wah-wah-waaaaah from the “The Good, the Bad and the Ugly......Like the gunslinger in a spaghetti Western, Bumgarner unstrapped his arm from its holster and the opposition faded away into the night.”

Well, she wasn't too far off and in many respects, indeed, "The game was over. The World Series belonged to the Giants" - just as I forecast would happen yesterday IF Bochy put MadBum in.

Of course, I had recommended (in my post yesterday) that he start the game, but as it turned out Bochy's strategy of starting Tim Hudson worked even better. Bochy allowed Hudson to self-destruct, as I also predicted he would, shelled from the mound after only 1 2/3 innings and giving up two runs. Ordinarily that might not seem like a biggie, but one only had to watch Hudson in action to see he was getting clobbered - even when the pitches went for outs. He just didn't have that quality pitchers, coaches and managers call "stuff." Bochy then sent Jeremy Affeldt in to contain the damage and not allow any more, and he went another 2 1/3 innings - but you could tell he was spent by the bottom of the fourth.

With the game a nail-biter and the Giants holding a 3-2 lead, speculation then turned to who the next reliever would be and, of course, it was evident earlier when we saw Bumgarner warming up in the bullpen. As the bottom of the fifth opened he walked to the mound from the outfield, and most KC fans already knew it was over, as the air was collectively sucked from the stadium. Their worst fears materialized and their most foolish hopes (that he was too spent after tossing 117 pitches two days ago) were dashed. But they ought to have known Bochy had no other option: MadBum was the only SF pitcher who could hold the KC bats in check.

Though a long-time Milwaukee Brewers' fan, I am not a purist in the sense of rooting only for that team - which self-destructed this year after leading the National League Central for over 150 days. Unlike the Giants, the Brew Crew grew lackadaisical and complacent after the All Star break, losing series to the Cards, the Giants and the god-awful Cubs, of all teams. I had written them off by the end of August as pretenders for yet another year.

The Giants then emerged as the team to watch, with likely historical marks to be set by Bumgarner - who also helped dispatch the Brewers' hated nemesis, the St. Louis Cardinals. The Royals, meanwhile, were too damned cocky and jubilant - especially after rousting wifey's team, the Baltimore Orioles, in four straight. Someone, we said, had to tame these guys and teach them some respect. Who better than the Giants with MadBum?

To say Bumgarner was superb is putting it mildly. By the end of the game he crossed all historical thresholds going back to the era of another Giants' great, Christy Mathewson. This is going all the way back to the "dead ball" era nearly 100 years ago. To put Bumgarner's stats in perspective, consider this:

21 innings pitched in this Series and only one earned run given up for an ERA of  0.43

For those unfamiliar with baseball stats that translates to less than one half of a run per game - or less than one run in two games.

He did this using 291 pitches, 205 of which were strikes for a strike percentage of over 70 percent. This included 17 strikeouts.

Talk about "pounding the zone"!
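
For anyone who wants to check those numbers, the arithmetic is trivial (a quick Python scratchpad):

innings, earned_runs = 21, 1
era = 9 * earned_runs / innings            # ~0.43 earned runs per nine innings
pitches, strikes = 291, 205
strike_pct = 100 * strikes / pitches       # ~70.4 percent strikes
print(round(era, 2), round(strike_pct, 1))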

Bumgarner certainly deserves the MVP for the Series, and all the kudos he's now getting. Meanwhile, teams like the Brew Crew should be paying attention to see the level of pitching needed to sustain a long season and come out on top at the end. My main quarrel with what started out to be a seemingly stout Brewers' rotation is how they mostly petered out at the end. NO resilience! Then too all the injuries!

Where I do have a quarrel is with the SF Chronicle's presumptuous headline: DYNASTY!

First, the baseball gods don't take kindly to cities or teams self-declaring 'dynasty' status. YOU don't do that; it's up to outside observers to make the determination - and often from a historical perspective. Second, the Giants only won this game by the hair of their chinny-chin-chins. One mistake at the end would have messed it up - though I didn't believe it would happen. Or, if for some reason MadBum couldn't have gone past the seventh, or there had been another fielding error, it might well have been different.

Finally, building a putative 'dynasty' whether in baseball or football,  is tricky in this era of free agency - when a team can lose key players at the drop of a hat. It only takes one loss of a key player to send a would-be contender into pretender-ville. In the case of the Giants, Pablo Sandoval, aka the Panda, becomes a free agent next year. If they lose him to free agency I can visualize the dynasty dreams having a half-life of about......five minutes.

Wednesday, October 29, 2014

What The Giants Need to do to Win the World Series

The 10-0 shellacking last night of Jake Peavy and the San Francisco Giants by the Kansas City Royals was predictable, especially after the masterful pitching performance by Giants' ace Madison Bumgarner on Sunday night. It was painful for this National League backer to watch as the smarmy KC upstarts fairly laughed and gobbled up pitch after pitch for a big inning.  Peavy, in total contrast to "MadBum",  lacked any kind of stuff as well as location.

The same can be said of tonight's planned SF pitcher, Tim Hudson, who in relative terms is as pedestrian as Peavy. Apart from both Hudson and Peavy having losing records, neither can hold a candle - or a cutter- to MadBum.  So one must inquire what SF manager Bruce Bochy (aka "Yoda") will do tonight. The take from at least one SF Chronicle columnist is to start Bumgarner who, let us be frank, is the ONE pitcher the Royals fear the most - as they should.

The counter view is that this will wreck the poor guy's arm (to which I call BS, as he's resilient as hell) and also that he has had only two days' rest, so he won't be able to go very long. So what? Let him go as long as he can. Even three or four shutout innings - hopefully giving the Giants the chance to get a secure lead - can be enough before bringing in a middle-order reliever, say Petit.

But if Bochy goes with Hudson as starter, the outcome is pretty well as predictable as it was last night, especially as the %#*#&$!@  Royals can already taste a big Game 7 victory after the way they pounded SF last night.  Now is the time to snatch that Royals' would- be victory from them, and make the little beggars cry in their dugout instead of grinning ear to ear like a bunch of half drunk Macaques.

Let's also bear in mind that pitching three games, or portions thereof, has a historical basis. Lew Burdette of the Milwaukee Braves did it back in the 1957 Series, with two of the three games shutouts - a total of 27 innings pitched, and he was years older than MadBum. Bob Gibson also did it for the Cards back in 1967, again featuring 27 total innings pitched, including a shutout. Are you going to tell me that those guys had the stamina to do the job back then but 25-year-old Bumgarner can't now?

Well, maybe! The managers of those pitchers spaced out their games, realizing the only chance they had was to pitch their ace as often as possible. Originally, Fred Haney of the Braves believed twenty-one-game (regular season) winner, lefty Warren Spahn, could handle the Yanks, but he couldn't. It was pitching mate Burdette, the righty, who quieted the bats of the Bronx Bombers. The point is, Haney spaced Burdette's games out to allow a minimum of three games' rest between pitching outings.

By contrast, Bochy already blew it in those terms, pitching Bumgarner in Game 5, when from the resting perspective (and using the ace to take at least 3 games) he ought to have gone in Game 4. Then, he'd be more than ready in Game 7. Still, I do believe MadBum can go at least three stout innings in tonight's face-off, maybe more.

But if Bochy gets predictable and goes "by the book", look for a San Francisco defeat and all the Royals whooping it up on their home grounds.

Just my $0.02, and I could be wrong.

I really hope I am!

Rocket Re-Supply Disaster Shows Commercial Space Ferrying Not Ready For The Big Time





The images of the explosion last night, shown on Lawrence O'Donnell's MSNBC show, were difficult to watch. What had been touted as a re-supply mission to the International Space Station (ISS) came a cropper just six seconds after launch. The mission used a two-stage Antares launch vehicle to boost a Cygnus spacecraft into orbit. Tuesday night's launch was also to be the first in which a more powerful second-stage motor was to be used.

The launch had been scheduled for 6:22 p.m. from the Wallops Island launch facility on the Eastern Shore of Virginia. Had all gone according to plan, the launch would have been visible for hundreds of miles along the Eastern Seaboard.  The plans called for the spacecraft to dock with the space station and deliver about 5,000 pounds of food and other cargo.
The first stage employed a liquid-fueled rocket powered by two Aerojet Rocketdyne AJ26 engines, according to Orbital Sciences Corporation, the company NASA had contracted to do the re-supply. The second stage was to use a solid motor to boost the Cygnus into orbit. Orbital said the Tuesday night mission was the first to use the larger, more powerful CASTOR 30XL second-stage motor.


The Associated Press,  quoting a NASA spokesman,  reported that none of the cargo was urgently needed on the space station. According to the AP, the Russian space agency is proceeding with its own resupply mission. Which is just as well, else the astronauts may have to start rationing. 

As currently envisaged, the commercially-linked program is to deliver up to 44,000 pounds of cargo to the space station over eight missions, including Tuesday's. But I have always been skeptical of using private companies even for ferrying cargo. (We were warned last night by O'Donnell's guest James Oberg not to jump to conclusions and form a "bias", and fair enough. But I will still do so.)

While NASA honchos are convinced that private, commercial craft will fill the current cargo transport vacuum, I don't buy it. We are really comparing the limited private efforts of disconnected, competing companies - with a total capitalization and resource allocation of barely 1/1000 of what NASA had - to the vast monetary resources commanded by a federal agency featuring specialized talent levels hundreds of times greater in quality and quantity.

Anyone who seriously believes any of these private operators (even acting in concert) will match Shuttle achievements is either drunk, comatose or suffering from early onset Alzheimer's. It just isn't going to happen in this version of the multi-verse! We are also comparing a few companies like Space-X - with a few hundred jobs - against a massive, federally-funded effort that saw cooperative jobs in 32 states, with detailed construction of every single Shuttle piece to specifications: from the heat-resistant tiles, to the internal gyroscopes, to the solid rocket boosters, to the Raffaello cargo module (designed to carry tons of cargo to the space station).

By comparison, none of the planned private spacecraft will have the hauling capability of NASA's Shuttles, whose payload bay stretched 60 feet long and 15 feet across. Bays that size hoisted multi-ton observatories like Hubble. Much of the nearly 1 million pounds of space station was carried to orbit by Space Shuttles. The private companies' biggest models, by contrast, will be lucky to lift one one-hundredth the cargo by volume and one five-hundredth by mass, which means many more trips and much more costly fuel expenditures.

Worse, if human assistance is required, or the ISS needs re-staffing, we'll no longer be able to send seven astronauts up at once - but only two, because that's the maximum that can be accommodated in the Russian Soyuz craft! As I wrote in an earlier (2010) blog post, JFK - who envisioned not only beating the Russians to the Moon but dominating them in space - would be appalled at the spectacle of American astronauts now having to play the role of hitch-hikers on Russian craft. The truth is the Shuttle could easily have continued its role of servicing re-supplies to the ISS, as well as further repairs to the Hubble, but unbeknownst to most Americans the Shuttle program was re-directed to military spy missions. (A number of special Shuttles had been re-fitted for military use.)

In the meantime, Space-X claims it can get astronauts to the space station within three years of getting the all-clear from NASA. Station managers expect it to be more like five years. Some skeptics say it could be 10 years before Americans are launched again from U.S. soil. I say it's more like 15, especially if we continue our wasteful squandering of money in places like Afghanistan (a quota of 10,000 troops is to remain until 2024). That ongoing Afghanistan farce wastes more money in two months than NASA's WHOLE budget uses in a year! And I won't even count the waste of using Shuttles for military spying.

Sadly, almost everyone appears to have bought into this "private company as the solution to future manned space" formula. But I don't! During the late 60s and early 70s I saw what could be done via a united effort commanding public (tax) resources the way the military does now. It got us to the Moon and back, and would have gotten us to Mars by now had the Vietnam War not gobbled up $269 billion.

It can get us to Mars again, as well as supply the ISS, but the will isn't there. Like everything else, the mantra has become: "Privatize! Privatize! Privatize!"

Which could well be the final words on this country's epitaph.

Tuesday, October 28, 2014

Attending to the Warning Signs of a Major Stock Market Correction

Many in the stock market are fairly ebullient when the market soars - that is, when it's not taking a dump on certain days or in response to specific events (e.g. Ebola infections, ISIS gains in Iraq, etc.). Most have been beneficiaries of the Fed's infusion of "crack" in the form of "quantitative easing" and cheap money. (With another round on the horizon? Notice how the DOW stopped dropping once the buzz began about QE3, following QE1 and QE2 - rounds which have together infused $4.3 TRILLION in bond purchases so far.)

But at some point the cheap money flow has to stop, and even if it's done slowly Maul Street will respond hysterically - which is what has prompted discussion of how large a future correction will be ('How Bad Can It Get?', WSJ Sunday in Denver Post, Oct. 19). As the article observes, "corrections of 5% to 20% are a normal part of the stock market", and it points out that even J.P. Morgan built corrections into his forecasts (often taking advantage of inside info while the little guys got toasted).

Thus, those in the market now - whether in 401ks, IRAs or doing their own thing in day trades - need to be aware of the potential for loss, and large loss. In line with this, the article points out that the "gloomiest" prognostication for a drop so far has come from Scottish stock market historian and analyst Russell Napier. He suggests that Wall Street "might fall by 75 percent or more before the carnage is over." That would put the DOW at about 4,300, or roughly where it was in 1995.

And it's not just a remote possibility. Arch-forecaster Nate Silver, in his book The Signal and the Noise: Why So Many Predictions Fail - But Some Don't, warns (p. 347):

"Of the eight times in which the S&P 500 has increased at a rate much faster than its historical average over a 5-year period , five cases were followed by a severe and notorious crash, such as the Great Depression and the Black Monday crash of 1987”.


Interestingly, two of the largest stock market dives have followed two of the biggest bubbles: the first, a 37.85 percent plunge from Jan. 14, 2000 to Oct. 9, 2002, came on the heels of the dot-com bubble; the second, a 53.78 percent plunge from Oct. 9, 2007 to March 9, 2009, followed mainly from the housing bubble.

Are there warning signs to attend to? Of course! First among them: if the economic future were truly rosy, long-term interest rates (a barometer of economic growth, since rising rates generally accompany rising wages) would have been going up. Instead we observe them tumbling, with the benchmark yield on 30-year Treasury bonds having now dipped below 3% and yields on the 10-year note at mid-2013 levels.

People should also be leery of the S&P increasing beyond the range Silver notes over a 5-year period - and the DOW is a rough proxy for that - so red alarms ought to sound if it hits 17,000 and stays there for any length of time.

A final warning sign which too few attend to is stock buybacks by the companies themselves. I mean, WHY should you have to buy back your own stock to create an artificial rise in share price if your company is genuinely doing well? It makes no sense. If you're going to use any extra money for anything it ought to be for paying dividends, not stock buybacks!

Columnist Jonathan Clements, in his piece in the WSJ section of the Post this past Sunday, observes:

"Many companies were big buyers of their own stock before the 2007-09 market decline, only to scuttle their buyback programs during the market crash. In recent years with share prices up sharply they have begun to voraciously buy back."

And why not, when they are then reaping the bounty of their own high share prices? Buybacks also make management's stock options more valuable - so what better way to compensate the Street's honchos? Strangely, all this has selectively blinded investors to the lack of dividends. As Robert Arnott, quoted by Clements, stated:

"Dividends are reliable, You cut them at your own peril - but you can cut a buyback and hardly anyone notices."

Why? Why aren't the little guys more aware of how they're being shafted? Maybe for the same reason, as former trader Michael Lewis observed this morning on CBS' Early Show, that they're not paying attention to how flash trading is ripping them off via a "millisecond" advantage, gaining pennies on the dollar with each trade made. The flash traders essentially gain those pennies (which add up to billions over time) by getting shares redeemed before you can, via that slight micro-time advantage.

Why doesn't the SEC do anything to stop this baloney? According to Lewis, because once its staffers leave the SEC they will be looking for jobs on the Street, so they don't want to alienate it with antagonistic regulations or judgments. So don't look to the SEC honchos for any guidance or alerts to help you navigate the swamps of Maul Street. You are literally on your own - with maybe this blog and a few others to try to provide some heads-ups.

What you can do is what every sane investor ought to be doing: leaving any stock that doesn't deliver dividends and is only doing buybacks. (With a little research you can find this out.)

You can also re-assess your risk tolerance. Can you really afford to lose 75 percent of your 401k if the market came a cropper again? And don't take any online quizzes to assess your current risk tolerance! Psychologists will tell you that when the market's going up - DOW headed toward 17,000 as it is now - people tend to over-answer on the positive side.

What you really need to do is doff the rose-tinted glasses and pink Pollyanna hat and put on your black, negative thinking cap. Ask yourself how you would feel if you had $300,000 salted away in 401ks and IRAs and, after a major crash or "correction", had $75,000 left. Would you be able to suck it up and move on? Would you be able to stick to the Street mantra of "buy and hold"? If not, you had better rethink your positions and portfolio.
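
To put hard numbers on that thought experiment, here's a quick back-of-the-envelope sketch in Python using the figures above (my own illustration):

balance = 300_000
crash = 0.75                          # Napier's worst case: a 75 percent drop
left = balance * (1 - crash)          # $75,000 remaining
recovery = balance / left - 1         # a 300 percent gain needed just to get back to even
dow_now = 17_000
dow_after = dow_now * (1 - crash)     # ~4,250, i.e. roughly the 4,300 level cited above
print(left, recovery, dow_after)

The sobering part isn't just the drop; it's that a portfolio cut by 75 percent has to quadruple merely to return to where it started.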