The late solar physicist John Eddy, who did some of the most groundbreaking research on the Sun-climate relationship.
Graph of radiocarbon (C14) excess over C12 across a roughly 2,000-year period. The zero level is an arbitrary norm referenced to 1890. Arrows mark persistent features identified as possible solar anomalies. (From Eddy, J., 'Evidence for a Changing Sun', in The New Solar Physics, AAAS Symposium, 1978.)
An ongoing issue in climate research has been constructing a reliable base of observations for comparing solar radiation intensity across different epochs – and thereby better inferring or delineating the emergence of any anthropogenic effects. To this day the best work in this area has probably been done by the late Dr. John Eddy.
The core problem is comparing eras of solar variation when there were no telescopes to detect sunspots (such as the 12th and 13th centuries) with eras for which sunspot records exist. We’re fairly confident, for example, that the Maunder Minimum (1645-1715) really did have few or no spots, because we’re the beneficiaries of the observational records of such famous astronomers as Flamsteed, Halley, Cassini, and Hevelius, whose work in other areas was above reproach.
That observational record, spanning 70 years, discloses only occasional sunspots – mainly single spots, scattered across six separate 11-year solar cycles, and located chiefly near the solar (heliographic) equator.
But what about discerning solar variations in earlier epochs, say the 1200s, when the so-called Middle Ages “warming” occurred? Eddy pointed out (The New Solar Physics, p. 16) that the breakthrough arrived in the 1960s, when a series of papers demonstrated that radiocarbon (C14) in plant cellulose could be used as an indirect, or proxy, register of solar activity.
In general, C14 is produced in the upper atmosphere through the interaction of high-energy cosmic rays – say, from galactic sources – with atmospheric nitrogen. Solar activity in turn modulates the intensity of these cosmic rays via the action of the heliosphere, which deflects a fraction of the incoming cosmic-ray flux along with other harmful interstellar radiation. (This shield, by the way, is shrinking – see: http://www.telegraph.co.uk/news/worldnews/northamerica/usa/3222476/Suns-protective-bubble-is-shrinking.html )
When the Sun is more active, the heliosphere is stronger, shielding the Earth from more of the cosmic-ray flux, the effect of which is to reduce the C14 produced in the Earth’s upper atmosphere. Conversely, when the Sun is less active – as it was from 2000 to 2008 – the shield is weaker, more cosmic rays penetrate to our upper atmosphere, and more C14 is produced. It follows that if a record of the C14 to C12 ratio could be obtained (extracted, say, from tree rings or other plant tissue), one would have a proxy indicator of solar activity for any time. A falling C14/C12 ratio would imply higher solar activity, and a rising C14/C12 ratio lower solar activity. If the same ratios were obtained for the modern era, it might be feasible to normalize all the results, compare them, and draw conclusions.
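To make the inverse logic of the proxy concrete, here is a minimal sketch in Python (the function name and the per-mil values are purely illustrative assumptions, not taken from any actual data set):

# Minimal sketch of the inverse C14 proxy logic described above.
# Deviations are in parts per thousand (per mil) relative to a fixed
# baseline (e.g. 1890); the inputs here are illustrative, not real data.

def infer_solar_activity(delta_c14_start, delta_c14_end):
    """Infer the sense of a solar activity change from a C14/C12 trend:
    falling C14/C12 implies a stronger heliosphere (more active Sun);
    rising C14/C12 implies a weaker heliosphere (less active Sun)."""
    trend = delta_c14_end - delta_c14_start
    if trend < 0:
        return "higher solar activity (stronger heliospheric shielding)"
    if trend > 0:
        return "lower solar activity (weaker heliospheric shielding)"
    return "no inferred change in solar activity"

# Example: a tree-ring sequence whose C14 excess falls from +5 to -10 per mil
print(infer_solar_activity(5.0, -10.0))   # -> higher solar activity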
Most crucially, since the Sun isn’t the exclusive modulator of cosmic rays – an anthropogenic effect would also alter the upper atmosphere and hence C14 production, as would a changing magnetic moment for the planet – the C14 record embedded in tree rings has to be expected to carry these other histories inscribed in it as well.
Fortuitously, a 2000-year record of C14:C12 deviations was compiled by P.E. Damon ('The Solar Output and Its Variation', The University of Colorado Press, Boulder, 1977), and it is shown in the accompanying graph. To conform with the sense of solar activity, the plot is oriented so that increasing radiocarbon (C14, marked (+)) runs downward. The deviations, in parts per thousand, are shown relative to an arbitrary 19th-century reference level.
As John Eddy observed concerning this record (Eddy, op. cit., p. 17):
“The gradual fall from left to right (increasing C14/C12 ratio) is…probably not a solar effect but the result of the known, slow decrease in the strength of the Earth’s magnetic moment,[1] exposing the Earth to ever-increased cosmic ray fluxes and increased radiocarbon production.
The sharp upward spike at the modern end of the curve, representing a marked drop in relative radiocarbon, is generally attributed to anthropogenic causes—the mark of increased population and the Industrial Age. The burning of low-radiocarbon fossil fuels – coal and oil – and the systematic burning off of the world’s forests for agriculture can be expected to dilute the natural C14/C12 ratio in the troposphere to produce an effect like the one shown, though a real increase in solar activity may be hidden under the curve.”
Assuming the validity of the arbitrary norm (the zero line, or abscissa) for 1890, it is clear that the magnitude of the Middle Ages warming period (relative C14 strength of -18), for example, is less than half the relative effect attributed mainly to anthropogenic sources in the modern era (-40). Even if one fourth of the latter magnitude is assigned to solar activity (based on the solar variability component detected over 1861-1990, amounting to 0.1-0.5 W/m^2, vs. 2.0-2.8 W/m^2 for the heating component arising from greenhouse gas emissions; cf. Martin I. Hoffert et al., Nature, Vol. 401, p. 764), the anthropogenic effect is at least 3/2 that of the last (exclusively solar) warming period.
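The arithmetic behind that comparison is easy to check; the short Python sketch below simply re-derives the two ratios from the deviations quoted above (the variable names are illustrative):

# Re-derive the ratio argument from the quoted deviations (parts per
# thousand, relative to the 1890 zero level).
medieval = -18.0   # Middle Ages warming, relative C14 strength
modern = -40.0     # modern spike, attributed mainly to anthropogenic causes

print(abs(medieval) / abs(modern))          # 0.45 -> less than half

# Even assigning one fourth of the modern deviation to solar activity
# (per the Hoffert et al. forcing comparison), the residual anthropogenic
# part still exceeds the medieval (purely solar) signal by at least 3/2:
solar_share = abs(modern) / 4.0             # 10
anthropogenic = abs(modern) - solar_share   # 30
print(anthropogenic / abs(medieval))        # ~1.67 >= 1.5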
These results comport with modern findings that the last ten years have been the warmest on record, according to data from the World Meteorological Organization (WMO). For reference: parts of Greenland had an average temperature 5.4 F above normal, while Russian officials ascribed some 11,000 “excess deaths” to their prolonged heat wave. According to the WMO:
“The year 2010 is almost certain to rank in the top three warmest years since the beginning of instrumental records in 1850.”
[1] The Earth’s magnetic moment is estimated to be 10^25.9 G-cm^3 and is computed from:

m = r^3 B(r, L)/[1 + 3 sin^2(L)]^(1/2)

where r is the distance from the center of the Earth, L is the (geomagnetic) latitude, and B(r, L) is the magnetic field intensity at that position.
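As a rough consistency check on that figure, the formula can be evaluated in a few lines of Python, using standard textbook values for the Earth’s radius and equatorial surface field (both inserted here as illustrative assumptions):

import math

# Evaluate m = r^3 B(r, L) / sqrt(1 + 3 sin^2 L) in Gaussian units
# (B in gauss, r in cm, giving m in G-cm^3).
r = 6.371e8   # mean Earth radius in cm (standard value, assumed)
B = 0.31      # approximate surface field at the equator, in gauss (assumed)
L = 0.0       # geomagnetic latitude in radians (equator)

m = r**3 * B / math.sqrt(1.0 + 3.0 * math.sin(L)**2)
print(f"m = {m:.2e} G-cm^3")               # ~8.0e25 G-cm^3
print(f"log10(m) = {math.log10(m):.1f}")   # ~25.9, matching the footnote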