Saturday, February 6, 2016

Dark Matter And Dark Energy In The Universe


Plot of magnitude (M) vs. redshift z for Type 1a supernovae, showing that the universe is accelerating. (From: Perlmutter, Physics Today, April 2003, p. 53)

Continued questions at All Experts.com suggest many still do not understand the difference between dark matter and dark energy. Both are critically important to our conception of the cosmos, given that they impinge on the conclusions we can draw about its nature - including whether it displays "order" or can be considered "created", i.e. by an outside agent.

In the case of order, as I've noted before, it is difficult to make such an argument given that ordinary, visible matter makes up only 7% of the universe's total content, with fully 93% in a "dark component" - of which nearly 70% is dark or vacuum energy and the rest dark matter (see, e.g., Physics Today, July 2000, p. 17). Given these components, the extent of any order can't be properly assessed, and the issue of creation has been superseded by a greater understanding of the nature of quantum fluctuations and virtual particles.

Dark matter came on the cosmological scene first, back in 1933, when Fritz Zwicky's measurements of galaxy clusters highlighted a 'missing mass'. He found that the mass needed to bind a cluster of galaxies together gravitationally was at least ten times the visible mass. This mass, because it was inferred but not directly detectable, became the first 'dark matter'. Around the same time, the Dutch astronomer Jan Oort made observations of stellar motions in the galactic plane and found there had to be at least three times the visible mass present in order for stars not to escape the galaxy and fly off into space.
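
A rough sense of how such a 'missing mass' inference works can be had from a back-of-the-envelope virial estimate. The sketch below uses illustrative round numbers for the velocity dispersion, cluster radius and luminous-mass guess - not Zwicky's actual Coma cluster data:

```python
# Back-of-the-envelope virial mass estimate for a galaxy cluster,
# in the spirit of Zwicky's 1933 argument: M_dyn ~ sigma^2 * R / G.
# All input numbers below are illustrative round values, NOT Zwicky's data.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # one megaparsec, m

sigma = 1.0e6        # line-of-sight velocity dispersion, m/s (~1000 km/s)
R = 1.0 * MPC        # characteristic cluster radius, m

# Dynamical mass needed to keep galaxies moving at sigma gravitationally bound
M_dyn = sigma**2 * R / G

# A crude luminous-mass guess: ~1000 galaxies of ~2e10 solar masses of stars each
M_lum = 1000 * 2e10 * M_SUN

print(f"Dynamical (virial) mass: {M_dyn / M_SUN:.2e} M_sun")
print(f"Luminous mass guess:     {M_lum / M_SUN:.2e} M_sun")
print(f"Dark matter factor:      {M_dyn / M_lum:.0f}x")
```

With numbers of that order the dynamical mass comes out roughly an order of magnitude above the luminous estimate, which is the flavour of Zwicky's result.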

By the late 1970s, astronomers realized there were other candidate forms of dark matter. Among the most discussed were black holes, which mark the end stage of evolution for very massive stars. In a black hole the gravity is so strong that no light escapes, and the mass is typically much greater than that of the Sun. Such objects can only be detected indirectly: when a black hole is a member of a binary (double) star system, for example, its presence can be inferred from the intense X-rays given off as the companion star's gaseous layers are sucked into it.

Dark matter itself occurs in either baryonic or non-baryonic forms, depending on whether the matter reacts with radiation or not. If it doesn’t, it’s non-baryonic. Baryons include protons and neutrons, while non-baryons include electrons and neutrinos.

Non-baryonic dark matter further breaks down into cold dark matter and hot dark matter. The terms hot and cold are not so much indicative of current temperatures as of the phase of the early universe at which the particular dark matter 'decoupled' from the hot radiation background. An early decoupling implies a higher ambient background radiation temperature in the primeval cosmos; a later decoupling corresponds to a cooler temperature. Perhaps the most widely studied hot dark matter candidate is the neutrino.

By contrast, cold dark matter candidates tend to have larger masses; among the most likely suspects are gravitinos, magnetic monopoles, and primordial black holes. However, there are a couple of exceptions to this, which include WIMPs and axions.

Dark energy didn't emerge conceptually until the late 1990s, when the first Type Ia supernova measurements came to the fore. By early 1998 the Type Ia supernova results of two groups - the Supernova Cosmology Project (based at UC Berkeley) and the High-Z Supernova Search, led by Brian Schmidt of Mt. Stromlo Observatory in Australia - began to show tightening error bars.

Why Type 1a supernovae? First, because they’re bright enough to isolate in different galaxies – hence there’s a cosmological dimension. Second, they exhibit a uniform, consistent light spectrum and brightness decay profile (all supernovae diminish or ‘decay’ in brightness after the initial explosive event). This applies to all galaxies in which they appear so they function as cosmic standard “candles”. Third, all Type 1a’s betray the same absorption feature at a wavelength of 6150 Angstroms (615 nm) - so have the same spectral “fingerprint”.
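
To see how such standard candles translate into a magnitude vs. redshift (Hubble) diagram in the first place, here is a minimal sketch. It uses the simple low-redshift approximation d_L ≈ cz/H0 rather than a full cosmological model, and assumes a nominal peak absolute magnitude of about -19.3 for a Type 1a:

```python
# Minimal sketch: apparent peak magnitude vs. redshift for a standard candle,
# using the low-redshift approximation d_L ~ c*z / H0 (breaks down at high z).
import math

C_KMS = 299792.458   # speed of light, km/s
H0 = 70.0            # assumed Hubble constant, km/s/Mpc
M_PEAK = -19.3       # assumed Type 1a peak absolute magnitude

def apparent_magnitude(z):
    """Apparent magnitude of a standard candle at redshift z (low-z only)."""
    d_L_mpc = C_KMS * z / H0                    # luminosity distance, Mpc
    mu = 5 * math.log10(d_L_mpc * 1.0e6 / 10)   # distance modulus (d_L in parsecs)
    return M_PEAK + mu

for z in (0.01, 0.05, 0.1, 0.5):
    print(f"z = {z:4.2f}  ->  m ~ {apparent_magnitude(z):5.2f}")
```

It is the departure of the actual high-redshift supernova data from such a simple extrapolation that the search teams used to decide between the accelerating and decelerating cases.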

Basically, the majority of the plotted Type 1a supernova data points (see graph) congregated along the upper of the two plot lines. This placed them firmly in the region of the graph we call the "accelerating universe". On the other side of the diagonal is the "decelerating" region. An additional feature of the accelerating side is 'vacuum energy'.

To get an insight, we can examine the equation that governs cosmic expansion and determines whether it is accelerated or not (cf. Perlmutter, Physics Today, 2003):

R"/R = - {4p / 3} G r (1 + 3 w)

Here R is the cosmic scale factor, R" is the acceleration (i.e. the second derivative of R with respect to time t), G is the Newtonian gravitational constant, and ρ the mass density. We inquire what value w must have for there to be no acceleration or deceleration. Basic algebra shows that when w = -1/3 the whole right side becomes zero. The supernova plot data constrain w such that it cannot have a value > -1/2. Most plausibly, w, the ratio of pressure to density, is (Perlmutter, ibid.):

w = (p / ρ) = -1

This is consistent with Einstein's general theory of relativity - which one could say approaches the status of a 'basic law of physics'. In this case a negative pressure (check by solving for p: with w = -1, p = -ρ) meshes with general relativity's allowance for a "repulsive gravity", since any negative pressure has associated with it gravity that repels rather than attracts.
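
As a quick numerical check of the algebra above, the short sketch below evaluates the sign of R"/R for a few values of w; the density value is an arbitrary placeholder, since only the sign of the right-hand side matters here:

```python
# Sign of R''/R = -(4*pi/3) * G * rho * (1 + 3w) for various w = p/rho.
# rho is an arbitrary placeholder; only the sign of the result matters.
import math

G = 6.674e-11     # gravitational constant (SI units)
RHO = 1.0e-26     # illustrative mass density, kg/m^3

def acceleration_term(w):
    """Right-hand side of the acceleration equation for a given w."""
    return -(4 * math.pi / 3) * G * RHO * (1 + 3 * w)

for w in (0.0, -1/3, -1/2, -1.0):
    a = acceleration_term(w)
    if abs(1 + 3 * w) < 1e-12:
        verdict = "no acceleration (right side zero)"
    elif a > 0:
        verdict = "accelerating"
    else:
        verdict = "decelerating"
    print(f"w = {w:+.2f}  ->  {verdict}")
```

Any w more negative than -1/3 flips the sign of the right-hand side and gives acceleration, which is why the observational bound of roughly w < -1/2 places the data on the accelerating side.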

Some might argue that such cosmic repulsion amounts to a "new law" of physics, but it merely extends the existing concept of gravitation to show that it has a repulsive as well as an attractive aspect, one that has always been consistent with Einstein's general theory of relativity.

What’s to become of the cosmos if the acceleration is ongoing? Clearly, photons emerging from ever more distant cosmic objects (stars, nebulae, etc.) will never be able to overcome the rapidly expanding space-time and reach us. This means that over time fewer and fewer objects will be visible to any sentient observers. Eventually, all distant cosmic objects will “vanish” from the scene and all observers - if any remain - will be plunged into dark, featureless skies.

3 comments:

  1. The plot you posted has 1 data point on it? I could fit more than one line through that . . ..

    Latest data are here, but there are no web-friendly plots in the paper. This chart from the Supernova Cosmology Project is perhaps better, but the data only run up to 2003. Figure 5 of "Precision Measurement of the Most Distant Spectroscopically Confirmed Supernova Ia with the Hubble Space Telescope" likely has the most up-to-date data (the paper itself reports the discovery of a redshift 1.71 supernova).

    Yet many of the sources are skilled at lying with charts. Why does M-Obs start at 20 or higher - in some cases it starts at 34? Perhaps to avoid having to discuss the variance of the data near 16 Mpc due to the Virgo cluster? The data from 0 are actually quite interesting (discover dark energy for yourself here).

  2. Yes, your point is taken. I ought to have noted it was a *representative* data point. This graph was actually taken from my original post on dark energy back in Oct. 2008, and the gradient and ordinate (M_obs) values were based on the Perlmutter data. Of course, there are bound to be changes in scale as well as in the value limits as observations improve with enhanced instrumentation.

    Also, as noted at this source:

    http://astronomy.swin.edu.au/cosmos/T/Type+Ia+supernova+light+curves

    "Obviously, if the peak brightness of a SNIa is linked to its decline rate, a quantity that varies from supernova to supernova, SNIa in optical light are not the standard candles originally imagined. They are, however, standardisable candles if we apply the luminosity-decline rate relation to our calculations. This is done via one of three light curve fitting techniques: the Δm15 method, the Multicoloured Light Curve Shapes method or the Stretch method. Each of these techniques uses a standard or series of standard light curves to determine how under or overluminous the new supernova is from a typical SNIa. Astronomers can then correct for the luminosity difference before using the supernova as a distance indicator"

    Your reference to avoiding "variance in the data" is also choice, given that I had to correct you some months ago on the definition of statistical variance, since you got it wrong. You attached dimensions to a variance associated with global warming data, but I noted that only the standard deviation has the same dimension as the data, and hence is comparable to deviations from the mean. As I had to inform you, the variance is the square of the SD.

  3. I have replaced the rough 'generic' graph with Perlmutter's actual one from his paper, 'Supernovae, Dark Energy and the Accelerating Universe'.
