Arguments about causality, including rationalizations for a “primary” cause (e.g. a creator deity or an intelligent designer), often end in aggressive but unresolved disputes. This blog entry explores the issue of causality in a larger context: the templates used to frame it logically, and also the parameters for acausal conditions and acausality.

The criteria of *necessary and sufficient conditions* are of particular use and relevance here. They were originally invoked by Galileo to replace the concept of efficient cause (cf. Mario Bunge, **Causality and Modern Science**, 1979, pp. 33-34). However, even cursory examination exposes major inadequacies in two respects:

1) The definition entails an indefinite number of factors, since it includes in the “cause” any thing or event that could make some difference to the outcome.

2) The definition is too general, to the extent that it may apply to statistical, dialectical, and other processes, since what it states is the “set of conditions” both necessary and sufficient for the occurrence of an event of whatever kind, produced by a process of any sort, whether causal or NOT.

In effect, the use of “*necessary and sufficient conditions*” is really a statement of regular conditionality that exposes no real criteria for causal efficacy. Generally, four characteristics are assigned to efficient causality: *conditionalness, existential succession, uniqueness and constancy*. The first, conditionalness, is a generic trait of scientific law. Thus, for example, in applying conditionalness to the occurrence of large solar flares, I knew I had to include such factors as: steepness of the magnetic gradient in the active region, rate of proper motion of sunspots in the active region, magnetic class of said spots, magnetic flux, and helicity of the magnetic field.

Each of the above allows a degree of determinacy in the
prediction, once I make the measurements. For example, the magnetic gradient
associated with a bipolar sunspot field:

grad B = [+B_n - (-B_n)] / x

where the numerator denotes the difference in the
normal components of the magnetic field (between opposite polarities of the
active region) as measured by vector magnetograph and the denominator the scale
separation between them. If I calculate grad B = 0.1 Gauss/km then I know a
flare is 96% probable within 24 hrs.
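As a sketch, the gradient computation and threshold test described above might look like the following. The function names and example field values are my own illustrative assumptions; only the formula and the 0.1 Gauss/km threshold with its 96% figure come from the text.

```python
# Illustrative sketch: names and example numbers are assumptions, while the
# formula grad B = [+B_n - (-B_n)] / x and the 0.1 Gauss/km warning level
# come from the discussion above.

def grad_b(b_pos_gauss, b_neg_gauss, separation_km):
    """Magnetic gradient between opposite polarities, in Gauss/km."""
    return (b_pos_gauss - b_neg_gauss) / separation_km

def flare_warning(gradient_gauss_per_km, threshold=0.1):
    """True when the measured gradient reaches the flare-warning level."""
    return gradient_gauss_per_km >= threshold

# Example: normal components +250 G and -250 G separated by 5,000 km:
g = grad_b(250.0, -250.0, 5000.0)
print(g, flare_warning(g))   # 0.1 True
```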

However, there is NO analog to this in the disjunctive plurality of causes. Thus, it is only scientific laws that afford any predictability at all.

Second, outside such physical contexts no measurement discloses or enables any kind of prognosticative determinacy, such as, say, would be afforded by a solar flare predictor like the magnitude of grad B. As for existential succession: in the physical case of one particular large solar flare, I knew that when the magnetic gradient spiked or steepened to 0.1 Gauss/km, the flare was imminent, to 96% probability, within 24 hrs.

However, no similar analog exists for any *human* process, including an election, governed as it is by totally inchoate emotions, unconscious motives, or unknown beliefs. This applies even more cogently in the realm of human conspiracies, whatever their form. Often one finds a simple-minded critique of the latter, to the effect that: “*The simplest explanation for something is almost always the correct one.*” Which, of course, assumes that any non-conspiracy model will always be correct because it is putatively 'simpler', but this employs a false analogy to scientific objects of inquiry.

However, it is dubious that this can be applied to the realm of human affairs. For one thing, humans are enmeshed in complexes of emotions and ideological agendas that can't be quantified like Newton's laws of motion, or simplistically reduced to one-cause, one-effect relationships. In addition, humans, unlike natural laws, are capable of deceit and misdirection. So, from many points of view, it would be foolhardy to reduce the realm of human behavior, including conspiracy, to a model applicable to simple natural laws. It would require something basically approaching a general denial that humans would or could ever act with duplicity. Which is nonsense.

Hence, again, it is absurd to speak or write of applying “necessary and sufficient conditions” to such! Unlike conditionalness, uniqueness (or high-level determinacy, such that a one-to-one, onto mapping occurs between C (cause) and E (effect)) is absent from certain kinds of law, such as the statistical regularities peculiar to statistical mechanics (e.g. the Maxwell-Boltzmann distribution function) or the empirical-statistical correlations that show how sunspot morphology is related to the frequency of certain classes of flares.
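To make the contrast concrete, here is a minimal sketch, in reduced units with purely illustrative parameters, of how a statistical regularity like the Maxwell-Boltzmann distribution fixes only a *spread* of outcomes for a fixed cause, never a unique C-to-E mapping:

```python
# Sketch of the point above: the same macroscopic "cause" (a fixed
# temperature) yields many different microscopic "effects" (speeds).
# All values are illustrative assumptions in reduced units (kT/m = 1).
import math
import random

def maxwell_boltzmann_speed(kT_over_m):
    """One speed draw: magnitude of a 3-D Gaussian velocity vector."""
    sigma = math.sqrt(kT_over_m)
    vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
    return math.sqrt(vx * vx + vy * vy + vz * vz)

random.seed(0)
# Identical "cause" on every trial...
speeds = [maxwell_boltzmann_speed(1.0) for _ in range(5)]
# ...yet the individual outcomes all differ:
print(len(set(speeds)))   # 5 distinct speeds from one identical cause
```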

Uniqueness is a characteristic, but not an exclusive trait, of causation. When one avers “*uniqueness of causation*” one really means “*the rigidity of causation*”, as opposed to, say, the plasticity that would be associated with human processes, influences, choices and outcomes. Again, no real “law” as such inheres in the latter, so one would not apply the concept of efficient cause, or the less stringent regular conditionality of “necessary and sufficient conditions”.

The distinguishing aspect of *plasticity in causality* is that a given outcome (say the possible election of Gore in 2000) *could be attained by a whole range of alternative means*. E.g. Gore could have won in 2000 if Nader had not run, OR if Gore had fought hard enough for the purged votes to be reinstated, OR if he had fought for the 3,100+ butterfly ballot votes to be reinstated, OR if he had demanded all the votes in every county be counted. Note that the key aspect here is that the alternatives *are not mutually exclusive*. Thus, Nader not running by no means implied the Florida election would have been a shoo-in for Gore. (E.g. Secretary of State Katherine Harris might have finagled other ways to purge votes, as documented in Chapter One of Greg Palast's *The Best Democracy Money Can Buy*, Pluto Press, London, 2002.)

If only “necessary and sufficient conditions” are to be regarded as antecedents in a causal connection, then a simple causation is implied (which lies at the heart of regular conditionality): if C, then (and only then) E. Again, this is justifiably applied in the context of totally deterministic and “rigidly causal” examples such as occur in the scientific realm (Newton's 2nd law, F = ma), but NOT in human dynamics or processes. For the latter, the imposition of linear causal chains to describe events and outcomes is ontologically defective, since it crafts an artificial line of development in a whole stream of causes (e.g. a disjunctive plurality of causes). As Bunge observes (op. cit., p. 132), this amounts to a fabrication that may prove useful in terms of description or conveying complex information, say about the 2000 election, but it falls way short of arriving at the efficient causation we seek.

The gist of all the above is that it is facile, naïve and utterly preposterous to over-extrapolate the Galilean definitions of “necessary and sufficient conditions” to human affairs and events. That will remain the case until such time as human agents and actors are at least as predictable as the particles of a gas, say, in terms of their velocities in the Maxwell-Boltzmann distribution.

Now, according to elementary logic: if P then Q, then P is a sufficient condition for Q and Q is a necessary condition for P. But as already referenced, the origin of necessary and sufficient conditions in the strict sense attached to physical causality (not abstract logic) arose when Galileo applied and defined them to trump the "efficient cause" concept (Bunge, op. cit., pp. 33-34).
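The elementary-logic claim above can be checked mechanically. A minimal truth-table sketch confirms that “P sufficient for Q” and “Q necessary for P” are the same relation, via the contrapositive:

```python
# Verify on all truth-table rows that "if P then Q" is equivalent to its
# contrapositive "if not Q then not P" -- i.e. P being sufficient for Q
# is the same relation as Q being necessary for P.
from itertools import product

def implies(p, q):
    """Material conditional 'if p then q'."""
    return (not p) or q

rows = list(product([False, True], repeat=2))
assert all(implies(p, q) == implies(not q, not p) for p, q in rows)
print("P sufficient for Q  <=>  Q necessary for P, on all rows")
```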

Why the need to do this? Why not live with logic alone? Because quantum mechanics opened up an entirely new field of logic, which we know as quantum logic. Hence, new rules and postulates arose for which the classical logical definitions cited above are inadequate, e.g.:

Two statements p and q are contrary if they cannot both hold, i.e. if: ~(p & q)

Two statements p, q are contradictory if they can neither both hold nor both fail, i.e. if:

[p -> ~q] and

[~p -> q], and so on.
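These two classical relations can be distinguished with a short truth-table sketch:

```python
# Truth-table sketch of the two relations defined above:
# contrary:       p and q cannot both be true:   ~(p & q)
# contradictory:  exactly one of p, q is true:   (p -> ~q) and (~p -> q)
from itertools import product

def contrary(p, q):
    return not (p and q)

def contradictory(p, q):
    return ((not p) or (not q)) and (p or q)

rows = list(product([False, True], repeat=2))
# Every contradictory pair is contrary (they can't both hold)...
assert all(contrary(p, q) for p, q in rows if contradictory(p, q))
# ...but not conversely: contraries may both be false, contradictories may not.
assert contrary(False, False) and not contradictory(False, False)
print("contradictory implies contrary, but not vice versa")
```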

However, the fundaments of quantum mechanics (validated by experiment) diverge from this. Quantum mechanics can be regarded as a non-classical probability calculus resting upon a non-classical propositional logic. More specifically, in quantum mechanics each probability-bearing proposition of the form "*the value of physical quantity A lies in the range B*" is represented by a projection operator on a Hilbert space *H*. These form a non-Boolean, non-distributive, orthocomplemented lattice. Quantum-mechanical states correspond exactly to probability measures (suitably defined) on this lattice.
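The failure of distributivity in such a lattice can be exhibited concretely. The sketch below uses 1-D subspaces of the real plane (the ranges of projection operators) as a toy stand-in for a Hilbert space; the particular subspaces chosen are my own illustrative assumptions:

```python
# Non-distributivity in the lattice of subspaces (projector ranges).
# Meet (^) is subspace intersection, join (v) is subspace sum; dimensions
# follow dim(U ^ V) = dim U + dim V - dim(U v V).
import numpy as np

def dim(vectors):
    """Dimension of the span of the given list of vectors."""
    return np.linalg.matrix_rank(np.column_stack(vectors))

def meet_dim(u, v):
    """dim(U ^ V) via the dimension formula above."""
    return dim(u) + dim(v) - dim(u + v)

a = [np.array([1.0, 0.0])]   # line along x
b = [np.array([1.0, 1.0])]   # diagonal line
c = [np.array([0.0, 1.0])]   # line along y

# a ^ (b v c): b v c is the whole plane, so the meet is a itself (dim 1).
lhs = meet_dim(a, b + c)
# (a ^ b) v (a ^ c): each meet is the zero subspace, so the join is 0-dim.
rhs = meet_dim(a, b) + meet_dim(a, c)

print(lhs, rhs)   # 1 0 -> the distributive law fails on this lattice
```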

What are we to make of this? Mainly that the empirical success of quantum mechanics calls for a revolution in logic itself. This view is associated with the demand for a realistic interpretation of quantum mechanics. Now, since philosophy has not progressed to a non-distributive, non-classical form, it can have squat to say about reality. It is in effect a non-player, or perhaps more accurately an ersatz player. (As Richard Feynman once opined, philosophy can make pronouncements of its own, but we're not obligated to take them seriously in such a way as to amend the scientific view of the world.)

To be more specific, the formal apparatus of quantum mechanics reduces to a generalization of classical probability in which the role played by a Boolean algebra of events in the latter is taken over by the "quantum logic" of projection operators on a Hilbert space. The usual statistical interpretation of quantum mechanics demands we take this generalized quantum probability theory quite literally -- that is, not as merely a formal analogue of its classical counterpart, but as a genuine doctrine of chances.

Let me give an example of how classical logic breaks down. Take the case of a single electron fired from an electron gun (G) at a two-slit screen, ending up on the other side and impinging on a second screen (S2).

Prior to reaching the screen (dotted line, path D) the electron exists in a superposition of states or "wave packets" (e.g. A, B, C etc.). Is this description statistical, or individual? This depends. The wave function has a clear *statistical* meaning when applied to a vast number of electrons. But it can also describe a single electron. In the example just cited, all the energy states refer to the same electron. However, if all electrons are identical, the statistical and individual descriptions coincide.

Without making this post too long: it is found in numerous trials (in > 90% of cases) that the electron goes through both slits at the same time to reach the final screen. Thus, one is not able to say (designating the slits in the intervening screen as A and B respectively, and the final screen as C):

If A then C

OR If B then C

But rather

**BOTH A and B, then C**

The point emphasized here is that this deviation means that in specific spheres (mainly in science, specifically in modern physics) conventional logic and thinking are of little or no use. As you can see, classical logic breaks down in this example. A number of researchers and authors, for example Hilary Putnam, have argued that the distributive law of classical logic is not universally valid (Putnam, H.: '*Is Logic Empirical?*' in __Boston Studies in the Philosophy of Science 5__, Dordrecht-Holland, 1968).

Much of Putnam's reasoning has to do with the peculiar nature of the Hilbert spaces noted earlier, which are part and parcel of the underpinning of quantum mechanics. Needless to say, if classical logic breaks down in illustrations to do with quantum phenomena, then the prosaic human mental commodity known as "common sense" will break down even earlier. Common sense assumes: a) not only the distributive law of classical logic, but b) a 1:1, onto functional relation of whatever effect is described to a *primary* cause only. That's a tall order, maybe applicable only in the areas of the simplest, least contentious natural-law applications, and always at the macro-level!
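A toy numerical sketch makes the two-slit point quantitative. The geometry and wavelength below are purely illustrative assumptions; what matters is that the quantum prediction adds *amplitudes* for the two slits, while the classical disjunction adds probabilities:

```python
# Why "if A then C, OR if B then C" fails: the arrival probability at the
# final screen comes from summed amplitudes, not summed probabilities.
# All parameters are illustrative assumptions in reduced units.
import numpy as np

wavelength = 1.0
k = 2 * np.pi / wavelength
slit_a, slit_b = -0.5, 0.5          # slit positions on the middle screen
L = 20.0                            # distance to the final screen
x = np.linspace(-10.0, 10.0, 5)     # sample points on the final screen

def amplitude(slit, x):
    """Toy 2-D point-source amplitude from one slit to screen point x."""
    r = np.sqrt(L**2 + (x - slit)**2)
    return np.exp(1j * k * r) / np.sqrt(r)

psi_a = amplitude(slit_a, x)
psi_b = amplitude(slit_b, x)

p_either_or = np.abs(psi_a)**2 + np.abs(psi_b)**2   # "A then C OR B then C"
p_both = np.abs(psi_a + psi_b)**2                   # "BOTH A and B, then C"

# The interference cross-term makes the two predictions disagree:
print(np.allclose(p_either_or, p_both))   # False
```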

