Monday, 18 March 2013

It's the science, stupid. Or is it?

Thoughts on what we know, and what we don't, and how we tell the difference

Science is under attack. From the right, sceptics attack climate science; from the left, molecular biology and its products inspire deep suspicion. Science no longer seems to inspire the young or the progressive, who espouse mysticism or retreat into homespun philosophy. Meanwhile both Left and Right cherry-pick from its conclusions, accepting or rejecting climate science or advances in biotechnology according to their prejudices, and examining the evidence in neither case. This is dangerous, for all of us.

It’s a bad time for us to misunderstand science, because it isn’t going to go away.  Al Gore writes: “The… multiple revolutions in biotechnology and the life sciences will soon require us to make almost godlike [decisions]… Are we ready to make such decisions? The available evidence would suggest that the answer is not really, but we are going to make them anyway.” 

Science should come with a health warning. It only tells us so much, and it does not tell us what it has not told us. But I am also going to suggest that in a deeper sense, scientific principles should underpin our beliefs to a far greater extent than they do now.

Averroës: Study of creation
A word about what this post is not about, which is science vs. religion. For some, to teach evolution in a school is to deny the role of the Creator. Others use the cudgel of rationality to attack religion. Both sides would do well to remember the words of the Andalusian scientist and philosopher Ibn Rushd (1126-1198), known in the West as Averroës: “The more perfect becomes the knowledge of creation, the more perfect becomes the knowledge of the Creator. ...the Law urges us to observe creation by means of reason and demands the knowledge thereof through reason.” The deeply religious, and the strongly irreligious, should both think hard about that quote, but they won’t. In any case, this post is mostly not about religion. It is about the way we see science and interpret its findings; what it tells us, and what it doesn’t; and where it should stand in public discourse.

Let’s start by giving science a good kicking.

Induced belief?
In arguments about policy, especially but not only on climate change, one side or the other is usually saying: “It’s the science, stupid!” But they are almost never completely right.  Science rarely falls neatly to one side or the other.

Science is inductive; that is to say, I observe an object or phenomenon, and decide that my observation allows me to infer something about other, similar or connected, objects or phenomena. An induction is therefore different from a deduction, in which the conclusion follows necessarily from what I already know; if I have observed a number of objects and know that a certain fact applies to each one, I don’t have to infer it.

The question of what one may reasonably infer from observation is hardly new. The most reliable form of inference is the syllogism, which is, in effect, deductive logic: a working bicycle must have wheels; my bicycle works; therefore it must have wheels. Change this proposition to: a working bicycle has wheels; other bicycles have wheels; therefore they work too. This is not a syllogism, because we don’t know it to be true; any of the other bikes might have some other broken part, and so might not work. However, we know from observation of other bikes that it is probably true, so we may infer that they work. This extrapolation of the general from the particular is inductive reasoning, and modern science depends on it.

Why this dependence, if it is not foolproof? Bertrand Russell explains that an induction “has less cogency than a deduction, and yields only a probability, not a certainty; but on the other hand it gives new knowledge, which deduction does not” (A History of Western Philosophy, 1945). This is demonstrated by modern advances in astrophysics; dark matter, for example, is theoretically verifiable and falsifiable but we cannot field-test it. But its existence is a rational probability – one that we would not discover through a deductive process.
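Russell’s point that induction “yields only a probability, not a certainty” can even be given a number. Here is a minimal sketch, using Laplace’s classical (and much-debated) rule of succession; the bicycle framing is borrowed from the example above, not from Russell:

```python
# Laplace's rule of succession: if all of n observed cases have had a property,
# a reasonable estimate that the next case will too is (s + 1) / (n + 2),
# where s is the number of observed successes. However many working bicycles
# we see, the probability approaches 1 but never reaches it.

def rule_of_succession(successes: int, trials: int) -> float:
    """Estimated probability that the next observation shares the property."""
    return (successes + 1) / (trials + 2)

p_ten = rule_of_succession(10, 10)        # ten working bikes seen: about 0.92
p_thousand = rule_of_succession(1000, 1000)  # a thousand: close to 1, never 1
```

No finite number of observations drives the estimate to certainty, which is exactly the gap between induction and deduction that Russell describes.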

The problem with this inductive process is that it excludes the unknown; you cannot include in your reasoning a factor that you do not know to exist. This may be because one could have no reason to suspect its existence. One can argue, in the case of climate change, that the causal mechanism is clear, but what if there is some unknown factor acting upon it, or about to do so? The philosopher Moritz Schlick (of whom more later) spotted this danger in the inductive process when he warned that an inductive inference could not easily be reduced to a syllogism by establishing causality. “There are infinitely many circumstances that might possibly enter into consideration as the cause, since, theoretically, every process in the universe could make a contribution,” he wrote (General Theory of Knowledge, 1925).

Isn’t this all rather theoretical? No. Climate science, for example, is based on a massive induction. We do not know, in a literal sense, that human activity is changing the climate. We have inferred it from the fact that we are releasing a certain tonnage of greenhouse gases and know that some of it is accumulating in the atmosphere. We also know that that accumulation will make the atmosphere retain more heat. Neither of these facts is an inference – we know them to be true; we can (for example) measure the concentration of CO2 in the atmosphere and know that it has increased from 280 to nearly 400 parts per million since the Industrial Revolution began in the 18th century. We also know the extent to which these gases will increase the propensity of the atmosphere to retain heat; this was demonstrated by John Tyndall in 1859. What is an inference is that this process will lead to climate change. That is because we cannot be sure there is no third factor that would cancel out the interaction between the two.
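The heat-retention step that the paragraph above treats as known can be put in round numbers. A sketch, assuming the standard simplified forcing expression of Myhre et al. (1998), which the post itself does not cite:

```python
import math

# Simplified expression for the extra radiative forcing from a rise in CO2
# concentration (Myhre et al., 1998): dF = 5.35 * ln(C / C0), in watts per
# square metre, relative to a baseline concentration C0.

def co2_forcing(c_now_ppm: float, c_baseline_ppm: float = 280.0) -> float:
    """Approximate radiative forcing (W/m^2) from a CO2 rise over the baseline."""
    return 5.35 * math.log(c_now_ppm / c_baseline_ppm)

forcing = co2_forcing(400.0)  # roughly 1.9 W/m^2 for the 280 -> 400 ppm rise
```

This is the measurable, non-inferential part of the chain; the inductive leap is in connecting that forcing to the future behaviour of the whole climate system.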

The history of science and technology is littered with failures that were not predicted because of such a “third factor”. In the 1990s I worked for an agricultural research centre that bred a chickpea variety that was resistant to blight, and could therefore be planted earlier in the season, taking better advantage of soil moisture from Mediterranean winter rainfall. However, within a few years farmers started to report that it wasn’t actually resistant to blight. At least part of the reason turned out to be that farmers were trading seed among themselves, and were often buying, or selling in good faith, seed that was not of the latest release.

A famous example of the unexpected comes from aviation history. The first jet airliner, the De Havilland Comet, entered passenger service in 1952. Very soon, several aircraft broke up in flight. The Royal Aircraft Establishment at Farnborough put an intact Comet in a water tank and pressurized and depressurized it until it suffered the same failure. Pressurization of the cabin had searched out a point weakened by metal fatigue. As the aircraft had been tested to twice its maximum cabin pressure in trials, this should not have happened. In fact it was the cycle of pressurization and depressurization that was the agent of failure, not the cabin pressure itself. As Geoffrey de Havilland admitted in his autobiography (Sky Fever, first published 1961 and reissued after his death in 1979), the Comet was so close to the frontiers of technology that there was nothing in existing experience to predict this. The cause could never have been hypothesized.
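The Comet lesson, that damage comes from repeated cycles rather than from peak load, is nowadays captured by cumulative-damage rules. A hedged sketch using the Palmgren-Miner rule, with invented constants; nothing here is Comet data:

```python
# Palmgren-Miner cumulative damage: each load cycle at a given stress amplitude
# consumes a fraction 1/N of the structure's life, where N is the number of
# cycles to failure at that stress. Failure is predicted when the summed
# fractions reach 1, even though no single cycle exceeds the static strength.

def cycles_to_failure(stress: float, a: float = 1e12, m: float = 3.0) -> float:
    """Basquin-style S-N curve, N = a / stress^m (illustrative constants)."""
    return a / stress ** m

def miner_damage(stress: float, n_cycles: int) -> float:
    """Cumulative damage fraction after n_cycles at one stress amplitude."""
    return n_cycles / cycles_to_failure(stress)

# One pressurization cycle does almost nothing; thousands add up to failure:
d_single = miner_damage(1000.0, 1)     # tiny fraction of the fatigue life
d_service = miner_damage(1000.0, 1200) # exceeds 1: fatigue failure predicted
```

A one-off proof test at twice working pressure, like the Comet’s, probes static strength only; it tells you nothing about how many cycles the structure will survive.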

The failure of an inductive process to apprehend an unknown factor can also arise because its evidential base is limited in time or space. Conclusions drawn from observations of natural processes are especially suspect in this respect. The social-science theorist R. Andrew Sayer cites one of the Paradoxes of the ancient Greek philosopher Zeno of Elea:

One of Zeno’s famous paradoxes showed that on an atomistic conception of time as consisting of discretely distinct points, movement is unintelligible. If an arrow can only be at a single distinct point in space and at no other discrete point in time, then it cannot move. As Georgescu-Roegen argues: “That which is in a point cannot be in motion or evolve; what moves and evolves cannot be in any point.” …So, if we [describe] the growth of a plant [in] distinct stages occurring at discretely distinct times we can hardly expect to learn how it happens.” (Method in Social Science: A Realist Approach, 1984.)

Again, this seems theoretical; but it has implications for the modelling of environmental processes.  A simple example is that of the naturalist E.P. Stebbing, who observed environmental degradation in northern Nigeria in 1934 and concluded that the desert was moving southward.  In a sense it was, but Stebbing might not have known that it had also moved north in recent times, because his observations were temporally inadequate. However, Stebbing’s views started an ongoing narrative on the desertification menace that at one stage threatened to oversimplify land-management issues in Africa.

The problems of the inductive approach become acute in the case of climate modelling, where the number of different phenomena that would be material is so great that they cannot all be known; so great, indeed, that some may be simplified or excluded even when their existence is known. Thus at least one major climate model was drawn up in the past on the basis of a single rate of decay for soil carbon – although mineralization of organic matter, and its release as CO2, is highly variable and non-linear.
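The soil-carbon simplification is easy to see in miniature. A sketch with illustrative parameters, not drawn from any real climate model, comparing a single averaged decay rate with a two-pool version in which a labile fraction decays fast and a stable fraction slowly:

```python
import math

# First-order decay: carbon remaining after t years is C0 * exp(-k * t).
# A single averaged rate k and a split into fast and slow pools start from
# the same stock but diverge over decades. All parameters are invented.

def single_pool(c0: float, k: float, years: float) -> float:
    """Carbon remaining with one averaged decay rate k (per year)."""
    return c0 * math.exp(-k * years)

def two_pool(c0: float, fast_frac: float,
             k_fast: float, k_slow: float, years: float) -> float:
    """Carbon remaining when the stock is split into fast and slow pools."""
    fast = c0 * fast_frac * math.exp(-k_fast * years)
    slow = c0 * (1 - fast_frac) * math.exp(-k_slow * years)
    return fast + slow

c_single = single_pool(100.0, k=0.02, years=50)
c_split = two_pool(100.0, fast_frac=0.3, k_fast=0.2, k_slow=0.005, years=50)
# After 50 years the two structures predict very different remaining stocks.
```

The point is not which parameterization is right, but that the choice of model structure itself, made before any data are fitted, changes the answer.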

Does this mean we should ignore the outputs of climate science? No. It is based on an induction, but as we have seen, all good science is. There can always be new evidence coming from left field, but I have yet to hear of anything that would really invalidate the climate models that we have so far.  In any case, as the historian of science Naomi Oreskes has pointed out, science rarely provides absolute proofs; rather, a consensus arises between scientists based on what is known, and provided that consensus is wide enough, the majority will ignore the doubters and move on. This is where climate science now stands.

What I am trying to do, however, is to make a deeper point about the rights and obligations of science, the limitations on what it demonstrates, the need for evidence, and the need for humility when its conclusions are questioned.  Failure to find that humility will itself politicise science and throw its objectivity into doubt. It may even bring it into disrepute.

Roger Pielke of the University of Colorado, who has written widely on the links between science and politics, demonstrated this in a 2004 paper in which he discussed the furore that erupted over Bjørn Lomborg’s 2001 book The Skeptical Environmentalist. Lomborg had argued that environmental threats were exaggerated and that proposed measures to address them were uneconomic. He was subjected to intense and often savage criticism, with one critic going so far as to compare him, by implication, to a Holocaust denier. Pielke quotes the Harvard scientist John Holdren attacking the book because Lomborg had “wasted immense amounts of the time of capable people who have had to take on the task of rebutting him. And he has done so at the particular intersection of science with public policy... where public and policy-maker confusion about the realities is more dangerous for the future of society than on any other science-and-policy question excepting, possibly, the dangers from weapons of mass destruction.”

This frustration with Lomborg is understandable (and Holdren is in fact a scientist of distinction; he is now senior science advisor to the White House). However, Pielke’s point is that by taking this view, scientists are themselves politicising science. This is a wise observation in itself, but there is surely a deeper danger. The critics (many far ruder than Holdren) were, in effect, saying to Lomborg:  Science says you are wrong, so shut up. The implications of this will be obvious to anyone familiar with the abuses committed in the 20th century in the name of “scientific socialism”, and the revolutionary doctrine that because we are right, we may behave as we see fit. The fact that Lomborg was wrong is not the point. As we have seen, science is a flawed instrument, and is not in a position to claim absolute truth. One is reminded of John Stuart Mill’s contention (which I quoted in an earlier post) that no opinion should be suppressed “lest it aught be true”.

There is a further implication. If science can provide at best a broad consensus about climate change, how on earth is it to demonstrate the existence or otherwise of God? We can hardly put the cosmos in a water tank and compress and decompress it until the truth is revealed. (Averroës would say, I suppose, that the Large Hadron Collider was doing just that; but I think the “God” particle is a physical phenomenon.)

We have now given science a good kicking. However, there are dangers in this.

“Science Wars”...and a Viennese legacy
Giordano Bruno (Jastrow/Wikimedia Commons)
Science has always had its enemies. When I lived in Rome some years ago I would often visit the Campo de’ Fiori; it had the best bars. In the centre stands the rather threatening statue of Giordano Bruno. (Not to be confused with Bruno Giordano. One was burned at the stake; the other played for Lazio – though, for Roma supporters, that may mean he should burn too.) Bruno was burned on that spot in 1600. His quarrel with the Inquisition was basically theological, but also concerned his science; like his younger contemporary Galileo Galilei, he believed the earth revolved around the sun. Further, he posited the existence of life in other worlds, which challenged the Church’s view of creation. Bruno led a life of principled intellectual endeavour. He was also, by all accounts, a disputatious pain in the arse. His fate reminds us of what might then have befallen a Christopher Hitchens or a Richard Dawkins.

It would be easy to see Bruno’s fate as something of another time, but attacks on science are with us still. The arguments about creationism, and the teaching of evolution vs. intelligent design, are an example; so is the funding by lobbyists of research to discredit the consensus on climate science. However, there have been other, more subtle attempts to undermine the rational, sometimes with good intentions. The best example is the “Science Wars” that began in the 1960s and ran on into the late 1990s.

The “Science Wars” can trace their origin back to a seminal 1962 work by Thomas Kuhn, The Structure of Scientific Revolutions. Very crudely stated, Kuhn’s argument was that what we accept as “scientific truth” is the result of a consensus that is in part a product of society and its preoccupations at any given time, and that certain conditions must occur for that consensus to be reformed (a process he referred to as a “paradigm shift”). Kuhn’s arguments are interesting and complex. A physicist by training, he did not so much question the value of science as try to illuminate how it proceeds, and under what circumstances a scientific consensus will admit of major revisions. However, post-structuralist critics have since interpreted his work as meaning that there is no “scientific” method of inquiry and that what we take for scientific knowledge is actually moderated by a society’s culture and history.

Thus by 1986 the philosopher Sandra Harding could define the radical feminist position as a claim that science is “not only sexist, but also racist, classist, and culturally coercive”.  (Professor Harding also, notoriously, referred to Newton’s Principia Mathematica as a “rape manual”.) There is no reason why there should not be a feminist critique of the scientific world, which remains very male-dominated. However, science is hard to do, and scientists understandably felt that they did not deserve this from critics outside its disciplines. Perhaps more seriously, some of what academic critics have written about science has been quite meaningless. Richard Dawkins has quoted, with glee, such statements as Jean Baudrillard’s:

Perhaps history itself has to be regarded as a chaotic formation, in which acceleration puts an end to linearity and the turbulence created by acceleration deflects history definitively from its end, just as such turbulence distances effects from their causes.

What? In 1994 the physicist Alan Sokal was sufficiently irritated to write a spoof paper entitled Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity. Dawkins later described it as “a carefully crafted parody of postmodern metatwaddle.” The paper included passages such as the following:

Just as liberal feminists are frequently content with a minimal agenda of legal and social equality for women and ‘pro-choice’, so liberal (and even some socialist) mathematicians are often content to work within the hegemonic Zermelo-Fraenkel framework (which, reflecting its nineteenth-century liberal origins, already incorporates the axiom of equality).

Sokal submitted the paper to the journal Social Text, which published it as part of a “Science Wars” special issue. Sokal then admitted that it was a hoax. What Sokal, and others, were in effect saying was, look, it’s not OK to just talk complete rubbish.

They were not the first to make this point. The classic example is that of the logical positivists, and their leader, Moritz Schlick. In principle, logical positivism holds that a statement is meaningful if it can be verified. This springs in part from a distrust of metaphysical philosophy. The Vienna Circle – a group of thinkers centred on Schlick who met between the mid-1920s and mid-1930s – argued that philosophy could only be a part of science. In a 1982 memorial volume on the centenary of Schlick’s birth, Eugene Gadol explained that philosophy “could not compete with science because there was only the natural world which the sciences, with the support of observation for their theories, already wholly covered – all it could do was analyse the information which sciences provided…”.  In arguing for this essential unity of science, Schlick and the Logical Positivists were undermining the claims of philosophy, theology and the humanities, which in early-20th century Germany and Austria had, as Gadol put it: “alleged that there were special ways of enquiry (hermeneutics) and special ways of understanding (intuiting, Verstehen) which transcend the ordinary operations of the human mind as it manifests itself in the natural sciences.”

In other words, the theory of the unity of science propounded by the Vienna Circle challenged the right to write anything that did not make sense. The Logical Positivists said that for a statement to be meaningful, there must be some way in which its truth could be demonstrated, at least in theory – even if the tools to do so were not, or not yet, available. Schlick once used the example of the far side of the moon; we could not see it, but might one day be able to (as indeed we did). So a statement about what was there was meaningful, as it might eventually be verified.

On the morning of June 22, 1936, as he climbed the stairs to his lecture room at the University of Vienna, Schlick was shot and fatally wounded by a former student, Johann Nelböck. This has often been presented as a political act, but Nelböck was simply deranged – although he defended himself in court by arguing that Schlick’s rejection of metaphysics had somehow deranged him. In the weeks that followed, Nelböck received increasing press support as someone who had rid Vienna of a pernicious left-leaning foreign Jewish philosopher who had sought to destroy the nation’s moral compass. He used these arguments to secure a pardon after the Nazis took power in Austria (his sentence had in any case been only ten years).

An enlightened, West-leaning philosopher murdered by gloomy irrational Central Europeans untouched by the Enlightenment? It’s exactly the gulf expressed by the characters Settembrini and Naphta in Mann’s The Magic Mountain. Indeed Schlick fits the role; a German aristocrat (and not in fact Jewish), he was married to an American and spoke good English. The British philosopher A. J. Ayer, who met him in Vienna in 1932, said that he “made on me above all an impression of urbanity – like an American senator in a pre-war film.”

Arguably this conflict was resolved by the Second World War: the rational won. But this is not so clear. The “Science Wars” showed that the validity of science is still under attack from those who do not wish to be bound by its conclusions. But perhaps even more important, both Sokal and Schlick – in their very different ways – were insisting that it is not all right to make meaningless statements and offer them as knowledge.

This is the real point. As we have seen, the scientific method is flawed; but any alternative is worse. In the first part of this post, I argued that science does not always deliver final truths, and that those who see themselves as rational should therefore not use science as an excuse to berate those who do not agree with them. In the second part, however, I have tried to show that despite its limitations, the basic scientific method is essential, and that our public lives must be the realm of disciplined, secular, rational thought. The reason is simple; it is sometimes a short step from talking shit to doing it to other people.

Mike Robbins’s novel, The Lost Baggage of Silvia Guzmán (Third Rail, 2014), is available as a paperback (ISBN 978-0-9914374-0-5, $16.99 USA, or £10.07 UK) or as an eBook in all formats, including Amazon Kindle (ISBN 978-0-9914374-2-9, $2.99 USA, or £1.85 UK). The full range of his books can be found here. Enquiries (including requests for review copies) should be sent to

Follow Mike Robbins on Twitter (mikerobbins19), on Facebook or on Goodreads
