Book: Late Lessons from Early Warnings: The Precautionary Principle 1896-2000
Editors: David Gee and others
Publisher: Earthscan
The precautionary principle is much in vogue. It is widely regarded as the common-sense approach if we want to avoid disasters like BSE. It is the ark of the covenant of those NGOs who are suspicious of the products of biotechnology, and part of the "culture of suspicion" which Onora O'Neill discussed in her recent Reith lectures. Indeed, the principle is being institutionalised. Yet, for all its popularity, it is more frequently invoked than examined.
The European Environment Agency (EEA) recently published a volume of 14 case histories: Late lessons from early warnings: the precautionary principle 1896-2000, as a guide to future policy makers. These case histories include asbestos, radiation, the chemical contamination of the Great Lakes, damage to the ozone layer and BSE. In some cases, such as the lung damage caused by asbestos, well-founded warnings were sounded at an early stage and were ignored, with devastating consequences. In others, like radiation, damaging effects emerged gradually. In at least one case, damage caused by chlorofluorocarbons (CFCs) to the ozone layer, the effects were discovered by accident and came as a complete surprise.
These examples make salutary reading and show that governments often failed to take action when the need for precaution was evident. But the lessons that the distinguished editorial team draws from these examples are less instructive. Most of them are obvious. They tell us that warnings should not be ignored, that all relevant evidence should be considered, that if there is evidence of serious harm, we should not wait for full certainty before acting, that regulatory authorities should be independent, and so on. Who could possibly disagree?
However, some of the recommendations are likely to do more harm than good. For example, we must, they say, take into account not only uncertainty but ignorance. Are we to guard not only against the perils we know about, but also perils we do not know about and where there is no reason to suspect a threat? Many environmentalists suggest that we should.
Tony Gilland recently reported in the New Humanist magazine that, at a conference on risk in April, professor Robin Grove White of Lancaster University, one of Britain's leading environmentalists, "explained his frustration at trying to get scientists to acknowledge the importance of taking into account the existence of 'unknown unknowns.'" Anti-GM campaigners oppose field trials of transgenic crops on the ground that, even if the trials show that GM crops benefit or cause no harm to the environment, there may still be consequences that cannot be foreseen. Peter Melchett, formerly of Greenpeace, told a House of Lords committee that no evidence could ever persuade Greenpeace not to oppose GM crops: "It is a permanent and definite and complete opposition based on the view that there will always be major uncertainties."
It is argued that surprise effects, such as those of CFCs on the ozone layer, prove the need to allow for ignorance. But if we aim to avoid all actions that may conceivably cause harm, hardly any action will ever be justified. The precautionary principle becomes the principle of inaction, hostile to all innovation. From the invention of fire and the wheel to the development of the motor car and the aeroplane, let alone the splitting of the atom, almost every invention has led to unforeseen and often harmful consequences. It is also the case that many unforeseen consequences have been beneficial. No one foresaw that aspirin would turn out to be a wonder drug or that Fleming's carelessness would lead to the discovery of penicillin.
In drawing lessons from the BSE debacle, the authors of the EEA report berate the British government for having at one stage relied on the absence of evidence of harm, when it should have been looking for evidence of no harm. In the BSE case, some warnings were indeed ignored, largely because of a conflict of interest within the ministry of agriculture, which was responsible both for food safety and for the welfare of the farming industry. But governments can hardly be criticised for failing to seek positive evidence of safety. If all the food we buy had to be proved positively safe, we would starve.
As we face fewer life-threatening risks, largely thanks to technical progress, people become more frightened. Never have there been more food scares, though food has never been safer. No accident can happen, it seems, without blame being apportioned. Indeed, the very concept of an "accident" is being abolished: the British Medical Journal announced last year that it was banning the word itself. Even "acts of God," such as floods and hurricanes, should, it is argued, no longer be regarded as accidents; if they can be predicted, we should be able to avoid their consequences.
Nearly every risk is avoidable if we exercise enough caution. Sailing and climbing accidents could be avoided if no one sailed the seas or climbed mountains. No children would drown if they were never taken swimming. Street crime could be drastically reduced if there were a 7pm curfew. The No Risk society looms ever closer. It would be a paradise, but only for lawyers pursuing negligence claims.
The EEA's report echoes another fashionable view. It suggests that there should be more public deliberation over the questions which science addresses, because the public's concerns are based on sound instinct. The authors imply that science is arrogant, especially since "its growing innovative powers are increasingly outrunning its capacity to anticipate the consequences." It all sounds democratic and reasonable; and some scientists are beginning to accept that public misunderstandings must, indeed, be their fault.
There are, of course, arrogant scientists just as there are arrogant environmentalists, politicians and journalists. Yet arrogance is not a natural characteristic of science, whose method is empirical and whose conclusions are always subject to challenge and modification. Science has been defined as tentative knowledge. Indeed, many misunderstandings arise because scientists will not commit themselves to categorical statements that something is "safe." Their meticulous qualifications about probabilities are then presented by the media as admissions of risk.
Naturally, it is sensible to encourage more genuine dialogue between scientists and the public. The public is entitled to expect regulation to ensure new products are safe and it is highly desirable to secure public representation on regulatory bodies. But the EEA report, like many other calls for democratising science, is effectively trying to impose political responsibilities on science to compensate for a lack of confidence in politicians. The advantages of dialogue cannot obscure the fact that science is fundamentally an elite activity. The questions which science addresses cannot be decided by popular vote. You do not decide whether the earth goes round the sun by referendum.
Public instincts may be right or wrong, but the public does not have some special quality of intuition. If we were to trust public opinion and pay due regard to public concern, Jenner's smallpox vaccine would have been banned (at the time, it was widely believed that the vaccine would cause cow parts to grow out of people's arms). Public mistrust of the MMR vaccine, against all the evidence, is likely to lead to a measles epidemic from which some children will die. A large section of the public believes in astrology, and even a broadsheet paper, the Observer, publishes a column on the subject. There is a huge market for homeopathy and organic crops, despite an absence of evidence for the virtues of either.
This does not mean that the wishes of consumers can be disregarded or that they are stupid. It is for consumers to decide if they wish to avoid GM food, even if their decisions have no rational basis. In time, no doubt, regard for evidence will prevail. When shoppers can see clear benefits in lower prices or better nutritional value, they will no doubt accept the new technology, just as patients have accepted the benefits of genetically engineered drugs.
There are dangers to democracy in pandering to a form of populism that fosters a distrust of expertise. Scientists, like all experts, make mistakes. This is not a reason to trust lay instincts instead. The alternative to relying on the expert in technical issues is to rely on ignorance.