Double standards in safety assessments

Two years ago, the International Agency for Research on Cancer (IARC), an advisory body to the World Health Organization, published an apparently damning report on glyphosate, one of the most widely-used herbicides around the world, and marketed by Monsanto under the Roundup brand. This was extensively reported, for example by the Natural Resources Defense Council in the US (Glyphosate herbicide linked to cancer – IARC World Health Organization assessment).

The IARC put glyphosate in Group 2A of its classification, as a ‘probable’ human carcinogen. This was commented on by the Scientific Alliance at the time (Pesticides and Health) and I’ll take the liberty of quoting two important expert comments from it:

Professor Alan Boobis of Imperial College said in part “The IARC process is not a risk assessment. It determines the potential for a compound to cause cancer, but not the likelihood….The UK Committee on Carcinogenicity has evaluated possible links between pesticide exposure and cancer on several occasions. It has found little evidence for such a link.”

And Dr Oliver Jones of RMIT University, Melbourne, raises some other interesting points: “People might be interested to know that there are over 70 other things IARC also classifies as ‘probably carcinogenic’, including night shifts. In the highest category of known carcinogens are ‘alcoholic beverages’ and ‘solar radiation’ (sunlight) – along with plutonium. So yes, pesticides can be dangerous, but so are many other common things which are also dangerous in sufficient amounts or over long periods of time – the dose makes the poison.”

Nevertheless, this supposed condemnation of a widely-used pesticide heightened the intensity of existing campaigns, with groups such as the Pesticides Action Network probably believing that they had a good chance of removing an iconic crop protection chemical from the market, following their successful campaign against neonicotinoids.

It certainly began to be removed from certain shelves, although one particular UK retailer’s action highlighted some ostensible hypocrisy (Waitrose bans sale of weedkiller used on its own land). A spokesman had to tread a fine line in pointing out that the company used glyphosate on its own Leckford Estate for very specific purposes, but had withdrawn it from retail shelves ‘to reflect customer demand’.

Of greater consequence are moves in the European Parliament for a complete ban on the chemical. The Parliament, despite its apparently democratic credentials, seems to have been heavily swayed by the anti-pesticide lobby and also to have ignored the advice of EU scientific advisers. EFSA disagreed with the findings of IARC and more recently the European Chemicals Agency also gave its opinion that glyphosate does not present an unacceptable risk to human health, to the huge disappointment of PAN and others. Not that this will stop MEPs doing whatever they are set on doing.

However, this week, two rather worrying pieces of news have emerged, not that you will necessarily have seen them, because they have not been widely reported. First, on October 18, the Times published an article under the headline Weedkiller scientist was paid £120,000 by cancer lawyers. While this would most likely have been front-page news if a company such as Monsanto had been seen to try to influence regulators, in this case Christopher Portier, an adviser to the IARC, had been paid $160,000 by the law firm representing cancer sufferers suing Monsanto.

In an ideal world, we wouldn’t worry about conflicts of interest: everyone would behave entirely ethically and give purely objective evidence to anyone who asks. But the world is far from perfect and, as public trust in experts has fallen, so rules have been put in place to ensure transparency and at least give everyone a clear picture of the interests of the person giving an opinion. Often, any whiff of lack of impartiality is grounds for exclusion from consideration.

Dr Portier nailed his colours to the mast by writing an open letter to the Health Commissioner, Vytenis Andriukaitis, in November 2015 arguing that EFSA’s findings should be disregarded. Unfortunately, he failed to disclose his earnings from the law firm, Lundy and Lundy. If he had been arguing that EFSA was right and that the IARC should be ignored and had received money from Monsanto, he would doubtless have been crucified.

Even more worrying was a piece carried by Reuters on 19 October, which I haven’t yet seen in any other media: In glyphosate review, WHO cancer agency edited out ‘non-carcinogenic’ findings. To quote the opening sentence: “The World Health Organization’s cancer agency dismissed and edited findings from a draft of its review of the weedkiller glyphosate that were at odds with its final conclusion that the chemical probably causes cancer”.

This behaviour is very disturbing. It is worth quoting further from the Reuters story: “One effect of the changes to the draft, reviewed by Reuters in a comparison with the published report, was the removal of multiple scientists’ conclusions that their studies had found no link between glyphosate and cancer in laboratory animals. In one instance, a fresh statistical analysis was inserted – effectively reversing the original finding of a study being reviewed by IARC. In another, a sentence in the draft referenced a pathology report ordered by experts at the U.S. Environmental Protection Agency. It noted the report “firmly” and “unanimously” agreed that the “compound” – glyphosate – had not caused abnormal growths in the mice being studied. In the final published IARC monograph, this sentence had been deleted.”

Neither the IARC nor any of the contributing scientists approached would answer any questions from Reuters. Unlike EFSA, which shows the stages of its deliberations leading up to its final opinion, the IARC process is opaque and keeping of drafts is discouraged. This would matter less if these critical changes – for which no-one will take responsibility – had not come to light.

The IARC claims to select scientists for its working groups on the basis of “their expertise and the absence of real or apparent conflicts of interest.” Well, they may try to occupy the moral high ground, but they have singularly failed in this instance. The evidence points to a particular outcome having been desired, whatever findings might contradict it.

If this was the behaviour of a private sector company, there would be a scandal, but the likelihood of change in the IARC is minimal at best. In the meantime, we have the prospect of MEPs virtuously voting to deprive farmers of one of their most useful and least toxic pesticides. To put it mildly, this is depressing.

Posted in Food and agriculture, Newsletter, Policymaking, Science

Monochrome Vision

As most readers will already be very well aware, toxicity is a relative term. But for the general public, this simple but important concept is all too often misunderstood. For many people, if something is toxic, then it’s dangerous, end of message. To complicate this, many of us are easily capable of a rather bizarre form of Orwellian double-think: we may strive to avoid all traces of synthetic pesticides, for example, while being perfectly willing to consume a wide range of natural chemicals that are demonstrably carcinogenic. In both cases, the dose makes the poison, but one is ‘good’ and one is ‘bad’.

This black or white, good or bad view of things can lead to poor decisions and unwanted outcomes. On one hand, wanting to eliminate pesticide residues on produce has led to more efficient means of crop protection, minimising spraying or extending the interval between spraying and harvesting, for example. On the other hand, highly effective compounds which are toxic in smallish doses but can be used perfectly safely with simple precautions have been phased out because of the hazard they present if these precautions are not taken. Their (lower toxicity) replacements may be less effective, so increasing losses and food costs.

Similarly, it is a failure to take account of the evidence that low doses of radiation cause no ill-effects (and may even have a positive effect on living things, a phenomenon known as hormesis) that has inflated the costs of building new nuclear power stations considerably. Once a safety standard has been set, it’s a very brave legislator who relaxes it, whatever the evidence in favour of doing so.

And yet a new fleet of fail-safe nuclear generating stations is just what is needed at present. They would provide an ideal way of slashing carbon dioxide emissions while providing a reliable, secure supply of electricity; a least-regrets option whether or not climate change turns out to be as significant as projected.

In an opinion piece this week, the inimitable Matt Ridley argued that Politics is obsessed with virtue signalling. In essence, the means becomes much more important than the result. Adam Smith pointed out long ago that free markets and competition create wealth, despite the fact that they are driven by that most selfish of purposes, the profit motive. But the profit motive is decried despite its key role in creating wealthy societies, and socialism is regarded as ‘good’ despite the clear lessons from every country that has taken that road in all but its mildest form.

Transport has been a battleground for different interpretations of what is ‘good’. Diesel engines were favoured via the tax system when carbon dioxide emissions were the primary focus. However, the diesel engine recently became ‘bad’ because of the higher level of particulates and nitrogen oxides produced by these engines compared with their petrol equivalents.

This has been exacerbated by the ‘Dieselgate’ scandal in which VW (and others?) deliberately used software that improved the official test results for some engines, while allowing much higher levels of emissions under real use conditions. Even without this, the nature of official emissions tests means that in many cases the results are falsely reassuring when compared to the reality of everyday driving.

A combination of pollution charges for entering some urban areas and scrappage schemes offered by many manufacturers have been designed to take older diesels off the road. However, another result has been to discourage sales of new diesels, despite there being little to choose in terms of overall emissions between petrol and diesel variants under the latest Euro 6 standard.

All of this is, in the longer term, overshadowed by the intention of manufacturers to move solely to hybrid and all-electric cars, although there is little sign that all conventional vehicles are likely to disappear over the next twenty or so years without considerable public subsidy. Battery technology may continue to improve, but the increasing demand for lithium and other essential elements is also driving up prices of these commodities.

In the meantime, this demonization of diesel may turn out to be something of a blind alley. The justification at present is the need to reduce urban air pollution to meet the latest standards and reduce negative impacts on health. This would be helped enormously by using hybrid cars in electric-drive mode in built-up areas, but on the open road diesel or petrol still makes more sense overall.

Diesels are, of course, still emitting similar amounts of PM2.5 and NOx into the air wherever they are driven, but out of town these emissions are diluted in a much greater volume of air. Since it is the concentration – the dose – that is important, any health impact is minimised. It is the geography of urban streets that contains emissions and creates the problem.
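The point about dilution can be made concrete with a toy calculation. The emission mass and mixing volumes below are purely illustrative assumptions, not measured figures; they simply show how the same emissions produce very different concentrations in a confined street versus open country.

```python
# Toy illustration: the same mass of pollutant diluted into different
# volumes of air gives very different concentrations (the "dose").
emitted_ug = 10_000_000.0   # 10 g of pollutant, in micrograms (hypothetical)

street_canyon_m3 = 1e5      # assumed mixing volume of a confined urban street
open_road_m3 = 1e7          # assumed mixing volume over an open road

urban_conc = emitted_ug / street_canyon_m3    # µg/m³
rural_conc = emitted_ug / open_road_m3        # µg/m³

print(f"Urban street: {urban_conc:.0f} µg/m³")
print(f"Open road: {rural_conc:.0f} µg/m³")
print(f"Concentration ratio: {urban_conc / rural_conc:.0f}x")
```

With these assumed volumes, identical emissions give a hundred-fold higher concentration in the street canyon, which is the whole of the urban air quality problem in miniature.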

Campaigners are pushing for similar reductions in emissions from shipping. Ships use very low-grade fuels, as we can readily see from what comes out the funnel. This may not be very aesthetic, but the exhaust is rapidly diluted in the open air over the sea. Since even busy sea lanes are very uncrowded places compared to cities or even rural roads, this obvious pollution causes no practical problem, while making good use of heavy oil fractions which would otherwise be effectively wasted.

This is not an argument for simply ignoring air quality. This has certainly improved enormously over the last 50 years, but could still be better. However, it would be better to look at the end result and prioritise goals rather than take a sledgehammer – banning the internal combustion engine – to crack a nut – urban air quality.

The campaign against the internal combustion engine is couched in terms of air pollution at present, but the main driver behind it is still the ambitious goal of slashing carbon dioxide emissions. Cars are a relatively soft target for politicians to focus on at present, but the real heavy lifting comes with the enormous task of developing a reliable, cost-effective, zero-carbon electricity supply capable of heating homes and offices and powering all road and rail transport as well as displacing conventional generation sources for present uses of electricity. While this is generally seen as a ‘good’ objective, opinion may change quickly if costs are high, energy security is lower and the rest of the world fails to follow suit.

Posted in Climate change, Newsletter, Nuclear energy, Pollution, Science, Transport

Does fracking have a future in Europe?

Last year, the UK government gave a cautious go-ahead to fracking, after years of a moratorium following concerns about minor earth tremors near the Cuadrilla exploration site in Lancashire. The company has since restarted test drilling in the area, against intense opposition from activists determined to prevent the birth of a viable onshore gas and oil extraction industry in the UK.

These activists may not have managed to sway the government in Westminster, but they have the Scottish government on their side (Scottish government backs ban on fracking). Although the intention to continue the existing moratorium indefinitely needs to be endorsed by the Holyrood parliament, only Conservative MSPs will oppose it, so it is a done deal, to all intents and purposes.

The SNP appear to see no downside to their decision. After all, they follow France, Germany and Ireland – and even Bulgaria – in imposing a de facto ban. England seems to be out of step with the rest of the EU, and campaigners are determined to prevent a potentially successful shale gas industry from developing and tempting other countries to change their position.

The Westminster government has often been in the minority on environmental issues across the EU, mainly because ministers tend to follow the advice of scientific advisers rather than the demands of campaigners. On GM crops and the ban on use of neonicotinoid insecticides, for example, the UK has been in the minority in listening to and following the advice of its scientists.

When the SNP government originally introduced a temporary embargo in 2015, it did the right thing in commissioning a series of expert studies to inform its later position. These covered economic impacts, decommissioning and aftercare, climate change impacts, seismic activity, health impacts and community impacts. These studies were completed last year and were broadly positive about the well-regulated use of fracking (What do the Scottish government’s six fracking reports say?).

The only exception was regarding health, where Health Protection Scotland concluded from a literature review that there was insufficient evidence to draw conclusions. Nevertheless, they recommended that, if fracking went ahead, “a precautionary approach could be adopted which involves operational best practice, regulatory frameworks and community engagement.”

The Scottish government also carried out a public consultation, and it was this that took precedence. The Energy Minister, Paul Wheelhouse, said the consultation had come back with ‘overwhelming’ opposition to fracking, with 99% of the 60,000 respondents supporting a ban. He said this showed that ‘there is no social licence for unconventional oil and gas to be taken forward at this time’.

However, this consultation gives a falsely democratic justification for the ban. First, the response rate was very low: about 1.5% of the number of registered voters in Scotland and, since the consultation was open to anyone, an even lower percentage of the potential turnout. Second, the consultation website (Talking “fracking” – a consultation on unconventional oil and gas) was not one to be dipped into lightly by the average citizen, instead offering considerable background information and requiring a degree of interest found amongst campaigners rather than the man or woman in the street.
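The scale of that response rate is easy to verify. The figure of roughly 4 million registered voters in Scotland used below is an assumed round number for illustration, not one taken from the consultation itself.

```python
# Back-of-envelope check on the consultation's reach.
respondents = 60_000
registered_voters = 4_000_000   # assumed approximate Scottish electorate

response_rate_pct = respondents / registered_voters * 100
ban_supporters = round(0.99 * respondents)   # the reported 99% supporting a ban

print(f"Response rate: {response_rate_pct:.1f}% of registered voters")
print(f"Respondents supporting a ban: {ban_supporters:,}")
```

On these assumptions, fewer than 60,000 self-selected responses stand in for an electorate of millions, which is the nub of the 'falsely democratic' objection.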

This is why it is not entirely surprising that the response was as overwhelmingly one-sided as the results of elections for Soviet officials. Campaigners are good at organising their supporters to respond to exercises such as this. Companies involved in the industry also respond, but members of the public who are ambivalent about the whole issue, or frankly not interested, provide no counterbalance.

This is a pity, because a domestic supply of gas to replace the fast-declining North Sea fields would be beneficial both to the economy and to the millions of people in fuel poverty. The elephant in the room, however, is climate change policy. The strategy of most campaigners is to discourage any development of fossil fuels, even as a reliable stop-gap while reliable sources of renewable energy are developed. This trumps all other considerations.

In this vein, the Holyrood parliament has also effectively banned underground coal gasification, which could potentially use the country’s large remaining coal reserves to generate methane. This would avoid the dangers of mining while providing a fuel which is both cleaner and lower in carbon than coal itself.

In practice, these potential new domestic sources of energy are replaced by imported gas, together with the remaining supply from the North Sea. There would be no impact on global carbon dioxide emissions, as the Committee on Climate Change itself acknowledged in the study it submitted to the Scottish government.

South of the border, meanwhile, Ineos (operator of Scotland’s Grangemouth refinery, significant local employer and importer of US shale gas) has exploration rights across large areas of northern England, and Cuadrilla continues to do exploratory drilling. This, though, continues to meet strong opposition from a hard core of activists (Tensions rise at fracking site in UK after police and activists clash).

There is inevitably local opposition from people concerned about potential noise or health effects, following the negative messages from campaigners. But it seems that the main campaigners are not local and use highly disruptive tactics to block legitimate and legal exploration. The chief executive of Cuadrilla, Francis Egan, is quoted as saying the “charade of a so-called peaceful protest should be condemned and halted. I strongly condemn the increased illegal and aggressive behaviour of activists which has put all road users near our Preston New Road site at serious risk. The majority of these irresponsible individuals are from outside the local area and seem determined to use regular disruption to local road users, and abuse and violence towards police and security staff, as so called direct action tactics.”

It is still uncertain whether shale gas can be profitably extracted in the UK, but properly regulated exploration should be allowed to go ahead without being blocked by a highly vocal minority. Only if shale gas production is properly evaluated in one country will we see if it has a future across the rest of Europe.

Posted in Climate change, Energy, Newsletter, Policymaking

Cost and value

I was interested to see this week the seemingly ubiquitous Sir James Dyson announcing that his company is developing a novel electric car (Dyson to make electric cars from 2020). Since a prototype does not yet exist, this is a pretty ambitious goal, but if anyone can do it, it’s probably Dyson. And bullish statements like this are not uncommon in this new sector; Elon Musk is perhaps a role model in this case.

What this new car will look like is anyone’s guess. According to the report, “Important points that are undecided or secret include the firm’s expected annual production total, the cost of the car, or its range or top speed. Sir James said about £1bn would be spent on developing the car, with another £1bn on making the battery.”

This is a very different company from Tesla, of course. Tesla was set up solely to make electric cars, although it has now branched out into selling batteries for domestic energy storage. It is still far from making a profit, but retains the confidence of the market and a share price to match. In many ways, it is analogous to Amazon or Google; a dynamic start-up that many people see as a future leader.

Dyson has little in common with Tesla, other than being fronted by a high-profile public figure and being engineering- (or technology-) based. It is very profitable, holding a strong position both in vacuum cleaners and hand dryers. It sells other products, such as hair dryers and air circulators/heaters, but only the vacuum cleaners and hand dryers are truly mass-market items. Even these are at the premium end of the market, but a £300 hair dryer is in a different league again. As others have said, they make Apple products look cheap.

Not surprisingly, then, Sir James said that the car would not be aimed at the mass market, and promised it would not be cheap. If it sees the light of day, it will compete with the likes of the Tesla model S and, most likely, offerings from the likes of Mercedes and Porsche. How it fares will then depend on what it does better than the competition. As a leader in cutting-edge engineering and design, the new Dyson car will probably do reasonably well, but in a niche market segment.

Even assuming that lack of charging infrastructure, range limitations or other factors do not dampen demand, there are only so many prosperous early adopters in a position to pay a premium for these new cars. There are, of course, plenty of lower-priced alternatives, but they will still be out of reach of many car buyers.

Take the Nissan Leaf, one of the cheaper small family cars and the best-selling all-electric car worldwide to date. The base model Visia has a list price of £21,680, a significant premium over other similar cars. The cost of electric cars becomes more apparent when we see that this price is after the government (ie taxpayer-funded) grant of £4,500 has been taken into account. When an economic price for a small, basic car with a nominal range of 124 miles is over £26,000, the cost base for this sector is obviously very high.

To set against that, low running costs will certainly bring overall ownership costs down (until, that is, the government of the day finds a way to recoup the lost revenue from foregone fuel tax…). On the other hand, there is a rather large elephant in the room: resale values plummet much faster than for most conventional cars. The numbers of early adopters are not matched by second-hand buyers.

All in all, it is difficult to see the ambitions of governments in a number of European countries to facilitate an accelerating switch away from the internal combustion engine being realised. The economic realities of life for most people mean that the transition will be in favour of hybrid vehicles rather than all-electric ones, in the absence of very significant falls in price and improvements in functionality. Electric cars are currently not only relatively expensive, but also offer poorer value for money.

There will doubtless be greater economies of scale when electric cars become more popular. Given that they are mechanically a good deal simpler than their petrol- or diesel-powered equivalents, there should also be some manufacturing cost advantages in the equation. But much of the higher cost resides in the batteries. In the case of the Nissan Leaf, the base model, available for £21,680 (plus £4,500 contributed by other tax-payers) is also available without paying for the batteries up-front, for £5,000 less. The owner then pays for the batteries via a lease arrangement.
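The Leaf arithmetic above can be laid out explicitly, using only the figures quoted in the text:

```python
# Nissan Leaf (Visia) pricing, using the figures quoted in the text.
list_price = 21_680       # £, list price after the grant is applied
grant = 4_500             # £, taxpayer-funded plug-in car grant
battery_discount = 5_000  # £, saving if batteries are leased rather than bought

economic_price = list_price + grant                   # the car's true cost
leased_battery_price = list_price - battery_discount  # up-front, batteries leased

print(f"Economic price: £{economic_price:,}")
print(f"Up-front with leased batteries: £{leased_battery_price:,}")
```

The £26,180 economic price makes the point: roughly £9,500 of the true cost of even this entry-level car is hidden from the buyer, split between the taxpayer grant and the battery lease.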

It’s useful to bear this in mind when hearing that developers of the newest (and largest) offshore wind farm planned in the North Sea (Hornsea 2) have settled for a strike price of just £57.50 per MWh (‘world’s biggest’ wind farm secures Yorkshire coast contract). This compares to the £92.50 per MWh secured for the Hinkley Point C nuclear plant. On the face of it, why would anyone bother with nuclear when wind energy is so cheap?

The answer lies in the value rather than the price. The Hinkley Point scheme is more expensive than necessary because of the choice of a design that has not yet been successfully built to completion. But, when built, it will produce at a rate of over 3GW, day in, day out. The Hornsea scheme has permission for up to 1.8GW of installed capacity, but the Contract for Difference is for 1386MW.

This is the rated capacity, reached when the wind is blowing at the right speed, but in practice the output will be lower (often very much lower) for most of the time. Smoothing the output over modest timescales using batteries would be prohibitively expensive, which means that thermal generating capacity to cover the full rated output would be needed. In practice this would be gas, because of the need to ramp up and down quickly.
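The gap between rated capacity and delivered energy can be sketched with some illustrative numbers. The capacity factors below are assumptions for the sake of the comparison, not figures from either contract: something like 45% for modern offshore wind and 90% for nuclear baseload, with Hinkley Point C taken as roughly 3.2GW.

```python
# Illustrative comparison of rated capacity vs expected annual output.
# Capacity factors are assumed for illustration, not contractual figures:
# ~45% for modern offshore wind, ~90% for nuclear baseload.
HOURS_PER_YEAR = 8_760

def annual_output_gwh(rated_mw, capacity_factor):
    """Expected annual generation in GWh for a given capacity factor."""
    return rated_mw * capacity_factor * HOURS_PER_YEAR / 1_000

hornsea = annual_output_gwh(1_386, 0.45)   # Hornsea 2 contracted capacity
hinkley = annual_output_gwh(3_200, 0.90)   # assumed Hinkley Point C output

print(f"Hornsea 2 (wind): ~{hornsea:,.0f} GWh/year")
print(f"Hinkley C (nuclear): ~{hinkley:,.0f} GWh/year")
```

On these assumptions the nuclear plant delivers over four times the annual energy, and delivers it on demand, which is exactly the difference between price and value that the headline strike prices conceal.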

Because Dong, the wind farm developers and operators, have no responsibility for delivering a secure supply of electricity, the additional costs accrue elsewhere in the system and, ultimately, are borne by the consumer. Dong can operate on a strike price of £57.50 per MWh because that is enough to cover their direct costs in this instance. The headline price may look low, but this project certainly doesn’t represent good value to British consumers. Given a free choice, they might very well choose Hinkley Point C over Hornsea 2, if they knew the full picture.

Posted in Energy, Newsletter, Transport

Working together

I make no apologies for talking about the thorny issue of climate change yet again. There’s a good reason why: after a decade or more of unwillingness to listen to criticism of the IPCC story on climate change, this week a mainstream paper was published in Nature Geoscience that to all intents and purposes shows many of the criticisms to be justified.

Even the most objective-sounding paper is open to different interpretations, and this one is no exception, despite its apparently unambiguous title: Emissions budgets and pathways are consistent with limiting warming to 1.5°C. The message from the authors is seemingly a clear one. If we try hard, global warming can be limited and managed.

To quote from the paper’s summary: “Hence, limiting warming to 1.5°C is not yet a geophysical impossibility, but is likely to require delivery on strengthened pledges for 2030 followed by challengingly deep and rapid mitigation. Strengthening near-term emissions reductions would hedge against a high climate response or subsequent reduction rates proving economically, technically or politically unfeasible.” This straightforward message is taken up by the BBC under the headline Paris climate aim is ‘still achievable’.

The authors’ research was a reassessment of the projections from climate models, the output of which has so often in the past been used to argue that the world was, in effect, already past the point of no return. The iconic figure of a 2°C rise in average temperatures had been taken as the limit above which the net impacts of climate change would become seriously negative. At the end of the 20th Century, stories of what life might be like on an Earth that had warmed by 4, 6 or 8 degrees were commonplace.

The surprising thing about the current study, therefore, is that not only is keeping within a 2° temperature rise seen as practically achievable, but that the more stretching 1.5° target arising from the Paris agreement is also deemed possible. The underlying message is that the computer models have indeed overestimated the extent of warming and that all is not lost. The BBC nevertheless report a conflicting view from another recent paper: Importance of the pre-industrial baseline for likelihood of exceeding Paris goals.

Co-authored by Michael Mann, this paper argues that, as the target temperature is measured against a baseline of pre-industrial conditions, taking an earlier reference point makes the goal more difficult to achieve. This is hardly surprising, given that the last few centuries have seen the planet emerge from the so-called Little Ice Age.

Since cooler conditions have historically been less favourable for farming and hence for society overall, it seems a little perverse to consider this to have been an ideal climate to which we should aspire. Nevertheless, Professor Mann is quoted by the BBC as saying “There is some debate about [the] precise amount of committed warming if we cease emitting carbon immediately. We’re probably very close to 1.5C.”

However, there are alternative views. The Times, for example, put the story on the front page under the headline We were wrong – worst of effects of climate change can be avoided, say experts. The focus of this story is that the present climate models systematically overstate the amount of warming arising from a rise in carbon dioxide levels. In this report, co-author Professor Michael Grubb explains this.

Having said at the Paris summit “All the evidence from the past 15 years leads me to conclude that actually delivering 1.5C is simply incompatible with democracy”, he told the Times “When the facts change, I change my mind, as [John Maynard] Keynes said. It’s still likely to be very difficult to achieve these kind of changes quickly enough but we are in a better place than I thought.”

The comments about facts and evidence are interesting. The ‘facts’ referred to in Paris were the output from the computer models now deemed to be biased. The ‘evidence’ of today is that these models have projected higher temperatures than observed for the century so far. Until the models can produce a reasonable hindcast of the pattern of average temperatures over the past 50 years without incorporating unexplained or unjustified fudge factors, we can hardly place much credence on the forecasts for the next half century. Even then, we have to bear in mind that modelling is not reality, but at least it might then represent our best current understanding and be some guide to the future.

Whatever the different interpretations of this important paper, it is hard to escape the conclusion that the understanding of climate change espoused by the mainstream of scientists and the IPCC is looking more and more like that of sceptics whose views have so often been dismissed and who have regularly been tagged ‘deniers’. This, surely, should be good news, and an illustration of shifts in the paradigm based on an increasing body of evidence. In an ideal world, it means that the common ground can be occupied by many more scientists and policymakers, working together both to improve our understanding of the complexities of climate and to formulate effective and economic ways to deal with any challenges.

Would that this were so, but in practice there is still a clear line separating many critics from the mainstream view: the nature of governments’ response to the situation. The clear message from the latest Nature Geoscience paper is that policymakers need to continue to increase their efforts, but that these will be rewarded by the outcome. Meanwhile, there are others who continue to argue that the consensus remedy – a drastic reduction of carbon dioxide emissions – cannot be achieved with the current tools at our disposal.

No matter how low the price of wind-generated electricity, no matter how sophisticated electric cars become, no matter how quickly we convert domestic heating to electricity, the cost of a secure energy supply given the current state of technology will be very significantly higher than at present. It need not be like this if the best minds on both sides of the ideological divide can work together to develop better solutions.

Posted in Climate change, Energy, Newsletter, Policymaking

Rush to judgement

This is proving to be a pretty bad season for Atlantic hurricanes, after several years in which few intense ones made landfall. Hurricane Harvey, which started in late August, was the first major hurricane to hit the United States mainland since 2005 (Hurricane Wilma, in the same year as the flooding of New Orleans caused by Katrina). Irma, coming along a few days later and only dissipating this week, was a category 5 storm bringing destruction to the Caribbean and Florida.

Despite the intensity of the storms, the total death toll so far is around 150. Each fatality is a personal tragedy, of course, but without evacuation and shelter the number would have been far, far worse. Because of the widespread destruction of property, the communities hit will take quite some time to recover from this, but recover they will. And it’s certain that those same communities will be hit by more violent weather at some time in the future.

Tropical storms – hurricanes in the Atlantic, typhoons in the Pacific – are a fact of life for countries that lie in their path. While we cannot predict them until they begin to form, we can record their characteristics, path and effect in great detail. And, while we can make comparisons across recent decades, we have far less detail about highly destructive storms from the first half of the 20th Century and earlier. This year may have set records, but only over a comparatively short timescale. How it sits with past centuries is anyone’s guess.

Given the severity of the tropical Atlantic storms over the past months, it is human nature to look for a reason why this year has been so much worse than the previous decade. In part, this is because of the vagaries of the path taken by such storms. There have indeed been very intense hurricanes, but most of them over recent years have either not made landfall or have weakened before doing so.

But now, some people are increasingly talking about the impact of climate change (by which we should understand man-made climate change). In particular, they are using the destruction and loss of life in the Caribbean and USA to bolster their call to take radical action to cut greenhouse gas emissions. For example, the Grantham Institute’s Bob Ward, never one to miss an opportunity to bang the drum, contributed an article headlined Irma and Harvey lay the cost of climate change denial at Trump’s door to the Observer.

The moral outrage is palpable: “But the merciless assault on the US mainland by Harvey and Irma should be forcing the president to recognise the consequences of his arrogance and complacency in dismissing the research and analysis carried out by scientists.” If that isn’t enough, the head of the Catholic Church has also made his contribution (Hurricane Irma: Pope Francis condemns climate change sceptics): “Those who deny it (climate change) should go to the scientists and ask them. They are very clear, very precise.”

Most politicians will find it hard to resist this pressure, especially couched in terms that mix moral responsibility with (supposedly) hard science. President Trump is probably the exception, as he is in so many ways. There are undoubtedly others who simply don’t want to put their head above the parapet; there is no political capital to be gained by voicing even mild criticism of radical action.

Note that the terms ‘scepticism’ (a quality that all scientists should have in abundance) and ‘denial’ are increasingly being used almost interchangeably. Normally, we might want to make a distinction between those who completely disagree with a hypothesis on what we believe to be spurious grounds and those who want to discuss more detailed interpretation. Properly grounded criticism from sceptics should strengthen a viable hypothesis but weaken an already dubious one.

In the case of climate change, the vast majority of people dismissed as ‘deniers’ or ‘sceptics’ have a lot of common ground with those who cleave to the received wisdom. Their disagreements are quantitative rather than qualitative. But a lower-than-predicted degree of warming (as has actually been the case for the last 20 years) leads to a different policy prescription: a focus on adaptation rather than mitigation.

This is why climate change activists and many mainstream scientists would apparently reserve a special place in Hell for those who take issue with them, however constructively. Because if the projections of the impact on humans of a 4°C+ rise in average temperatures over this century turn out to be greatly exaggerated, then the entire edifice of climate change policy as enshrined in the Paris Accord could simply collapse. And that, for the IPCC and climate change establishment, would be intolerable.

In fact, nothing in emissions reduction policy would have made a scrap of difference to the two recent hurricanes (nor, for that matter, to heatwaves, droughts or any other extreme weather that some try to ‘link’ to climate change). Communities in vulnerable areas will always be at risk, including the risk of flooding from naturally and steadily rising sea levels. Focussing on adaptation and protection will pay dividends; installing more wind turbines will not.

While using resources to make communities more resilient, there is nothing to stop us continuing to develop alternative energy generation and storage systems and rolling these out as they become dependable and economic. These will make a real impact on future emissions as part of a least-regrets policy. But no amount of moral blackmail will enable us to tune the climate to our liking when long-term natural processes are underway, which we understand very little and cannot control.

Posted in Climate change, Newsletter

Debunking claims about renewable energy costs

Letter in the Times, 13 September

Sir, The claim that the Hornsea Two wind farm “will cut the cost of green energy” (report, Sep 12) is factually correct but nonetheless misleading. Generation costs have undeniably come down, but the real cost is that of delivering a secure supply of electricity to the end user. Even if wind turbines produced electricity at zero cost, the total system cost would be higher than that of a system relying purely on nuclear and gas-fired power stations. The need for back-up means retaining conventional generation capacity, to be run intermittently, inefficiently and at high cost until the holy grail of cheap energy storage on a vast scale can be realised. In the meantime, domestic and industrial consumers will continue to pay higher bills.
Martin Livermore

Scientific Alliance, Cambridge

Posted in Energy, Letter, Uncategorized