Archive for the ‘Science’ Category


December 1, 2016

Every now and again we hear about people who, through their heroic deeds, managed to save other human beings from certain death. These compassionate acts of bravery typically happen during wars, terror attacks or natural disasters. But I recently became aware of someone who helped save 1 billion people, and did it in peacetime. To add another twist to the story, his efforts were funded by the fortunes of two of America’s greatest industrialists, who had by then been dead for decades. This hero, whose name hardly anyone knows, is Norman Borlaug.

In 1960 the International Rice Research Institute was established in the Philippines with the financial assistance of two charitable trusts – the Rockefeller Foundation and the Ford Foundation. The institute worked on developing new genetic varieties of rice capable of better yields. In 1961 they crossed the Chinese dwarf rice known as Dee-geo-woo-gen with the tall, bushy Peta rice from Indonesia. This produced the semi-dwarf variety which became known as IR8. In field trials under optimal conditions IR8 delivered an astonishing yield of over 10t per hectare – the average rice yield in the Philippines at that time was about 1t per hectare.

Following the introduction of the IR8 rice variety, annual rice production in the Philippines rose from 3.7 to 7.7 million tons. Similar results were achieved in other countries. The person asked by the Indian government in 1961 to modernise its agriculture was Norman Borlaug, who already had an impressive track record of introducing high-yielding wheat varieties in Mexico (funded by the Rockefeller Foundation). Thanks to the adoption of IR8 and improvements in agricultural practices, the average rice yield per hectare in India rose from 2t in the 1960s to 6t in the 1990s, which helped avert mass famine. The price fell from $550 per ton to $200 per ton, making rice more affordable to the poor. According to some estimates, the Green Revolution spearheaded by Norman Borlaug saved up to 1 billion people from starvation.

Like all achievers, Norman Borlaug has his share of detractors. In particular the Greens hate the intensive agriculture ushered in by the Green Revolution because it relies on industrial fertilisation and uses much more water than the 1-ton-per-hectare approach. These miserable, negative people would be happy to sacrifice the lives of millions in the name of ecological purity. It seems that being despised by the Greens is almost a prerequisite for having a meaningful and productive life these days, but I digress…



Told ya

May 19, 2016

Da-boss went on record before with the claim that climate change may be less of a threat to humanity than a number of other, little-publicised scenarios like superbugs.

I am not buying into the argument that because some analyses indicate potential for catastrophic warming by 2100 we as a humanity should automatically commit all available resources to combat this particular (perceived) threat. There is any number of possible scenarios which might spell doom to mankind and we have to prioritise based on the credibility of individual threats. Other serious contenders are for example (…) Super-bugs

What I did not know is how much deadlier antibiotic-resistant microbes are likely to be than global warming, but a recently published piece of research provides a good starting point for the comparison. An article on the BBC website summarises the findings of the Review on Antimicrobial Resistance, which started in 2014:

The review says the situation will get only worse with 10 million people predicted to die every year from resistant infections by 2050. And the financial cost to economies of drug resistance will add up to $100 trillion (£70 trillion) by the mid-point of the century. (…) Lord Jim O’Neill, the economist who led the global review, told the BBC: (…) “If we don’t solve the problem we are heading to the dark ages, we will have a lot of people dying.”

This is a very sobering prospect which we cannot afford to ignore. The problem is that a lot of contingency funding in national budgets has already been committed to fighting another threat to humanity – global warming. So how do the two scenarios compare in terms of their lethality? How deadly is climate change likely to be by mid-century? The WHO has a factsheet which quantifies it:

Between 2030 and 2050, climate change is expected to cause approximately 250 000 additional deaths per year, from malnutrition, malaria, diarrhoea and heat stress.

The direct damage costs to health (i.e. excluding costs in health-determining sectors such as agriculture and water and sanitation), is estimated to be between US$ 2-4 billion/year by 2030

So there we have it. The pet cause of the environmentalists climbing oil rigs and vandalising gas stations is likely to kill 250,000 people a year, while drug resistance in bugs – which few people worry about – could cause up to 40 times more deaths. Another quote from my previous post sums it up:
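As a back-of-the-envelope check of the "40 times" figure, here is the arithmetic on the two projections quoted above – a sketch comparing headline numbers, not a forecast:

```python
# Projected annual death tolls quoted above:
# O'Neill review (resistant infections, by 2050) vs WHO (climate, 2030-2050).
amr_deaths_per_year = 10_000_000
climate_deaths_per_year = 250_000

ratio = amr_deaths_per_year / climate_deaths_per_year
print(ratio)  # 40.0 – hence "up to 40 times more deaths"
```

Of course the two estimates come from different methodologies and time horizons, so the ratio is indicative only.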

All of this does not mean that the AGW is a fantasy but, being only one of many possible global threats, it should compete for the contingency funding with other nasty scenarios like accidental release of smallpox virus, emergence of drug-resistant E.coli etc. Instead, AGW has become the only thing activists obsess about, which is dangerously narrow-minded.



The Chernobyl nature reserve

May 2, 2016

To those used to the doom and gloom reporting surrounding the Chernobyl disaster this article may come as a surprise:

The exclusion zone around the Chernobyl nuclear plant, which was evacuated in 1986 after a devastating explosion and fire, has become a wildlife haven on a par with heavily-protected nature reserves, scientists have found.

A detailed survey of the huge forested area around the stricken plant has revealed that it is teeming with large animals such as elk, roe deer, red deer, wild boar and wolves despite being contaminated with radioactive fallout

Even more encouragingly:

The scientists found no evidence to support earlier studies suggesting that wildlife in the region had suffered from the radiation released after the Chernobyl accident of 1986


The absence of human activity in the exclusion zone has benefited the wildlife of the region more than any possible damage it may have suffered as a result of coming into contact with radioactive elements, the researchers said.

So, to sum it all up, we now know that the hysterical predictions made by the likes of Greenpeace were politically motivated bollocks – the wildlife around Chernobyl is doing remarkably well. The main threat to nature is not exotic nuclear contamination but rather humans going about their everyday lives. But don’t tell this to the greenies who, following that logic, might want to eliminate humanity rather than nuclear power…



The anatomy of scientific fraud

July 18, 2015

Hardly a day goes by without the media reporting the results of recent scientific research into our health and well-being. Sometimes these news items (the three examples linked below are all taken from recent issues of the New Zealand Herald) appear perfectly reasonable:

Exercising cuts risk of breast cancers – scientists

Women aged 50-plus urged to do at least five hours of exercise a week.

sometimes they sound a bit puzzling:

Bad moods ‘make sugary foods taste less sweet’ – study

Our emotions affect flavour and can dull the sweetness of sugary foods, according to a study. The research also found sour foods taste even more sour when you’re feeling down.

and sometimes downright weird:

Drinking orange juice may raise risk of skin cancer – research

Drinking just two glasses of orange juice a day could increase your risk of getting the deadliest form of skin cancer.

The study reported in the screenshot below falls between puzzling and weird but otherwise does not look out of the ordinary:


So what – eating chocolate can aid weight loss and has a number of other health benefits. If drinking orange juice can cause skin cancer then the chocolate study findings do not strike one as particularly suspect. This, however, was not your normal scientific study but rather a sting operation. It was carried out by journalist John Bohannon, who set out to prove that it is possible to publish a paper based on seriously flawed research in a scientific journal, and that the mainstream media will report its findings as fact.

I Fooled Millions Into Thinking Chocolate Helps Weight Loss. Here’s How

You might think that Mr Bohannon falsified the results the paper was based on. Outright fraud like that would be very difficult to detect, but it is not what happened. The study really was carried out and its results were reported truthfully. The paper is a joke not because Mr Bohannon lied but because the statistical methods employed were flawed and no one picked it up:

Here’s a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a “statistically significant” result. Our study included 18 different measurements—weight, cholesterol, sodium, blood protein levels, sleep quality, well-being, etc.—from 15 people. (One subject was dropped.) That study design is a recipe for false positives.

Think of the measurements as lottery tickets. Each one has a small chance of paying off in the form of a “significant” result that we can spin a story around and sell to the media. The more tickets you buy, the more likely you are to win. We didn’t know exactly what would pan out—the headline could have been that chocolate improves sleep or lowers blood pressure—but we knew our chances of getting at least one “statistically significant” result were pretty good.

So they recorded 18 different physiological measurements but reported only the ones catchy enough to make the headlines, and – because of the statistical variation inherent in a small study population – they were always going to get some statistically significant results to report. Very clever.
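The "lottery ticket" effect Bohannon describes is easy to simulate. A minimal sketch, assuming each of the 18 measurements is an independent test at the conventional p < 0.05 threshold (a simplification – real measurements are correlated, but the point stands):

```python
import random

def chance_of_false_positive(n_measurements=18, alpha=0.05, trials=100_000):
    """Estimate how often a study measuring many outcomes reports at least
    one 'significant' result by luck alone. Under the null hypothesis each
    measurement independently clears the p < alpha bar with probability alpha."""
    hits = sum(
        any(random.random() < alpha for _ in range(n_measurements))
        for _ in range(trials)
    )
    return hits / trials

random.seed(1)  # reproducibility
p = chance_of_false_positive()
# Analytically: 1 - 0.95**18 ≈ 0.60, so a "significant" headline
# was more likely than not before a single subject ate any chocolate.
print(round(p, 2))
```

Buying 18 "lottery tickets" at a 1-in-20 chance each turns a long shot into a near coin flip.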

According to the publisher of the Health News Review, Gary Schwitzer:

“[John Bohannon is] really only scratching the surface of a much broader, much deeper problem,” Schwitzer says. “We have examples of journalists reporting on a study that was never done. We have news releases from medical journals, academic institutions and industry that mislead journalists, who then mislead the public.” And the pressure to publish or perish, he says, can lead well-intentioned scientists to frame their work in ways that aren’t completely accurate or balanced or supported by the facts. “We are really mired in a mess, the boundaries of which few people really have a sense for,” says Schwitzer.

Chocolate, anyone?


An extreme risk of misunderstanding

November 28, 2014

Some newspaper articles must be read very carefully in order to understand their real message and realise that the title was quite misleading. A good example is the “Risk from extreme weather set to rise” piece recently published in the Science & Environment section of the BBC website:


In the body of the article we read:

“For most hazards, population increase contributes at least as much as climate change – sometimes more.”

So the problem is largely caused by the development of marginal land to house the increasing population. In other words people end up exposed to the extreme weather because we are running out of safe places to settle.

They warn that the effects of extremes will be exacerbated by the increase in elderly people, who are least able to cope with hot weather.

So some of the problem is caused by the fact people live longer. This is similar to the issue of cancer being “on the rise” – people get cancer at 85 because they did not die of TB at 55. Increased life expectancy in saner times used to be celebrated as a triumph of medicine but these days we manage to present everything as a problem.

Urbanisation will make the issue worse by creating “heat islands” where roads and buildings absorb heat from the sun.

Similar to the above – a technological advancement which saves land and reduces commuting time and transport emissions is presented as a problem. I guess if we all lived in mud huts on one-acre plots of savanna, the scientists would be more positive about the future?

The authors say cutting greenhouse gas emissions is essential. But they argue that governments will also need to adapt to future climatic shifts driven by climate change.

The message is beginning to filter through to the eco-nuts – adaptation is the way to go.

They suggest threats could be tackled through a dual approach. The simplest and cheapest way of tempering heatwaves, they say, is to maintain existing green space.

So (part of) the answer is sensible urban design, with parks and green spaces. This is something urban designers have known for decades.

The authors say air conditioners are the most effective way of keeping cool – but they are costly, they dump heat into city streets and their use exacerbates climate change.

So the air-conditioners work but are expensive to run and only pump heat from one area to another. This is something mechanical engineers have known for decades.

It finds that large-scale engineering solutions like sea walls offer the most effective protection to coastal flooding – but they are expensive, and when they fail the results can be disastrous.

So building dikes is better than putting all houses on stilts but when dikes fail there is trouble. This is something civil engineers have known for decades.

It puts a figure on those at greatest overall risk: populations in poor countries make up only 11% of those exposed to hazards but account for 53% of the disaster deaths.

So wealth buys resilience. This is something most people blessed with common sense have known for decades.
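Those two percentages imply a striking gap in death rates per exposed person. A quick sketch of the ratio implied by the figures quoted above:

```python
# From the article: poor countries hold 11% of the exposed population
# but account for 53% of the disaster deaths.
poor_exposed, poor_deaths = 0.11, 0.53
rich_exposed, rich_deaths = 1 - poor_exposed, 1 - poor_deaths

# Deaths per unit of exposed population, poor relative to rich:
relative_rate = (poor_deaths / poor_exposed) / (rich_deaths / rich_exposed)
print(round(relative_rate, 1))  # ≈ 9.1
```

In other words, an exposed person in a poor country is roughly nine times more likely to die in a disaster than an exposed person in a rich one – which is what "wealth buys resilience" means in numbers.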

Some economists argue this shows that poor nations should grow their economies by burning cheap fossil fuels, because that will allow them to spend more later on disaster protection.


Power to the people (2)

November 24, 2014

In a recent post I commented on an article by two engineers who had done research on the viability of renewable energy, commissioned by Google co-founder Larry Page. Their conclusion was that even a wholesale adoption of today’s renewable generation will not be able to stave off the catastrophic climate change predicted by some climatologists. Does this mean that all hope is lost?


This is where things get interesting. While I do not share Ross Koningstein and David Fork’s obsession with carbon emissions, humanity will at some point need to move on from fossil fuels. Coal and oil have contributed to incredible advancement in all areas of life but they will not last forever. So what does the linked article say we should do?

What’s needed, we concluded, are reliable zero-carbon energy sources so cheap that the operators of power plants and industrial facilities alike have an economic rationale for switching over soon—say, within the next 40 years. Let’s face it, businesses won’t make sacrifices and pay more for clean energy based on altruism alone.

As opposed to the green Nazis keen on regulating, taxing and policing, Messrs Koningstein & Fork prefer a market-driven mechanism. If we come up with new technologies which are economically viable, the operators will make the switch – not for noble reasons like saving the polar bears from extinction but purely to make more money. So far so good. But how to achieve that? Would solar be the way to go?

Solar panels, for example, can be put on every rooftop but can’t provide power if the sun isn’t shining.

Hallelujah! From my observations this simple fact fails to register with 90% of the eco-minded idealists. Yes – coal is dirty but we can burn it after dark or on still days, when wind turbines do not spin. But are there any zero-carbon alternatives?

What, then, is the energy technology that can meet the challenging cost targets? How will we remove CO2 from the air? We don’t have the answers. Those technologies haven’t been invented yet.

There we go. Despite what the environmentalists keep telling us solar and wind are not the answers – the numbers simply do not stack up. But in which direction might we look for the solutions?

A disruptive fusion technology, for example, might skip the steam and produce high-energy charged particles that can be converted directly into electricity. For industrial facilities, maybe a cheaply synthesized form of methane could replace conventional natural gas. Or perhaps a technology would change the economic rules of the game by producing not just electricity but also fertilizer, fuel, or desalinated water. In carbon storage, bioengineers might create special-purpose crops to pull CO2 out of the air and stash the carbon in the soil.

Let me summarise what the authors of the article propose as possible drivers of the zero-carbon reality:

  • Relying on the market forces, as opposed to subsidies and tax disincentives
  • Nuclear fusion, ideally without steam generation
  • Synthesised methane as fuel
  • Producing fertiliser as by-product of power generation
  • Genetically engineered crops to bind carbon from the atmosphere

Every single bullet point above runs against the Green agenda: free market, nuclear, hydrocarbons, the fertiliser industry, genetic engineering. But these are the ideas environmentally minded engineers are left with once they have crunched their numbers. Everything else – solar panels, wind farms, composting – is just PR.

As mentioned above, I do not share the authors’ concern that carbon emissions will cause catastrophic changes to the climate. What this post aims to show is that the environmentalists live in a fairy-land where imaginary solutions are applied to equally imaginary problems.

Power to the people (1)

November 23, 2014

What frustrates me most in the power generation debate are the wishy-washy statements of the environmentalists presented in the media as facts. Conversely, every now and again knowledgeable and reasoned voices can be heard from people with an engineering background who have found an independent source of funding and are not aligned with governments or the Big Green. One such article has just been published and it makes for interesting reading – if one wants to know what the numbers are telling us, that is.


Larry Page is filthy rich. Having made billions (about thirty, to be precise) on the search engine Google, he is now helping search for answers to the pressing problems of humanity. Quoting from Wikipedia:

Page is an investor in Tesla Motors. He has invested in renewable energy technology, and with the help of, Google’s philanthropic arm, promotes the adoption of plug-in hybrid electric cars and other alternative energy investments. Page is also interested in the socio-economic effects of advanced intelligent systems and how advanced digital technologies can be used to create abundance (as described in Peter Diamandis’ book), provide for people’s needs, shorten the workweek, and mitigate the potential detrimental effects of technological unemployment.

As part of his interest in renewable power generation, Larry Page funded in 2007 the Renewable Energy Cheaper than Coal (RE<C) initiative to help drive down the cost of renewable energy. The project was shut down in 2011, but two of the engineers involved in it have now shared what they learnt from the research conducted for RE<C. Their article was published in IEEE Spectrum, the magazine of the Institute of Electrical and Electronics Engineers, and here are some tasty quotes, adorned with my comments:

As we reflected on the project, we came to the conclusion that even if Google and others had led the way toward a wholesale adoption of renewable energy, that switch would not have resulted in significant reductions of carbon dioxide emissions. Trying to combat climate change exclusively with today’s renewable energy technologies simply won’t work; we need a fundamentally different approach.

I have long claimed that the current crop of renewable generation technologies is of limited usefulness, but my angle is different. Since I do not obsess about CO2 emissions, my main concern is that wind and solar are intermittent, unreliable and prohibitively expensive. But Ross Koningstein & David Fork take things a step further. In their view, even a wholesale adoption of renewable generation as we know it would not materially change things as far as climate change is concerned. This pulls the rug from under the green lobby’s claim that all we have to do is build more wind turbines using money collected by taxing coal and the (perceived) problems will go away. They will not.

RE<C invested in large-scale renewable energy projects and investigated a wide range of innovative technologies, such as self-assembling wind turbine towers, drilling systems for geothermal energy, and solar thermal power systems, which capture the sun’s energy as heat. (…) By 2011, however, it was clear that RE<C would not be able to deliver a technology that could compete economically with coal, and Google officially ended the initiative and shut down the related internal R&D projects.

Competing with coal-fired generation on price was always a tall order and I am not surprised that they failed. Coal is abundant and cheap, and the technology for turning it into power is mature, safe and reliable, so I doubt Larry Page ever harboured a realistic hope of delivering something better that would not cost more.

Our study’s best-case scenario modeled our most optimistic assumptions about cost reductions in solar power, wind power, energy storage, and electric vehicles. In this scenario, the United States would cut greenhouse gas emissions dramatically: emissions could be 55 percent below the business-as-usual projection for 2050.

This sounds totally unrealistic but hey, with US$30 billion in the bank one can afford to be a dreamer!

Hansen set out to determine what level of atmospheric CO2 society should aim for “if humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted.” His climate models showed that exceeding 350 parts per million CO2 in the atmosphere would likely have catastrophic effects.

Using the self-declared environmental activist masquerading as a climate scientist as a reference is dubious but let us see where this is going.

We decided to combine our energy innovation study’s best-case scenario results with Hansen’s climate model to see whether a 55 percent emission cut by 2050 would bring the world back below that 350-ppm threshold. Our calculations revealed otherwise. (…) So our best-case scenario, which was based on our most optimistic forecasts for renewable energy, would still result in severe climate change, with all its dire consequences: shifting climatic zones, freshwater shortages, eroding coasts, and ocean acidification, among others.

So, cutting to the chase, even if all stars aligned perfectly – political will, technological development, co-operation of the general population – we still would not be able to return to the “safe” CO2 levels. Bummer.


Those calculations cast our work at Google’s RE<C program in a sobering new light. Suppose for a moment that it had achieved the most extraordinary success possible, and that we had found cheap renewable energy technologies that could gradually replace all the world’s coal plants—a situation roughly equivalent to the energy innovation study’s best-case scenario. Even if that dream had come to pass, it still wouldn’t have solved climate change. This realization was frankly shocking: Not only had RE<C failed to reach its goal of creating energy cheaper than coal, but that goal had not been ambitious enough to reverse climate change.

There we have it. If you believe that CO2 is bad then we are already as good as cooked.


The less the better

November 16, 2014

In this post I will share with readers some recent technological advances aimed at making the internal combustion engine more fuel-efficient.

Three years ago I bought a brand new Suzuki Swift with a 1372cc normally aspirated petrol engine. It is such a fantastic car that I was surprised to learn Suzuki are about to improve on it by fitting a new “DualJet” 1242cc engine which will use much less fuel. Before we get into how they managed to achieve this, let us look at the physical and engineering realities which determine fuel economy.

In essence, internal combustion automotive engines convert the chemical energy in hydrocarbons into the mechanical energy required to move the vehicle. In any such process some energy will be wasted – mainly as heat which is dispersed into the atmosphere. Certain thermodynamic rules govern the overall efficiency of an engine and, at the practical level, they mean for example that:

  • Higher compression of fuel in cylinders leads to better thermal efficiency
  • Smaller engines have lower friction and inertial losses which minimises waste
  • Turbo-charged engines recover some of the wasted energy but cost more to build and service

I will now look at how the recent engine designs by various car makers take advantage of one or more of the above principles.

The EcoBoost family of Ford engines are small-displacement turbo-charged petrol units. Since their turbo only activates when the car is pushed hard, most of the time they operate as economical small engines with low friction and inertial losses. Additional savings come from the fact that the EcoBoost engines are lightweight, reducing the kerb weight of the car. What a brilliant idea – a well-engineered compact and lightweight engine with more power available through a turbo-charger which only kicks in when needed. Here is a cutaway of an EcoBoost engine, with the turbo on the right.


General Motors chose a different way of making their engines more economical, known as variable displacement. Would it not be great to have a small engine for cruising at modest speeds and a larger one for uphill stretches and acceleration off the mark? Well, this is what Active Fuel Management effectively provides. The 5.3L engine in Chevrolet Silverado switches off four of its eight cylinders when the power demand is low. If the pedal is pushed hard all cylinders seamlessly activate and full power is available. The fuel saving is less impressive than in the EcoBoost engines but there are no extra costs associated with the turbo. Great.

Hybrid petrol/electric cars utilise a somewhat similar concept, in that a small-displacement internal combustion engine is supplemented by a battery-powered electric power train. When driven at normal speeds the Toyota Prius is powered by a small, economical petrol engine. During uphill acceleration the electric motor activates, giving the extra push when required. Extra fuel saving is provided by regenerative braking: the kinetic energy of the decelerating car is converted into electric power and stored in the batteries to be used when the car accelerates again. Well done, Toyota.
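To get a feel for what regenerative braking has to work with, here is the kinetic energy carried by a hypothetical ~1,400 kg car at 100 km/h – the mass and speed are illustrative assumptions, not Prius specifications, and real recovery is well below this upper bound:

```python
# Illustrative figures for a mid-size hatchback.
mass_kg = 1400
speed_ms = 100 / 3.6  # 100 km/h converted to m/s

# Kinetic energy E = 1/2 * m * v^2 is the theoretical maximum
# a regenerative braking system could recover in one full stop.
kinetic_energy_j = 0.5 * mass_kg * speed_ms ** 2
print(round(kinetic_energy_j / 3.6e6, 2), "kWh")  # ≈ 0.15 kWh per stop
```

A fraction of a kilowatt-hour per stop sounds modest, but in stop-start city driving those fractions add up, which is why hybrids shine in urban fuel-economy figures.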


But what have Suzuki done to make the engine in my Swift obsolete? They went for a higher compression ratio. Instead of the usual 10.5, their new “DualJet” unit compresses the fuel/air mix in the cylinders by a factor of 12.0, which gives higher thermal efficiency. This was not easy to achieve, as the petrol/air mix tends to self-ignite when compressed – a phenomenon known as “pinging”. But Suzuki engineers found that by improving the cooling of the engine block, re-shaping the combustion chamber and using two injectors per cylinder they could suppress pinging. The result is a small and economical engine which, according to the official figures, reduces the fuel consumption of the new Swift to 4.3L/100km.
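The gain Suzuki is chasing can be sketched with the ideal Otto-cycle efficiency formula, η = 1 − r^(1−γ), where r is the compression ratio and γ ≈ 1.4 for air. Real engines fall well short of these ideal numbers, but the trend with compression ratio is the point:

```python
GAMMA = 1.4  # heat capacity ratio for air

def otto_efficiency(r):
    """Ideal Otto-cycle thermal efficiency for compression ratio r."""
    return 1 - r ** (1 - GAMMA)

old, new = otto_efficiency(10.5), otto_efficiency(12.0)
print(f"{old:.1%} -> {new:.1%}")  # ≈ 61.0% -> 63.0% (ideal-cycle figures)
```

A couple of percentage points of ideal efficiency may not sound like much, but compounded over the life of a car it is a meaningful fuel saving – provided pinging can be kept at bay.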


There seems to be no end to progress in the world of technology.

Steven Koonin on climate change

September 21, 2014

This is possibly the best overview of what we do and do not know about climate change, written by Steven Koonin, who was undersecretary for science in the Energy Department during President Barack Obama’s first term. It is well worth reading in full but I will reproduce a few crucial passages for those too busy to click on links:

The idea that “Climate science is settled” runs through today’s popular and policy discussions. Unfortunately, that claim is misguided. It has not only distorted our public and policy debates on issues related to energy, greenhouse-gas emissions and the environment, but it also has inhibited the scientific and policy discussions that we need to have about our climate future.


The crucial scientific question for policy isn’t whether the climate is changing. That is a settled matter: The climate has always changed and always will (…) Nor is the crucial question whether humans are influencing the climate. That is no hoax: There is little doubt in the scientific community that continually growing amounts of greenhouse gases in the atmosphere, due largely to carbon-dioxide emissions from the conventional use of fossil fuels, are influencing the climate (…) Rather, the crucial, unsettled scientific question for policy is, “How will the climate change over the next century under both natural and human influences?”


We often hear that there is a “scientific consensus” about climate change. But as far as the computer models go, there isn’t a useful consensus at the level of detail relevant to assessing human influences (…) These and many other open questions are in fact described in the IPCC research reports, although a detailed and knowledgeable reading is sometimes required to discern them. They are not “minor” issues to be “cleaned up” by further research. Rather, they are deficiencies that erode confidence in the computer projections (…) Yet a public official reading only the IPCC’s “Summary for Policy Makers” would gain little sense of the extent or implications of these deficiencies. These are fundamental challenges to our understanding of human impacts on the climate, and they should not be dismissed with the mantra that “climate science is settled.” (…) Policy makers and the public may wish for the comfort of certainty in their climate science. But I fear that rigidly promulgating the idea that climate science is “settled” (or is a “hoax”) demeans and chills the scientific enterprise, retarding its progress in these important matters.


Any serious discussion of the changing climate must begin by acknowledging not only the scientific certainties but also the uncertainties, especially in projecting the future. Recognizing those limits, rather than ignoring them, will lead to a more sober and ultimately more productive discussion of climate change and climate policies. To do otherwise is a great disservice to climate science itself.

The above so brilliantly encapsulates my own views that I do not have much to add. Had it been written by a sceptic, this essay would be dismissed as a Big Oil PR job, but the fact that it was penned by a former senior science official in the Obama administration should lend it more credibility.


The end of the rotten weather

September 2, 2014

This is an environmentally responsible post with a twist. In the past I have gleefully reported on the so-called “pause” in global warming (here and here), evidenced by the plateau in the temperature record after 1997. To the delight of climate sceptics, there has been no warming in the last 17-18 years. But I may yet have to eat humble pie since, according to a research paper by the US Naval Research Laboratory published in Geophysical Research Letters and reported in the Guardian:

the “pause” will soon end and the warming will resume. The research paper shows convincingly that the temporary hiatus in the recorded warming was due to low solar activity and an absence of El Niño events, which would otherwise push temperatures up. Both factors are due to shift in the near future, and the relentless CO2-driven warming will be back with a vengeance. A few quotes from the Guardian article are reproduced here for your convenience.

The analysis shows the relative stability in global temperatures (…) is explained primarily by the decline in incoming sunlight associated with the downward phase of the 11-year solar cycle, together with a lack of strong El Niño events. These trends have masked the warming caused by CO2 and other greenhouse gases. As solar activity picks up again in the coming years, the research suggests, temperatures will shoot up at 150% of the rate predicted by the UN’s Intergovernmental Panel on Climate Change.

So there you have it – the science has once again proved sceptics like me wrong. The research paper even specifies exactly when we can expect the resumption of the warming on steroids:

The world faces record-breaking temperatures as the sun’s activity increases, leading the planet to heat up significantly faster than scientists had predicted for the next five years

So, shamed and disgraced, I may decide to ditch blogging and chain myself in protest to the nearest SUV within the next five years.

Ok – now the twist. Have you noticed that the article in the Guardian is dated 27 July 2009? This was almost exactly … let me count it … 14 take away 9 – five years ago!

How cool is this?

August 25, 2014

There are some indications that, after a 17-year-long pause in the warming trend, the global climate may be entering a cooling phase of the cycle. As always, da-boss is here to report and comment on this exciting development.

The climate change debate is typically dominated by propaganda and speculation so let us start by reviewing the graph of the actual global temperature record for the last 30 years:


To my semi-trained eye the fat black line shows a plateau starting around 2002, and if we look at the coloured wiggle there does not appear to have been much warming since maybe 1997. I discussed the reliability of the temperature measurements in a series of posts a while ago but, imperfect as it is, the global temperature record appears to show no warming for the last 15 or so years. This may surprise you, since the media regularly report that the situation is “worse than we thought” and demand urgent action to reduce CO2 emissions. Heat waves and droughts are presented as evidence of global warming and a sign of things to come. But, logically, if there has been no overall warming in the last decade and a half, there should also have been some unusually cold spells to balance things out. The mainstream media are not that keen to report them, but here are some examples I managed to track down.

As shown in the plot below, data from University of Illinois Cryosphere Today shows that Antarctic Sea Ice Extent Anomaly has been positive since July 5th, 2011. We are now on day 1001 of positive anomaly based on the 1979-2008 baseline.
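As a quick sanity check on the day count in the quote, simple date arithmetic (a sketch, taking the start date of 5 July 2011 from the quote above) tells us when “day 1001” fell:

```python
from datetime import date, timedelta

# Day 1 of the positive anomaly is 5 July 2011, per the quote above,
# so day 1001 falls 1000 days later.
start = date(2011, 7, 5)
day_1001 = start + timedelta(days=1000)
print(day_1001)  # 2014-03-31
```

This suggests the quoted snippet dates from around the end of March 2014, a few months before this post.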

So the Antarctic ice cover has been above average for almost three years now and is currently at an all-time high. This may not mean much in terms of global trends, but if we get excited about the shrinking Arctic ice cover then why ignore the news from the South Pole?

A guide who’s been hiking the mountain for the past seven years (…) laughed when he was asked about the likelihood that Kilimanjaro’s snowcap would disappear soon. The glaciers, he claimed, no longer are shrinking, but growing.  “Before, we were seeing glaciers melting,” he explained during a recent descent from the summit. “But from 2010 to now, we have been seeing new glaciers.”

Glaciers in parts of the greater Himalayas are growing despite the worldwide trend of ice melting due to warmer temperatures, a study has found.

Hazards common in arctic and alpine areas but described as “extremely unusual” in the UK during the summer have been found on Ben Nevis. A team of climbers and scientists investigating the mountain’s North Face said snowfields remained in many gullies and upper scree slopes. On these fields, they have come across compacted, dense, ice hard snow call neve. Neve is the first stage in the formation of glaciers, the team said.

So some existing glaciers are growing and new ones are forming. This is not a big deal in the scheme of things but perhaps worth noting considering the amount of media coverage devoted to other glaciers shrinking.

So far the Summer of 2014 is shaping up to be the coldest summer on record in the U.S.A., with temperatures rarely breaking the 90-degree mark.

Weather records have tumbled across North America, with freezing temperatures even in the southern US.

So the US had an unusually cold winter and summer. This is not that remarkable – cold spells are bound to happen from time to time – as long as a heat wave somewhere else is viewed in the same context. To me the above news reports simply point to the variability of climate, but some people believe that a cooling trend is beginning to develop.

a sharp cooling change appears to be developing and set to hit in the next five years

The PDO cool mode has replaced the warm mode in the Pacific Ocean, virtually assuring us of about 30 years of global cooling, perhaps much deeper than the global cooling from about 1945 to 1977.

You may say that the Earth’s climate is always either warming or cooling so it does not matter which way things go but, historically, a colder climate has done more damage to humans than heat:

It wasn’t generic climate “change” that doomed Mediterranean civilizations around 1,200 B.C., just as it wasn’t generic climate “change” that revitalized them several hundred years later. Cool temperatures shorten growing seasons. Cool temperatures also reduce evaporation from the seas, resulting in less precipitation over land. The result is fewer months to grow crops, colder temperatures during the growing season, and less rainfall to hydrate the crops. Crop failures and famine predictably follow. By contrast, warmer temperatures lengthen growing seasons, facilitate more oceanic evaporation, and produce more vital rainfall to hydrate crops. Climate “change” doesn’t destroy crop production, climate cooling does.

This is very interesting in the context of us spending 1 billion dollars a day to fight global warming.

Global climate update

December 17, 2013

Recognising the depth of concern about environmental issues, da-boss runs periodic updates on the goings-on in the weather and climate. So, has anything remarkable happened since the previous post on climate?

Let us start by revisiting the dire prediction Al Gore made in December 2008:

If the link does not open on your PC: Al Gore prophesies in the clip that the Arctic ice cap would disappear in 5 years. Well, 5 years have just passed, so let us check how the Goracle’s prediction squares with reality:


From this image, prepared by NSIDC on 15/12/2013, it looks like the Arctic ice cover is quite extensive, so another environmental prediction has turned out to be a complete flop. I may be the last man alive who reckons that public figures should be accountable for what they say, but I am eagerly expecting a public pronouncement from Al Gore admitting he was wrong in 2008. In case I miss it, could you please post a link to a news item reporting Al Gore’s retraction in the comment section?

A failed environmental prediction by a serial liar like Al Gore does not mean there are no weather extremes happening in the world. For example, a powerful typhoon hit the Philippines in November. The resulting 6,000 or so casualties, while tragic, were nowhere near the death tolls of other tropical storms. In fact Haiyan rated outside the top 35 tropical cyclones in this sad statistic:


Predictably, there has been no trend in the number of typhoons hitting the Philippines in the last 100 years:


so I guess tragedies like Haiyan are going to happen once in a while – like they have in the past.

Looking for the recent weather anomalies I have come across the reports of a snowy winter blast in Israel:


and Egypt:


The global temperature metric for November was +0.19C – slightly down from the October figure of +0.29C.


So all is quiet on the climate front – the Arctic ice has not melted as feared, there are some weather extremes but with no trend and the global warming “pause” continues. Nothing much to report.