A small revolution is happening at the World Meteorological Organization (WMO). Its main governing body, the WMO Congress, is discussing a draft resolution stating that national weather services shall provide free and unrestricted access to climate data. The problem is the fine print, which makes it possible to keep refusing to share important climate data with each other.
The data situation is getting better; more and more countries are freeing their climate data. The USA, Canada and Australia have a long tradition of doing so. Germany, The Netherlands, Finland, Sweden, Norway, Slovenia, Brazil and Israel have just freed their data. China and Russia are pretty good at sharing data. Switzerland has concrete plans to free its data. I have probably forgotten many countries, and for Israel you currently still have to be able to read Hebrew, but things are definitely improving.
That there are large differences between countries is illustrated by this map of data availability for daily mean temperature in the ECA&D database, a dataset that is used to study changes in severe weather. The green dots mark stations whose data you can download and work with; the red dots mark stations whose data ECA&D is only allowed to use internally to make maps. In the number of stations available you can clearly see many national boundaries; that reflects not just the number of real stations, but to a large part national policies on data sharing.
Sharing data is important

We need this data to see what is happening to the climate. We have already had almost a degree of global warming and are likely in for at least another one. This will change the sea level, the circulation and precipitation patterns. It will change extreme and severe weather. We will need to adapt to these climatic changes, and to know how to protect our communities we need climate data.
Many countries have set up Climate Service Centres, or are in the process of doing so, to provide their populations with the information they need to adapt. Here companies, (local) governments, non-governmental organisations and citizens can get advice on how to prepare themselves for climate change.
It makes a large difference how often we will see heat waves like the one in [[Europe in 2003]] (70 thousand additional deaths; Robine et al., 2008), in [[Russia in 2010]] (a death toll of 55,000, a crop failure of ~25% and an economic loss of about 1% of GDP; Barriopedro et al., 2011) or now in India. It makes a large difference how often a [[winter flood like in the UK in 2013-2014]] or [[the flood now in Texas and Oklahoma]] will occur. Once every 10, 100 or 1000 years? If it is 10 years, expensive infrastructural changes will be needed; if it is 1000 years, we will probably decide to live with it. It makes a difference how long droughts like the ones in California or in Chile will last, and being able to make regional climate predictions requires high-quality historical climate data.
One of the main outcomes of the current 17th WMO Congress will be the adoption of the Global Framework for Climate Services (GFCS). It is a great initiative to make sure that everyone benefits from climate services, but how will the GFCS succeed in helping humanity cope with climate change if there is almost no data to work with?
In its own resolution (8.1) on the GFCS, Congress recognizes this itself:
Congress noted that EC-66 had adopted a value proposition for the international exchange of climate data and products to support the implementation of the GFCS and recommended a draft resolution on this topic for consideration by Congress.
To understand climate, we need a global overview. National studies are not enough. To understand changes in circulation, interactions with mountains and vegetation, to understand changes in extremes, we need spatially resolved information and not just a few stations.
Homogenization

To reduce the influence of measurement errors and non-climatic changes (inhomogeneities) on our (trend) assessments we need dense networks. These errors are detected and corrected by comparing one station to its neighbours. The closer the neighbours are, the more accurately we can assess the real climatic changes. This is especially important when it comes to changes in severe and extreme weather, where the removal of non-climatic changes is very challenging.
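The idea of comparing a station to its neighbours can be sketched in a few lines of code. This is a minimal illustration, not any operational homogenization algorithm: nearby stations share the regional climate signal, so subtracting a neighbour's series leaves mainly the non-climatic break, which a simple single-breakpoint statistic can then locate. All names and numbers here are invented for the example.

```python
import numpy as np

def difference_series(candidate, neighbour):
    # Nearby stations share the regional climate signal, so the
    # difference removes it, leaving noise and non-climatic breaks.
    return np.asarray(candidate) - np.asarray(neighbour)

def most_likely_break(diff):
    # Crude single-breakpoint search: find the split point that
    # maximizes the shift in mean, weighted by segment sizes.
    n = len(diff)
    best_i, best_score = None, 0.0
    for i in range(2, n - 2):
        left, right = diff[:i], diff[i:]
        score = abs(left.mean() - right.mean()) * np.sqrt(i * (n - i) / n)
        if score > best_score:
            best_i, best_score = i, score
    return best_i, best_score

# Synthetic example: 100 years of annual means; the candidate station
# gets a 0.5 °C jump (e.g. a relocation) at year 50.
rng = np.random.default_rng(0)
climate = rng.normal(10.0, 1.0, 100)            # shared regional signal
neighbour = climate + rng.normal(0, 0.2, 100)   # nearby station
candidate = climate + rng.normal(0, 0.2, 100)
candidate[50:] += 0.5                           # the inhomogeneity

diff = difference_series(candidate, neighbour)
break_idx, score = most_likely_break(diff)
```

The jump is invisible in the candidate series itself (the year-to-year climate noise is twice as large), but stands out clearly in the difference series — which is why sparse networks, with distant and therefore poorly correlated neighbours, make homogenization so much harder.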
For the global mean land temperature, non-climatic changes already account for 25% of the change: after homogenization (to reduce non-climatic changes), the trend in GHCNv3 is 0.8°C per century since 1880 (Lawrimore et al., 2011; table 4), while in the raw data it is only 0.6°C per century. That makes a large difference for our assessment of how far climate change has progressed, while for large parts of the world we currently do not have enough data to remove such non-climatic changes well. This results in large uncertainties.
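As a quick sanity check, the 25% figure is simply the adjustment expressed as a share of the homogenized trend:

```python
# Numbers from Lawrimore et al. (2011), table 4, for GHCNv3 since 1880
raw_trend = 0.6          # °C per century, raw data
homogenized_trend = 0.8  # °C per century, after homogenization
non_climatic_share = (homogenized_trend - raw_trend) / homogenized_trend
print(f"{non_climatic_share:.0%}")  # prints 25%
```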
This 25% is a global number, but when it comes to the impacts of climate change, we need reliable local information. Locally the (trend) biases are much larger; on a global scale many biases cancel each other. For (decadal) climate prediction we need accurate variability on annual time scales, not "just" secular trends; this is again harder and has larger uncertainties. In the German climate prediction project MiKlip it was shown that a well-homogenized radiosonde dataset was able to distinguish much better between prediction systems and thus to better guide their development. Based on the physics of the non-climatic changes, we expect that (trend) biases are much stronger for extremes than for the mean. For example, errors due to insolation are worst on hot, sunny and calm days, while they are much less of a problem on normal cloudy and windy days, and thus less of a problem for the average. For the best possible data to protect our communities, we need dense networks; we need all the data there is.
WMO resolution

Theoretically, the data exchange resolution will free everything you ever dreamed of. A large number of datasets are mentioned, from satellites to sea and lake level, from greenhouse gases to snow cover and river ice. But exactly for the historical climate station data that is so important for putting climate change into perspective, a limitation is made. Here the international exchange is limited to the [[GCOS]] stations. The total number of GCOS stations is 1017 (1 March 2014). For comparison, Berkeley Earth and the International Surface Temperature Initiative have records with more than 30 thousand stations, and most GCOS stations are likely already included in those. Thus in the end, this resolution will free almost no new climate station data.
The resolution proposes to share “all available data”. But it basically defines that as data that is already open:
“All available” means that the originators of the data can make them available under this resolution. The term recognizes the rights of Members to choose the manner by, and the extent to, which they make their climate relevant data and products available domestically and for international exchange, taking into consideration relevant international instruments and national policies and legislation.

I have not heard of cases where national weather services denied access to data just for the fun of it. Normally they say it is due to "national policies and legislation". Thus this resolution will not change much.
I have no idea where these counterproductive national policies come from. For new instruments, for expensive satellites, for the [[Argo system]] to measure the ocean heat content, it is normally specified that the data should be open to all so that society benefits maximally from the investment. In America the data is naturally seen as free to all because the taxpayer has already paid for it.
In the past there may have been strategic (military) concerns. Climate and weather information can determine wars. However, nowadays weather and climate models are so good that the military benefit of observations is limited. Had Napoleon had a climate model, his troops would have been given warmer clothes before leaving for Russia. To prepare for war you do not need it more accurate than that.
The ministers of finance seem to like the revenues from selling climate data, but I cannot imagine them making much money that way. It is nothing in comparison to the impacts of climate change or the costs of maladaptation, and it will be much less than the money society has invested in the climate observations. An investment that is devalued by sitting on the data and not sharing it.
All that while the WMO theoretically recognises how important sharing data is. In another resolution (9.1), ironically on big data, they write:
With increasing acceptance that the climate is changing, Congress noted that Members are again faced with coming to agreement with respect to the international exchange of data of importance of free and unrestricted access to climate-related information at both global and regional levels.
UN and Data Revolution

In August 2014 UN Secretary-General Ban Ki-moon asked an Independent Expert Advisory Group to make concrete recommendations on bringing about a data revolution in sustainable development (). The report indicates that too often existing data remain unused because they are released too late or not at all, are not well documented and harmonized, or are not available at the level of detail needed for decision-making. More diverse, integrated, timely and trustworthy information can lead to better decision-making and real-time citizen feedback.

All this while citizen scientists are building up huge meteorological networks in Japan and North America. The citizen scientists are happy to share their data, and the weather services should fear that their closed datasets will soon become a laughing stock.
Free our climate data

My apologies if this post sounds angry. I am angry. If that is reason to fire me as chair of the Task Team on Homogenization of the WMO Commission for Climatology, so be it. I cannot keep my mouth shut while this is happening.
Even if this resolution is a step forward, and I am grateful to the people who made it happen, it is inexcusable that in these times the weather services of the world do not do everything they can to protect the communities they work for and freely share all climate data internationally. I really cannot understand how the limited revenues from selling data can seriously be seen as a reason to accept huge societal losses from climate change impacts and maladaptation.
Do not ask me how to solve this deadlock, but WMO Congress, it is your job to solve it. You have until next Friday, the 12th of June.
[UPDATE. It might not be visible here because there are only a few comments, but this post is read a lot for one without a connection to the mass media. That often happens with science posts that do not say anything controversial. (All the scientists I know see the restrictions on data exchange as holding climate science back.) Also the tweet to this post is popular, I have never had one like this before; please retweet it to show your support for the free exchange of climate data.]
[UPDATE. Wow, the above tweet has now been seen over 7,000 times (4 June, 19h). Not bad for a small station-data blog; I have never seen anything like this. Also Sylvie Coyaud, who blogs at La Repubblica, now reports about freeing climate data (in Italian). If there are journalists in Geneva, do ask the delegates about sharing data, especially when they present the Global Framework for Climate Services as the prime outcome of this WMO Congress.]
Related reading

Nature published a column by Martin Bobrow of the Expert Advisory Group on Data Access, which just wrote a report on the governance of scientific data access: Funders must encourage scientists to share.
Why raw temperatures show too little global warming
Just the facts, homogenization adjustments reduce global warming
New article: Benchmarking homogenisation algorithms for monthly data
Statistical homogenisation for dummies
A framework for benchmarking of homogenisation algorithm performance on the global scale - Paper now published