Friday, June 26, 2015

Errors found in 63% of temperature observations in the Global Historical Climate Network database, caused by Celsius/Fahrenheit rounding and conversion, apart from the original precision being generally unknown, per a peer-reviewed study in the Quarterly Journal of the Royal Meteorological Society, Rhines et al.

The precision technique "can alter the apparent frequency of record-breaking events."

2015, "Decoding the precision of historical temperature observations," Quarterly Journal of the Royal Meteorological Society
Andrew Rhines, Martin P. Tingley, Karen A. McKinnon and Peter Huybers
DOI: 10.1002/qj.2612
"Historical observations of temperature underpin our ability to monitor Earth's climate. We identify a pervasive issue in archived observations from surface stations, wherein the use of varying conventions for units and precision has led to distorted distributions of the data. Apart from the original precision being generally unknown, the majority of archived temperature data are found to be misaligned with the original measurements because of rounding on a Fahrenheit scale, conversion to Celsius, and re-rounding. Furthermore, we show that commonly-used statistical methods including quantile regression are sensitive to the finite precision and to double-rounding of the data after unit conversion. To remedy these issues, we present a Hidden Markov Model that uses the differing frequencies of specific recorded values to recover the most likely original precision and units associated with each observation. This precision-decoding algorithm is used to infer the precision of the 644 million daily surface temperature observations in the Global Historical Climate Network database, providing more accurate values for the 63% of samples found to have been biased by double-rounding. The average absolute bias correction across the dataset is 0.018 °C, and the average inferred precision is 0.41 °C, even though data are archived at 0.1 °C precision. These results permit better inference of when record temperatures occurred, correction of rounding effects, and identification of inhomogeneities in surface temperature time series, amongst other applications. The precision-decoding algorithm is generally applicable to rounded observations, including surface pressure, humidity, precipitation, and other temperature data, thereby offering the potential to improve quality control procedures for many datasets." via Hockey Schtick
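The double-rounding bias the abstract describes can be seen in a few lines of code. This is an illustrative sketch, not the authors' method: an observation rounded to whole degrees Fahrenheit, converted to Celsius, and re-rounded to 0.1 °C generally differs from rounding the true temperature directly.

```python
# Illustrative sketch of the double-rounding bias (not the paper's code).

def f_to_c(f):
    """Convert Fahrenheit to Celsius."""
    return (f - 32.0) * 5.0 / 9.0

def double_rounded(true_c):
    """Round on the Fahrenheit scale, convert, then re-round to 0.1 C,
    as happens to much of the archived GHCN-Daily data."""
    f = true_c * 9.0 / 5.0 + 32.0
    f_rounded = round(f)                 # original whole-degree F reading
    return round(f_to_c(f_rounded), 1)   # archived 0.1 C value

def direct_rounded(true_c):
    """Round the true temperature directly to 0.1 C."""
    return round(true_c, 1)

true_c = 21.3
print(double_rounded(true_c))  # 21.1  (21.3 C -> 70.34 F -> 70 F -> 21.1 C)
print(direct_rounded(true_c))  # 21.3
```

The 0.2 °C discrepancy in this example is exactly the kind of offset that, accumulated over millions of observations, motivates the bias corrections reported above.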


An earlier version of the above, presented by Rhines et al. at the AGU Fall Meeting, Dec. 2014:

Decoding the Surface Temperature Record
Rhines, A. N.; Tingley, M.; McKinnon, K. A.; Huybers, P. J.
Harvard University, Cambridge, MA, United States (Rhines, McKinnon, Huybers); Pennsylvania State University Main Campus, University Park, PA, United States (Tingley)
American Geophysical Union, Fall Meeting 2014, abstract #GC13H-0769
Keywords: 1616 Climate variability; 3252 Spatial analysis; 3270 Time series analysis; 3309 Climatology


"Historical temperature observations from surface stations have been recorded using a variety of units and levels of precision, with metadata that are often incomplete. As a result, the amount of rounding applied to these observations is generally unknown, posing a challenge to statistical methods that are sensitive to the use of discrete data. Methods used to infer distributional changes often assume that data are continuously distributed and can only be reliably applied when the specific discreteness of each sample is known. We present a new technique, termed 'precision-decoding,' that identifies the original precision and units of time series data. Applying it to the GHCND database, we identify temporal and spatial patterns in the precision and units used by surface stations. We show that many archived values have been offset from the original observations due to double-rounding in the presence of conversion between Fahrenheit and Celsius, and provide additional metrics to identify stations in need of further quality control. While the discreteness of the data is unlikely to have influenced global mean temperature trends, we show that it can affect higher-order moments of the temperature distribution such as the variance or skewness, and that it can alter the apparent frequency of record-breaking events."
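The key idea behind precision-decoding is that different original units and precisions leave different fingerprints in the frequencies of recorded values. A toy illustration (the paper itself fits a Hidden Markov Model; this simplified classifier is an assumption for demonstration only): whole-degree Fahrenheit readings converted to Celsius and re-rounded to 0.1 °C can never end in a tenths digit of 5, because the fractional parts of 5k/9 always round to .0-.4 or .6-.9, whereas native 0.1 °C data eventually do.

```python
# Toy fingerprint-based precision detector (NOT the paper's HMM algorithm):
# classify a series of archived 0.1 C values by which tenths digits occur.
from collections import Counter

def tenths_digits(values_c):
    """Histogram of the tenths digit of each archived 0.1 C value."""
    return Counter(round(v * 10) % 10 for v in values_c)

def guess_convention(values_c):
    digits = set(tenths_digits(values_c))
    if digits == {0}:
        return "whole degrees Celsius"
    if 5 not in digits:
        # tenths digit 5 never arises from whole-F -> C double-rounding
        return "whole degrees Fahrenheit, converted"
    return "native 0.1 C"

# Data archived after conversion from whole-degree Fahrenheit readings:
archived = [round((f - 32) * 5 / 9, 1) for f in range(0, 120)]
print(guess_convention(archived))  # whole degrees Fahrenheit, converted
```

A real decoder must also handle mixed conventions within one station record over time, which is why the paper models the convention as a hidden state evolving along the time series rather than classifying the series as a whole.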

6/18/15, "Global Historical Climatology Network - Daily (GHCN-Daily), Version 3," NOAA, Updated: Jun 18, 2015

"The Global Historical Climatology Network-Daily (GHCN-Daily) dataset integrates daily climate observations from approximately 30 different data sources. Version 3 was released in September 2012 with the addition of data from two additional station networks. Changes to the processing system associated with the version 3 release also allowed for updates to occur 7 days a week rather than only on most weekdays. Version 3 contains station-based measurements from well over 90,000 land-based stations worldwide, about two thirds of which are for precipitation measurement only. Other meteorological elements include, but are not limited to, daily maximum and minimum temperature, temperature at the time of observation, snowfall and snow depth. Over 25,000 stations are regularly updated with observations from within roughly the last month. The dataset is also routinely reconstructed (usually every week) from its roughly 30 data sources to ensure that GHCN-Daily is generally in sync with its growing list of constituent sources. During this process, quality assurance checks are applied to the full dataset. Where possible, GHCN-Daily station data are also updated daily from a variety of data streams. Station values for each daily update also undergo a suite of quality checks." 

