Why there was no pause in warming
The fact that global warming has never paused is now quite well established through a slew of careful analyses of the 1998-2013 data across multiple papers. Here I will explain the science and the mathematics behind this conclusion, following the useful open-access review paper that recently came out.
Global temperature evolution: recent trends and some pitfalls - IOPscience
First let us discuss what temperature we are talking about. The temperature data we wish to analyze is the Global Mean Surface Temperature. There are many layers to our planet which take up heat from the sun. The ocean is the most important; it is divided into several layers: the upper layer, the middle layer, and the deep ocean. Similarly, the atmosphere is divided into the lower troposphere, the upper troposphere, the stratosphere, and higher but thinner layers above. Each of these takes up a different share of the heat, but much of our weather depends on surface events... and hence the surface temperature... which combines the land surface and the ocean surface... is both the best measured and best understood part of our climate science.
The key question that the paper (and at least five more) addresses is as follows:
"
While many scientific publications of the past years have discussed an alleged 'hiatus' or 'slowdown' and its possible causes, few have provided any statistical assessment of whether a significant trend change actually occurred. While it is clear and undisputed that the global temperature data show short periods of greater and smaller warming trends or even short periods of cooling, the key question is: is this just due to the ever-present noise, i.e. short-term variability in temperature? Or does it signify a change in behavior, e.g. in the underlying warming trend? In other words, are periods of particularly high or low warming trend significant, in that any of them is unexpected and requires further explanation than just the usual noise in the data? "
Climate, like every complex, chaotic, self-organized system, contains inherent, irreducible randomness and stochastic effects. Other examples that may be familiar to everybody are the turbulent flow of rivers or how a wood fire burns. In each of these, the flow has a general large-scale trend, and superposed on it are lots and lots of eddies and fluctuations. Climate is similar: we have a mean trend and, over it, multiple kinds of periodic and aperiodic fluctuations.
The scientific question therefore is: does the climate data of recent years signify a change in the warming trend, or is it simply the expected stochastic fluctuation? Fortunately, the methods for determining this are quite well developed in statistics, since separating signal from noise is one of the basic capabilities needed in information technology and the science of complex systems.
The most popular way in which stochastic fluctuations are modeled in information theory is through additive white Gaussian noise:
"Additive white Gaussian noise (AWGN) is a basic noise model used in Information theory to mimic the effect of many random processes that occur in nature. The modifiers denote specific characteristics:
- Additive because it is added to any noise that might be intrinsic to the information system.
- White refers to the idea that it has uniform power across the frequency band for the information system. It is an analogy to the color white which has uniform emissions at all frequencies in the visible spectrum.
- Gaussian because it has a normal distribution in the time domain with an average time domain value of zero.
Wideband noise comes from many natural sources, such as the thermal vibrations of atoms in conductors (referred to as thermal noise or Johnson-Nyquist noise), shot noise, black body radiation from the earth and other warm objects, and from celestial sources such as the Sun. The central limit theorem of probability theory indicates that the summation of many random processes will tend to have distribution called Gaussian or Normal."
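To make this concrete, here is a minimal NumPy sketch of what additive white Gaussian noise on top of a steady trend looks like. The trend slope and noise scale below are illustrative values I have chosen, not numbers from any data set.

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed so the sketch is reproducible

# A toy "clean" signal: a steady linear warming trend (arbitrary units)
t = np.arange(50)                # 50 time steps, e.g. years
trend = 0.02 * t                 # constant warming rate of 0.02 per step

# White Gaussian noise: zero mean, constant variance, and independent
# from one time step to the next ("white" = flat power spectrum)
noise = rng.normal(loc=0.0, scale=0.1, size=t.size)

# "Additive": the noise is simply added to the underlying signal
observed = trend + noise

print(f"sample mean of noise: {noise.mean():+.4f} (should be near 0)")
```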
Consider any signal, like the temperature T(t), where t indicates that the temperature depends on time. A simple way to model the temperature is to separate it into a trend component and a Gaussian noise component:
T(t) = T'(t) + c(t)
Where T(t) is the raw variable, T'(t) is the signal and c(t) is the noise.
If a bunch of data values is taken at each of various time points t1, t2, t3..., then we can write for each of these groups
T3 = T'(t3) + c3, and so on for all the other time points (t1, t2, ...)
where T3 is the data value, T'(t3) is the signal value at that time point, and c3 is the noise value. T' is the mean, or expectation value, of the variable (here temperature) at that time. The mean of the noise is zero. So
T'(t3) = E(T | t3)
and
E(c3) = 0
The same holds for all the other data points at the various other times.
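As a toy illustration of this expectation, imagine we had many independent noisy readings at a single time point t3; averaging them recovers T'(t3), because the noise has zero mean. The signal value and noise level below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

true_signal_t3 = 14.5    # hypothetical signal value T'(t3), in deg C
n_readings = 1000        # many noisy measurements taken around time t3

# Each reading is the signal plus a zero-mean Gaussian noise term c3
readings = true_signal_t3 + rng.normal(0.0, 0.2, size=n_readings)

# The sample mean estimates the expectation E(T | t3) = T'(t3),
# because the zero-mean noise averages out
print(f"estimated T'(t3): {readings.mean():.3f}  (true value: {true_signal_t3})")
```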
The mean rate of increase of temperature around a specific time t3 is given by the derivative of the signal at t3:
DT(t3) = dT'/dt evaluated at t = t3
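In practice we do not differentiate a noisy series directly; the usual estimate of DT over a window is the slope of a least-squares straight-line fit. Here is a small sketch on synthetic data, where the underlying rate of 0.018 deg C/yr and the noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic series: a constant underlying rate of 0.018 deg C/yr plus noise
years = np.arange(1970, 2020)
temps = 0.018 * (years - years[0]) + rng.normal(0.0, 0.1, size=years.size)

# A least-squares straight-line fit over a window estimates dT'/dt there;
# np.polyfit with deg=1 returns (slope, intercept)
window = (years >= 1998) & (years <= 2013)
slope, intercept = np.polyfit(years[window], temps[window], deg=1)
print(f"estimated trend over 1998-2013: {slope:+.4f} deg C / yr")
```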
Now that we have defined what the trend is, and what it means to say that the rate of increase of the global warming trend has gone up or down (DT changes), let's see how to assess it.
"Establishing acceleration or deceleration of global temperature means detecting and confirming a change in the trend. Since the trend is distinct from the noise, the influence of noise will always lead to apparent changes. Distinguishing those which are genuine from those induced by noise is the purpose of statistical significance testing...
For our purpose, a significant slowdown or acceleration in global warming is a behavior of global-mean temperature which is highly unlikely to occur under the null hypothesis of a constant warming trend plus short-term random variations as observed in the past (where 'past' refers to a suitably defined baseline period)...
Any claim of a significant slowdown or acceleration would require data that are highly unlikely (e.g. 5% or 10% likelihood depending on the desired confidence level) to be consistent with this null hypothesis....
We consider five prominent global temperature data sets: (i) NASA GISTEMP (Hansen et al 2010, GISTEMP Team 2016), NOAA (Smith and Reynolds 2005, Smith et al 2008), HadCRUT4 (Morice et al 2012), the revision of HadCRUT by Cowtan and Way (2014), and the Berkeley Earth Surface Temperature (Rohde et al 2013). "
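The tests in the paper itself use real baseline fits and treat the noise carefully (short-term variability in temperature is autocorrelated, not purely white). Purely as a toy illustration of the logic, here is a Monte Carlo sketch: simulate many series under the null hypothesis (constant trend plus white Gaussian noise), compute the distribution of short-window trends, and ask how often a trend as low as the "observed" one appears by chance. Every number below (baseline rate, noise level, observed trend) is an assumption made up for this sketch, not a value from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Null hypothesis: a constant warming trend plus short-term Gaussian noise.
# These baseline numbers are illustrative stand-ins, not fitted values.
baseline_rate = 0.018   # deg C / yr, assumed long-term trend
noise_sd = 0.10         # deg C, assumed year-to-year variability
window_len = 16         # length of the tested window, e.g. 1998-2013

def window_trend(series):
    """Least-squares slope of a series against its time index."""
    t = np.arange(series.size)
    return np.polyfit(t, series, deg=1)[0]

# Monte Carlo: distribution of 16-year trends if the null is true
n_sims = 20_000
sim_trends = np.empty(n_sims)
for i in range(n_sims):
    sim = baseline_rate * np.arange(window_len) + rng.normal(0, noise_sd, window_len)
    sim_trends[i] = window_trend(sim)

observed_trend = 0.009  # hypothetical "slowdown" trend to test, deg C / yr

# One-sided p-value: fraction of null simulations with a trend this low or lower.
# Only if this falls below the chosen threshold (e.g. 5% or 10%) could we
# call the slowdown statistically significant.
p = np.mean(sim_trends <= observed_trend)
print(f"p-value under the constant-trend null: {p:.3f}")
```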
The issue might be compared to assessing the form of a tennis player... say Roger Federer. Based on his past performance, we have an expectation of what his form is like. Now we can look at his more recent win/loss statistics to decide whether they are within the expected day-to-day variability of chance, or whether his form has truly dipped or improved. We determine this by calculating how likely the current win/loss record is, given his previous form and its variance.
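For instance (with numbers invented purely for illustration), if his long-run form were an 80% win rate and he won only 12 of his last 20 matches, we could ask how likely a record that poor is under the unchanged-form hypothesis:

```python
from math import comb

# Hypothetical numbers: long-run form is an 80% win rate,
# and the last 20 matches produced only 12 wins
p_win, n, wins = 0.80, 20, 12

# Probability of doing this badly or worse purely by chance,
# if his true form were unchanged (one-sided binomial tail)
p_value = sum(comb(n, k) * p_win**k * (1 - p_win)**(n - k) for k in range(wins + 1))
print(f"chance of <= {wins} wins in {n} matches at 80% form: {p_value:.3f}")
```

A small tail probability would suggest a genuine change of form; a large one means the record is consistent with ordinary day-to-day variability. The same logic underlies the trend-significance tests above.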
Continued...