Cha-am Jamal, Thailand

Archive for the ‘global warming’ Category

You know who Charles Darwin is, of course, but you may not have heard of his cousin Francis Galton, who did the math for Darwin’s theory of evolution. Two of the many procedures Galton came up with to help him make sense of the data are still used today and are possibly the two most widely used tools in all of statistics: ordinary least squares (OLS) linear regression and OLS correlation.

Both of these statistics measure a linear relationship between two variables X and Y. The linear regression coefficient B of Y against X measures how much Y changes on average for a unit change in X, and the linear correlation R measures how close the observed changes are to that average. The regression and correlation metrics are demonstrated below with data generated by Monte Carlo simulation, in which the degree of correlation is controlled.
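A minimal sketch of such a simulation (an illustration with an assumed true slope of B=5 and varying noise, not the author’s exact code) might look like this:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(noise_sd, n=100, b=5.0):
    """Generate y = b*x + noise and return the OLS slope and correlation."""
    x = np.arange(n, dtype=float)
    y = b * x + rng.normal(0.0, noise_sd, n)
    slope, intercept = np.polyfit(x, y, 1)   # OLS regression coefficient B
    r = np.corrcoef(x, y)[0, 1]              # OLS (Pearson) correlation R
    return slope, r

# Larger noise yields lower correlation and a less reliable slope estimate.
for noise_sd in (400.0, 150.0, 50.0, 25.0):
    b_hat, r = simulate(noise_sd)
    print(f"noise={noise_sd:6.1f}  B={b_hat:5.2f}  R={r:.2f}")
```

Repeated runs at the low-noise settings return slopes tightly clustered around B=5; at the high-noise setting the slope estimates scatter widely, which is the point the charts make.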

In the HIGH (R=0.94) and VERY HIGH (R=0.98) correlation charts, linear regression tells us that on average a unit change in X causes Y to change by about B=5, and this assessment is very consistent. The consistency derives from the low variance of the regression coefficient implied by high correlation. The strong correlation also implies that the observed changes in Y for a unit increase in X are close to the average value of B=5, both over the full span of the data and for any selected sub-span of the time series.

In the LOW (R=0.36) and MID (R=0.7) correlation charts, the regression coefficients are correspondingly less precise, varying from B=1.8 to B=7.1 for LOW-R and from B=3.5 to B=5.6 for MID-R in the five random estimates presented. The point here is that without a sufficient degree of correlation between the time series at the time scale of interest, regression coefficients can still be computed, but the computed coefficients may have no interpretation.

The weak correlations in these cases also imply that the observed changes in Y for a unit increase in X would differ across sub-spans of the time series. The so-called “split-half” test, which compares the first half of the time series with the second half, may be used to examine the instability of the regression coefficient imposed by low correlation.

Correlation is necessary but not always sufficient evidence of causation. Although correlation may imply causation in controlled experiments, field data do not offer that interpretation. If Y is correlated with X in field data, it may mean that X causes Y, that Y causes X, that a third variable Z causes both X and Y, or that the correlation is a fluke of the data with no causation interpretation at all. However, because correlation is a necessary condition for causation, the absence of correlation serves as evidence to refute a theory of causation.

An issue specific to the analysis of time series data is that the observed correlation in the source data must be separated into two parts: the portion that derives from shared long-term trends (which has no interpretation at the time scale of interest) and the portion that reflects the responsiveness of Y to changes in X at the time scale of interest. If this separation is not made, the correlation used in the evaluation may be, and often is, spurious.

An example of such a spurious correlation is shown in the graphic below, taken from the Tyler Vigen collection of spurious correlations. As is evident, the spurious correlation derives from a shared trend; the fluctuations around the trend at an appropriate time scale are clearly not correlated.

The separation of these effects may be carried out using detrended correlation analysis. Briefly, the trend component is removed from both time series and the residuals are tested for the responsiveness of Y to changes in X at the appropriate time scale. The procedure and its motivation are described quite well in Alex Tolley’s Lecture.

[Figure: spurious correlation between two trending time series, from the Tyler Vigen collection]
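Detrended correlation analysis can be sketched as follows (a toy example with two independently generated trending series, not the procedure from the paper linked below):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 100
t = np.arange(n, dtype=float)
# Two series that share an upward trend but have independent fluctuations.
x = 2 * t + rng.normal(0, 30, n)
y = 3 * t + rng.normal(0, 30, n)

r_source = np.corrcoef(x, y)[0, 1]   # inflated by the shared trend

# Remove each series' own OLS trend and correlate the residuals.
res_x = x - np.polyval(np.polyfit(t, x, 1), t)
res_y = y - np.polyval(np.polyfit(t, y, 1), t)
r_detrended = np.corrcoef(res_x, res_y)[0, 1]

print(f"source R = {r_source:.2f}, detrended R = {r_detrended:.2f}")
```

The source correlation is high purely because both series trend upward; the detrended correlation is near zero, exposing the source correlation as spurious.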

The motivation and procedure for detecting and removing such spurious correlations in time series data are described in a short paper available for download at this link: Spurious Correlations in Time Series Data.

It is for these reasons that the argument that “the theory that X causes Y is supported by the data because X shows a rising trend and at the same time we see that Y has also been going up” is specious. For the data to be declared consistent with causation theory, it must be shown that Y is responsive to X at the appropriate time scale once the spurious effect of the shared trend is removed. Examples from climate science are presented in the papers listed below along with the URLs to their download sites.

  1. Are fossil fuel emissions causing atmospheric CO2 levels to rise? Responsiveness of Atmospheric CO2 to Fossil Fuel Emissions
  2. Can sea level rise be attenuated by reducing or eliminating fossil fuel emissions? A Test of the Anthropogenic Sea Level Rise Hypothesis
  3. Can ocean acidification be attenuated by reducing or eliminating fossil fuel emissions? An Empirical Study of Fossil Fuel Emissions and Ocean Acidification
  4. Is surface temperature responsive to atmospheric CO2 levels? #1 Validity and Reliability of the Charney Climate Sensitivity Function
  5. Is surface temperature responsive to atmospheric CO2 levels? #2 Uncertainty in Empirical Climate Sensitivity Estimates 1850-2017
  6. Is surface temperature responsive to atmospheric CO2 levels? #3 The Charney Sensitivity of Homicides to Atmospheric CO2: A Parody

A further caution needed in regression and correlation analysis of time series data arises when the source data are preprocessed prior to analysis. In most cases, the effective sample size of the preprocessed data is less than that of the source data because preprocessing uses data values more than once. For example, taking moving averages involves multiplicity in the use of the data that reduces the effective sample size (EFFN), and the effect of that reduction on the degrees of freedom (DF) must be taken into account when carrying out hypothesis tests. The procedures and their rationale are described in this freely downloadable paper: Illusory Statistical Power in Time Series Analysis.
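The multiplicity argument can be illustrated roughly as follows (an approximation for intuition, not the exact EFFN procedure from the paper): count how many times each source value is reused by the moving average and divide the sample size by the average reuse.

```python
import numpy as np

def moving_average(x, w):
    """Simple trailing moving average with window w."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

def effective_n(n, w):
    """Rough effective sample size: source points divided by the average
    number of times each point is reused across the averages."""
    uses = (n - w + 1) * w          # total data-point uses in the averages
    return n / (uses / n)           # = n**2 / uses, roughly n / w

n, w = 120, 12
print(len(moving_average(np.arange(n), w)))  # 109 smoothed values
print(round(effective_n(n, w), 1))           # ~11 effective observations
```

A 12-point moving average of 120 observations yields 109 smoothed values, but by this reckoning they carry only about 11 effectively independent observations, so hypothesis tests must use far fewer degrees of freedom than the smoothed series suggests.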

Failure to correct for this effect on DF may result in a false sense of statistical power and a faux rejection of the null in hypothesis tests, as shown in this analysis of Kerry Emanuel’s famous paper on what he called the “increasing destructiveness” of North Atlantic hurricanes: Circular Reasoning in Climate Change Research. When the statistics are done correctly, we find no evidence for the claim that “human caused climate change is supercharging tropical cyclones”: A General Linear Model for Trends in Tropical Cyclone Activity.

An extreme case of the effect of preprocessing on degrees of freedom occurs when a time series of cumulative values is derived from the source data, as in the famous Matthews paper on the proportionality of warming to cumulative emissions [Matthews, H. Damon, et al. “The proportionality of global warming to cumulative carbon emissions.” Nature 459.7248 (2009): 829]. It has been shown in the downloadable papers below that a time series of cumulative values has an effective sample size of EFFN=2; therefore there are no degrees of freedom and no statistical power.

  1. Degrees of freedom in the cumulative values of time series #1 Effective Sample Size of the Cumulative Values of a Time Series
  2. Degrees of freedom in the cumulative values of time series #2 Limitations of the TCRE: Transient Climate Response to Cumulative Emissions
  3. Degrees of freedom in the cumulative values of time series #3 From Equilibrium Climate Sensitivity to Carbon Climate Response
  4. Degrees of freedom in the cumulative values of time series #4 The Spuriousness of Correlations between Cumulative Values
  5. Degrees of freedom in the cumulative values of time series #5: Extraterrestrial Forcing of Surface Temperature and Climate Change: A Parody
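The spuriousness of correlations between cumulative values can be illustrated with a small simulation (a sketch, not the derivation in the papers above): the cumulative sums of two completely unrelated random series correlate strongly, because each cumulative series is dominated by its own drift and carries, in effect, only about two independent pieces of information.

```python
import numpy as np

rng = np.random.default_rng(3)

# Many trials: correlate the cumulative sums of two unrelated random series.
rs = []
for _ in range(200):
    a = np.cumsum(rng.normal(1.0, 1.0, 100))  # cumulative values with drift
    b = np.cumsum(rng.normal(1.0, 1.0, 100))  # independent of a
    rs.append(np.corrcoef(a, b)[0, 1])

print(f"median |R| between unrelated cumulative series: "
      f"{np.median(np.abs(rs)):.2f}")
```

Although the underlying series are independent, the correlations between their cumulative values are routinely high, which is why such correlations have no causation interpretation.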


Link

Posted on: May 2, 2016

4/15/2018:  The Charney Sensitivity of Homicides to Atmospheric CO2: A Parody

3/21/2018:  Extraterrestrial Forcing of Surface Temperature and Climate Change: A Parody

3/17/2018:  From Equilibrium Climate Sensitivity to Carbon Climate Response

2/14/2018: Uncertainty in Empirical Climate Sensitivity

8/27/2017: Effect of Fossil Fuel Emissions on Sea Level Rise

7/5/2017: Responsiveness of Atmospheric CO2 to Fossil Fuel Emissions 

7/12/2017: Limitations of the TCRE: Transient Climate Response to Cumulative Emissions

12/1/2016: Illusory Statistical Power in Time Series Analysis

11/21/2016: Some Methodological Issues in Climate Science

11/15/2016: Responsiveness of Polar Sea Ice Extent to Air Temperature 1979-2016

11/1/2016: Responsiveness of Atmospheric CO2 to Fossil Fuel Emissions: Part 2

10/30/2016: Unstable Correlations between Atmospheric CO2 and Surface Temperature

10/21/2016: The Acid Rain Program Part 1: Lake Acidity in the Adirondacks

10/16/2016: Effective Sample Size of the Cumulative Values of a Time Series

9/30/2016: Generational Fossil Fuel Emissions and Generational Warming: A Note

9/24/2016: The Trend Profile of Mean Global Total Column Ozone 1964-2009

9/15/2016: Trend Profiles of Atmospheric Temperature Time Series

08/22/2016: Spurious Correlations in Time Series Data

07/23/2016: SDG: Climate Activism Disguised As Development Assistance

06/13/2016: The United Nations: An Unconstrained Bureaucracy

5/18/2016: Changes in the 13C/12C Ratio of Atmospheric CO2 1977-2014

5/16/2016: Shale Gas Production and Atmospheric Ethane

5/6/2016: The OLS Warming Trend at Nuuk, Greenland

4/30/2016: Dilution of Atmospheric Radiocarbon CO2 by Fossil Fuel Emissions

4/19/2016: The Hurst Exponent of Sunspot Counts

4/12/2016: Seasonality and Dependence in Daily Mean USCRN Temperature

4/1/2016: Mean Global Total Ozone from Ground Station Data: 1987-2015

3/15/2016: Latitudinally Weighted Mean Global Ozone 1979-2015

2/1/2016: The Spuriousness of Correlations between Cumulative Values

1/21/2016: An Empirical Test of the Chemical Theory of Ozone Depletion

11/2015: The Hurst Exponent of Precipitation

11/11/2015: The Hurst Exponent of Surface Temperature

10/14/2015: Responsiveness of Atmospheric Methane to Human Emissions

10/6/2015: An Empirical Study of Fossil Fuel Emissions and Ocean Acidification

9/19/2015: Decadal Fossil Fuel Emissions and Decadal Warming

9/1/2015: Uncertain Flow Accounting and the IPCC Carbon Budget

8/21/2015: Responsiveness of Atmospheric CO2 to Anthropogenic Emissions

7/15/2015: A Robust Test for OLS Trends in Daily Temperature Data

6/2015: A General Linear Model for Trends in Tropical Cyclone Activity

3/1/2015: Uncertainty in Radiocarbon Dating: A Numerical Approach

10/20/2014: Simulation as a Teaching Tool in Finance

6/25/2014: The Rise and Fall of the Arbitrage Pricing Theory

6/11/2014: There is No Chaos in Stock Markets

3/23/2014: The Hamada Equation Reconsidered

More: All papers at ssrn.com/author=2220942

SSU: Sonoma State University

The notion that our carbon dioxide emissions are causing the oceans to warm at an alarming rate, making glaciers flow faster into the sea (Staying afloat in a sinking world, Bangkok Post, November 24, 2010), is logically and scientifically flawed in many ways. I would like to cite only one of them, and it has to do with the Argo Project. It was launched with much fanfare about six years ago. Thousands of robotized floats were installed in oceans around the globe to measure “just how fast the ocean is warming”. By their own reckoning, these measurements provide the most accurate and comprehensive sea temperature data available. Yet, mysteriously, the hype went out of the Argo Project almost as soon as it was implemented. Not only that, the Argo data are apparently being shunned by climate scientists, who prefer the old measuring devices whose inadequacy was apparently the reason they had sought funding for Argo in the first place. NASA’s JPL, the keepers of the Argo data, admitted that the silence is because there are no trends in the temperature or salinity data from the Argo floats. Had the data shown the kind of warming they had hoped to find, the media would have been inundated with that information. The fundamental bias in climate science is that data that do not support its presumptions are not considered valid.

Cha-am Jamal

During 2005 and 2006 the global warming press was abuzz with news about the Argo project – a global effort by climate scientists to cover the earth with thousands of robotized buoys to measure sea temperature. The new devices would aid global warming scientists to “gain new information on the heat trapped in the oceans” and “really track how the ocean is warming” (Sea robots aid climate research, ABC Online, abc.net.au, November 16, 2006).

The initial deployment of the measuring stations was completed in 2007 and more than three years have now elapsed, but we have not heard from the climate scientists about the new information they have found about how the oceans are trapping heat and warming. The line has gone dead. Could it be that they did not find what they had spent all the money and effort to find? It is clear from the language that the effort was not an unbiased study to discover whether the oceans were warming but an effort to confirm that they were warming and to hand skeptics a slam dunk. Instead of silencing skeptics with the new data, however, climate scientists appear to have forgotten about the Argo Project and are now pushing land temperatures.

Cha-am Jamal

The so-called “climate change vulnerability index”, which is likely causing great economic harm to countries like Bangladesh and India by implying that they pose a higher risk to investors, is based on the proposition that “there is growing evidence that climate change is increasing the intensity and frequency” of weather-related natural disasters. In fact there is no such evidence. This idea was included in the IPCC’s 2007 assessment report on the basis of a peer-reviewed research paper, but that paper has since been shown to be flawed and the IPCC has made a full retraction of the claim (UN wrongly linked global warming to natural disasters, The Sunday Times, January 24, 2010). However, this orphaned idea has taken on a life of its own and remains in the media, and apparently even with the architects of the “climate change vulnerability index”. The perpetrators of this falsehood are likely the truly vulnerable parties, having exposed themselves to lawsuits by countries suffering economic harm from their flawed prophecies of doom.

Cha-am Jamal
Thailand

It is reported that there are 6.8 billion humans living on our planet but that it is endowed with natural resources and ecosystems that can support only 4.5 billion. The pressure on the ecosystem thus induced will cause a mass extinction of species by way of global warming and climate change, on a scale comparable with the extinction of the dinosaurs (UN urges action to save species, Bangkok Post, October 19, 2010). This is the old and completely discredited Paul Ehrlich Population Bomb hype of the 1960s and 1970s (2001 an Overpopulation Odyssey, Los Angeles Times, October 22, 1974). It has been resurrected and recycled in the fancy new language of global warming and climate change, apparently to present known falsehoods as climate science. The new global warming hype is thus exposed as nothing more than the old overpopulation pig with lipstick. It is a continuation of the movement by human beings against the habitation of the planet by other human beings but not themselves. This time around, not resource consumption but carbon dioxide emission is presented as the proxy for destructive human activity. Ironically, in the same issue of the Bangkok Post, we read that Europeans are alarmed that phthalates in toys can damage the sexual development of children (The problem with hazardous phthalates, Bangkok Post, October 19, 2010). Those who really believe in the alleged dangers of overpopulation should be comforted by the population control effect of phthalates. That they are alarmed instead shows that the global warming mass extinction alarm is a lie disguised as science, and that the concern with overpopulation is not that there are too many of us but that there are too many other people.

1960s: The over-population theory spreads the fear that there are too many people on earth and that they are breeding too fast. It is predicted that by 1987 human activity will exceed the planet’s ability to sustain us with food, energy, and raw materials. The scenario, explored in the movie “Soylent Green”, is predicted to include Biblical famine and death, anarchy, and the devolution of human society, possibly including cannibalism. Human activity will have destroyed the earth’s ability to sustain human beings.

1970s: The “limits to growth” theory disseminates the fear that society will collapse by the year 2000 because there is a hard upper limit to the amount of fossil fuels, minerals, and other planetary resources that we can consume and therefore a limit to the level of economic growth that is achievable. Continued economic growth will run into this upper limit and cause a complete collapse of civilization as we know it.

1970s: The first ozone depletion scare campaign is waged against the development of the SST high altitude airliner with the allegedly scientific argument that nitric oxide (NOx) in the jet exhaust will deplete ozone in the ozone layer. The campaign is successful and the SST program is canceled. This success emboldens environmental extremists, and the modern version of planetary environmentalism based on fear takes form. Twenty years later, the same scientists, alarmed by falling NOx concentrations in the lower atmosphere, declare that “NOx is the immune system of the atmosphere” because it prevents chlorine from depleting ozone.

1980s: The second ozone depletion scare campaign is waged against refrigerants that contain CFCs, with the claim that human activity is causing an ozone hole over the Antarctic. The campaign leads to the establishment of the Montreal Protocol and a comprehensive ban on the most efficient and inexpensive refrigerants used worldwide. The ozone depletion science is proven wrong, but the media that helped hype the ozone hole scare are silent on the issue, and the ozone hole scare quietly disappears from the media.

1990s to present: The global warming scare campaign rises like a Phoenix from the ashes of the failed ozone hole scare campaign with the theory that carbon dioxide from fossil fuels accumulates in the atmosphere, traps heat, and warms up the planet with catastrophic consequences of Biblical proportions.