
Global Warming | Fact or Fiction?

How do you feel about Global Warming?

  • Global Warming is a myth and the climate will stabilize soon. Votes: 4 (3.4%)
  • Global Warming is happening but Humanity has nothing to do with it. Votes: 8 (6.9%)
  • Global Warming is happening and Humanity is partly to blame. Votes: 41 (35.3%)
  • Global Warming is happening and Humanity is mostly to blame. Votes: 52 (44.8%)
  • Global Warming is happening and Humanity is the only cause. Votes: 8 (6.9%)
  • Don’t know, don’t care. Votes: 3 (2.6%)

  Total voters: 116

LegionOnomaMoi

Veteran Member
Premium Member
How was it misrepresented?

Here, I'll provide you the full Nature article so that you don't have to rely on some summary: Role of sulphuric acid, ammonia and galactic cosmic rays in atmospheric aerosol nucleation


Explain those research papers and what they really meant. I did read them. Post what they mean in your own words, like you have asked me to do since you say you understand this science so well.

Which paper? They argue different things. For example, Jager's paper, although it explains a great deal of relevant solar activity, doesn't actually get very far into the mechanisms through which these drive the climate. Rather, he uses an argument similar to the one AGW theory uses when examining proxy records of temperature and CO2: correlation. Essentially, Jager examines the magnetic fields associated with solar phenomena and cycles, both short term and long term (and the difficulties in predicting future patterns due to the dynamical nature of solar phenomena). However, while Jager doesn't address how the solar dynamo influences climate (other than to note that it is through equatorial and polar magnetic fields), he uses temperature trends and solar trends of the past to establish a connection between these fluctuations in the magnetic field and temperature trends, introducing a difference equation relating tropospheric temperature to the geomagnetic field strength associated with polar activity (the aa index), the number of sunspots (R), and the fraction of equatorial and polar magnetic fields that actually influences the troposphere/climate:

T = yR + z(aa) + c + ΔT

This nonlinear equation, as an iterated function, generates a temperature trend as a result of solar activity, and a regression line (a least-squares solution) shows that the two are highly correlated. As it is impossible for Earth's temperature to cause solar activity, this correlation must have some other explanation. Jager doesn't give an answer, but merely notes that establishing what the cause is will be a challenge.
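To make the method concrete, here is a minimal sketch (in Python) of the correlation step just described. Everything in it, the coefficients, the toy solar series, the noise level, is invented for illustration; these are not Jager's actual values or data:

```python
import numpy as np

# Toy illustration of the approach described above: build a solar-driven
# temperature series T = y*R + z*aa + c from a sunspot number R and the
# geomagnetic aa index, then check its least-squares correlation with an
# "observed" series. All numbers here are made up for illustration only.
rng = np.random.default_rng(0)
years = np.arange(1900, 2000)
R = 80 + 60 * np.sin(2 * np.pi * (years - 1900) / 11)   # toy ~11-year sunspot cycle
aa = 20 + 5 * np.sin(2 * np.pi * (years - 1900) / 22)   # toy geomagnetic aa index
y, z, c = 0.002, 0.01, -0.5                             # hypothetical coefficients
T_solar = y * R + z * aa + c                            # solar-driven temperature term
T_obs = T_solar + rng.normal(0.0, 0.05, len(years))     # "observed" = solar term + noise

# Least-squares regression line between the two series; a high correlation
# coefficient r is the kind of result the argument above relies on.
slope, intercept = np.polyfit(T_solar, T_obs, 1)
r = np.corrcoef(T_solar, T_obs)[0, 1]
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, r = {r:.3f}")
```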

The other studies, however, DO provide a causal mechanism. The most common is that the fluctuations in the sun's magnetic field result in variations of GCRs reaching our atmosphere. These particles act as seeds which generate clouds. An increase or decrease of cloud coverage can radically alter the climate, and if a change in the sun's net magnetic field persists over an extended period of time (as it does), then this hypothesis explains the correlation between magnetic flux and global temperature established by research such as Jager's. However, one paper I linked to (Lu's) focuses on a different relationship between magnetic flux and GCR variation: the ozone layer. Lu builds off of previous work which established the connection between human use of things like CFCs and ozone depletion, but finds that cosmic rays drastically increase the effect of CFCs on ozone depletion because they are the major source of the very elements which transform halogenated molecules like CFCs (but also other atmospheric compounds like hydrogen chloride) into their photoactive forms responsible for ozone depletion.

"The CERN experiment only tested one-third of one out of four requirements to blame global warming on cosmic rays
Hmmm...maybe that's because they've only done one experiment. The relationship between GCRs and atmospheric changes, as well as between solar magnetic flux and temperature changes, had already been found.

Cosmic rays would also produce climate-cooling clouds and cool the earth, but that is the opposite of what we're seeing.

First, it depends on where the clouds are. But, just so you know, the whole point is that GCRs COOL THE CLIMATE in general. So a decrease results in less cloud coverage and higher temperatures.

Fortunately we have empirical observations against which we can test these requirements.

According to which long-term data set? Because plenty of studies have shown that over long periods of time magnetic flux correlates more strongly with temperature trends than CO2 does.


Solar magnetic field strength correlates strongly with other solar activity, such as solar irradiance and sunspot number. As is the case with these other solar attributes, solar magnetic field has not changed appreciably over the past three decades (Lockwood 2001).

Oh this is great. I get to do what you do. Here's a response to such claims: The persistent role of the Sun in climate forcing


Has the solar magnetic field shown a long-term positive trend?

Has galactic cosmic ray flux on Earth shown a long-term negative trend?

Has it been shown that cosmic rays can and do successfully seed low-level clouds?

Has it been shown that low-level cloud cover has a long-term negative trend?
All of the above are still being debated. But guess what? Persistent increases in CO2 over a ~30 year period in the 20th century corresponded to a drop in global temperatures. Currently, a 10+ year increase in CO2 has not led to a further upward trend. So, if we use the argument that gaps between the expected results of magnetic flux and GCRs on climate should negate the theory, then we should throw AGW theory out the window. Because we see an even greater disparity between atmospheric CO2 trends and temperature.
 

shawn001

Well-Known Member
Originally Posted by shawn001
How was it misrepresented?

I asked you how this quote in Nature from physicist Jasper Kirkby, about the study he did at CERN, was misrepresented.

"But, Physicist Jasper Kirkby adds, those particles are far too small to serve as seeds for clouds."At the moment, it actually says nothing about a possible cosmic-ray effect on clouds and climate, but it's a very important first step," he says. Nature, 8/24/11

That was in his own words. Then you post the link to the study. You're not addressing the misrepresentation.


Here, I'll provide you the full Nature article so that you don't have to rely on some summary: Role of sulphuric acid, ammonia and galactic cosmic rays in atmospheric aerosol nucleation

I am not relying on some summary, I am quoting him VERBATIM! I know what this study was about and why they did it.

again

The lead scientist in the CERN CLOUD experiment explicitly stated that the experiment "actually says nothing about a possible cosmic-ray effect on clouds and climate." Many other studies have concluded that cosmic rays play a minor role in cloud formation, and have not contributed in any significant way to the global warming over the past 50 years.



I'll address this one first. Because it seems you just post tons of research papers and then say they don't know anything yet.

Here's a response to such claims: The persistent role of the Sun in climate forcing


International Journal of High-Energy Physics
CERN Courier


Here is CERN's response to their work, from Feb 24, 2010:

A test of the causal hypothesis is to examine the correlation as a function of geomagnetic latitude. The 11-year cosmic-ray variation becomes bigger at higher magnetic latitudes because of the effect of the Earth’s magnetic field. Fewer low-energy cosmic rays enter the Earth’s atmosphere near the magnetic equator than near the poles. This effect is measured by the vertical rigidity cut-off (VRCO) – the minimum rigidity for a primary cosmic ray to reach the Earth’s atmosphere – which is computed from the local value of the planet’s magnetic field. Our analysis looked at the differences between the low cloud cover at solar minima in 1985 and 1996 and that at solar maximum in 1990 at different VRCO (Sloan and Wolfendale 2008). These were then compared with the changes in the cosmic-ray rate as measured from neutron monitors located around the world (figure 2). If the dip in the low cloud cover observed in 1990 was caused by the decrease in ionization from cosmic rays then all of the points in figure 2 would follow the line of the cosmic-ray variation, marked NM. They do not.

Recently, Svensmark’s group examined the so-called Forbush decreases in cosmic-ray intensity, which are caused by solar coronal mass ejections. The group found that the six strongest (over the past 20 years) are followed by significant drops in low cloud cover and in other indicators of atmospheric water content. We have examined the evidence in detail and concluded that it is not only statistically weak but that it also needs unphysically long periods (6–9 days) for the change in cosmic-ray flux to manifest itself as changes in cloud cover or the cloud water content.
The correlation between low cloud cover and cosmic rays in figure 1 is presumably therefore not causal because we have found that ionization is not efficient at yielding cloud cover. A more likely cause relates to solar irradiance, not least because the change in energy content of solar irradiance is about 10^8 times that of cosmic rays. In this context, Mirela Voiculescu of Dunarea de Jos University in Romania and colleagues showed correlations between low cloud cover and either the cosmic-ray rate or the solar irradiance in limited geographical areas (Voiculescu et al. 2006). Such areas cover less than 20% of the area of the globe. A close examination of these geographical areas reveals that only the correlation between the solar irradiance and the cloud cover is seen in both solar cycles. By contrast, any correlation that there is with cosmic rays does not appear in both cycles.

Cosmic rays, climate and the origin of life - CERN Courier

which takes care of that claim, although there is way more evidence against it anyway.
 

LegionOnomaMoi

Veteran Member
Premium Member
Originally Posted by shawn001
How was it misrepresented?

Because you did not accurately represent either the nature of the experiment or the findings. The study found that while this first experiment did not settle anything once and for all, it did in fact provide preliminary experimental evidence for the actual mechanics of GCRs in cloud-seeding.

But you didn't read the article, and quoted Kirkby out of context:


"But, Physicist Jasper Kirkby adds, those particles are far too small to serve as seeds for clouds."At the moment, it actually says nothing about a possible cosmic-ray effect on clouds and climate, but it's a very important first step," he says. Nature, 8/24/11
The "particles" in question aren't galactic cosmic rays, but an artificial atmosphere and artificial ionization which doesn't completely simlulate either the effect of GCRs or the atmosphere, as this is just the first experiment. It does, however, provide evidence to support the link between GCRs and climate change.
That was in his own words. Then you post the link to the study. You're not addressing the misrepresentation.
You removed the context and didn't accurately represent the Nature article.

I am not relying on some summary, I am quoting him VERBATIM! I know what this study was about and why they did it.
You are quoting what some article chose to include among the various statements he made. And you distorted Kirkby's point. From your source:
"People are far too polarized, and in my opinion there are huge, important areas where our understanding is poor at the moment," says Jasper Kirkby, a physicist at CERN. In particular, he says, little controlled research has been done on exactly what effect cosmic rays can have on atmospheric chemistry...
Early results seem to indicate that cosmic rays do cause a change. The high-energy protons seemed to enhance the production of nanometre-sized particles from the gaseous atmosphere by more than a factor of ten. But, Kirkby adds, those particles are far too small to serve as seeds for clouds. "At the moment, it actually says nothing about a possible cosmic-ray effect on clouds and climate, but it's a very important first step," he says."

But you conveniently omitted the underlined part. You also didn't bother to quote how your source ends:
"Kirkby hopes that the experiment will eventually answer the cosmic-ray question. In the coming years, he says, his group is planning experiments with larger particles in the chamber, and they hope eventually to generate artificial clouds for study."

The lead scientist in the CERN CLOUD experiment explicitly stated that the experiment "actually says nothing about a possible cosmic-ray effect on clouds and climate."

But that same scientist concludes it was an important first step, because it was the first experiment of many. Yet you didn't include any of Kirkby's statements in your source (or the actual research article) which indicate support for such a link. The reason it says "nothing" is that this experiment was preliminary. Yet your quote mining is an attempt to show that the issue is now settled. Kirkby doesn't think so.
Many other studies have concluded that cosmic rays play a minor role in cloud formation, and have not contributed in any significant way to the global warming over the past 50 years.
I know. Unlike you, I actually read the research rather than quote mining. Which means I also know that many other studies show the opposite.


I'll address this one first. Because it seems you just post tons of research papers and then say they don't know anything yet.

Here's a response to such claims: The persistent role of the Sun in climate forcing
They don't say that, but as you don't seem to bother reading them...

International Journal of High-Energy Physics
CERN Courier

Here is CERN's response to their work, from Feb 24, 2010:

Oh this is TOO funny. You didn't link to any recent CERN research, but a non-scientific article which cites a few studies, the most recent from 2008. However, in 2009 Svensmark et al. published a study in the journal GRL which concludes that "relatively small" variations in GCRs control a great deal of cloud coverage on earth. So instead of quote mining some non-scientific article which makes claims about CERN but doesn't cite any CERN research after 2007, and cites a 2007 article by Svensmark when a more recent paper published by Svensmark et al. finds significant support for the link between GCRs and climate, READ RESEARCH AND STOP QUOTE MINING!
which takes care of that claim, although there is way more evidence against it anyway.
It sure does. I mean, it cites FIVE WHOLE RESEARCH papers. Of course, they don't all agree, and more recent work contradicts the statements made, but as you aren't concerned with accuracy, science, or actual discussion, who cares? All that matters is that you can mine quotes from websites without reading research. That way, you can avoid actually dealing with science.
 

shawn001

Well-Known Member
"The CERN experiment only tested one-third of one out of four requirements to blame global warming on cosmic rays
Hmmm...maybe that's because they've only done one experiment.

That experiment wasn't about climate change.


The relationship between GCRs and atmospheric changes, as well as between solar magnetic flux and temperature changes, had already been found.

Yes, they have found some links; that is not in dispute.

But it's not one that is causing global warming.

In order for this to be true as stated:

"Therefore, in order for this theory to be plausible, all four of the following requirements must be true.
  • Solar magnetic field must have a long-term positive trend.
  • Galactic cosmic ray flux on Earth must have a long-term negative trend.
  • Cosmic rays must successfully seed low-level clouds.
  • Low-level cloud cover must have a long-term negative trend.
Solar magnetic field must have a long-term positive trend.

Solar Magnetic Flux from 1967 to 2009 (Vieira and Solanki 2010)

"solar magnetic field has not changed appreciably over the past three decades"

http://arxiv.org/PS_cache/arxiv/pdf/0911/0911.4396v1.pdf


Galactic cosmic ray flux on Earth must have a long-term negative trend.

Cosmic ray flux on Earth has been monitored since the mid-20th century, and has shown no significant trend over that period.

Cosmic Ray Intensity (blue) and Sunspot Number (green) from 1951 to 2006 (University of New Hampshire)

http://www.skepticalscience.com/images/VieiraandSolanki2010.png

In fact cosmic ray flux has lagged behind the global temperature change since approximately 1970 (Krivova 2003).

"between 1970 and 1985 the cosmic ray flux, although still behaving similarly to the temperature, in fact lags it and cannot be the cause of its rise. Thus changes in the cosmic ray flux cannot be responsible for more than 15% of the temperature increase"



And since 1990, galactic cosmic ray flux on Earth has increased - "the opposite direction to that required to explain the observed rise in global mean temperatures" (Lockwood 2007). In fact, cosmic ray flux recently reached record levels. According to Richard Mewaldt of Caltech, "In 2009, cosmic ray intensities have increased 19% beyond anything we've seen in the past 50 years."


Figure 4: Record cosmic ray flux observed in 2009 by the Advanced Composition Explorer (NASA)
Despite this record-high GCR flux, which we would expect to increase cloud cover and cause cooling, 2009 was tied for the second-hottest year on record, and the 12-month running mean global surface temperature record was broken 3 times in 2010 (NASA GISS).

Inability to explain other observations

In addition to these multiple lines of empirical evidence which contradict the GCR warming theory, the galactic cosmic ray theory cannot easily explain a number of observed fingerprints of the increased greenhouse effect, such as the cooling of the upper atmosphere and greater warming at night than day.

Additionally, because cosmic radiation shows greater variation in high latitudes, we expect larger changes in cloud cover in polar regions if GCRs are successfully influencing cloud cover. This is not observed. Furthermore, after the nuclear reactor accident at Chernobyl, ionization from the radioactivity would be expected to have produced an increase in cloud cover. There is no evident increase in cloud cover following the accident (Sloan 2007).
Galactic cosmic rays can't explain global warming

In summary, studies have shown that GCRs exert a minor influence over low-level cloud cover, solar magnetic field has not increased in recent decades, nor has GCR flux on Earth decreased. In fact, if GCRs did have a significant impact on global temperatures, they would have had a cooling effect over the past 20 years.

What's the link between cosmic rays and climate change?


All of the above are still being debated.

Not so much as a cause, but as a possible contributing factor.

Persistent increases in CO2 over a ~30 year period in the 20th century corresponded to a drop in global temperatures. Currently, a 10+ year increase in CO2 has not led to a further upward trend


Let's see what happens when the CO2 the oceans have been scrubbing is released again, for one.

Global warming 'not slowing down,' say researchers

December 6, 2011

They revealed the true global warming trend by bringing together and analysing the five leading global temperature data sets, covering the period from 1979 to 2010, and factoring out three of the main factors that account for short-term fluctuations in global temperature: El Niño, volcanic eruptions and variations in the Sun's brightness.

After removing these known short-term fluctuations, the researchers, statisticians and climate experts from Tempo Analytics and the Potsdam Institute for Climate Impact Research, showed that the global temperature has increased by 0.5°C in the past 30 years. In all of the five global data sets, 2009 and 2010 were the two hottest years. In the average over all five data sets, 2010 is the hottest year on record.

Global warming 'not slowing down,' say researchers

Global temperature evolution 1979
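As an aside, the kind of adjustment that article describes can be sketched in a few lines of Python. This is only a schematic with synthetic stand-in series (not the researchers' data or code): regress temperature on the short-term factors plus a linear time term, then subtract the short-term contributions:

```python
import numpy as np

# Schematic of the analysis described above, with synthetic stand-in data:
# regress monthly temperature on an ENSO index, volcanic aerosols, and solar
# variation plus a linear time term, then remove the short-term pieces.
rng = np.random.default_rng(1)
n = 384                                   # 32 years of monthly data, e.g. 1979-2010
t = np.arange(n) / 12.0                   # time in years
enso = rng.normal(0.0, 1.0, n)            # stand-in ENSO index
volc = np.abs(rng.normal(0.0, 0.3, n))    # stand-in volcanic aerosol optical depth
solar = 0.1 * np.sin(2 * np.pi * t / 11)  # stand-in solar-brightness variation
temp = 0.017 * t + 0.1 * enso - 0.2 * volc + 0.5 * solar + rng.normal(0.0, 0.1, n)

# Ordinary least squares on the design matrix [1, t, enso, volc, solar].
X = np.column_stack([np.ones(n), t, enso, volc, solar])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
adjusted = temp - X[:, 2:] @ coef[2:]     # temperature with short-term factors removed
print(f"estimated underlying trend: {coef[1]:.3f} deg C per year")
```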


NASA Finds 2011 Ninth-Warmest Year on Record
01.19.12


The global average surface temperature in 2011 was the ninth warmest since 1880, according to NASA scientists. The finding continues a trend in which nine of the 10 warmest years in the modern meteorological record have occurred since the year 2000.

NASA - NASA Finds 2011 Ninth-Warmest Year on Record

 

LegionOnomaMoi

Veteran Member
Premium Member
That experiment wasn't about climate change.
Yes, it was. It was about cloud dynamics, one of the least understood and most important aspects of climate science.



But it's not one that is causing global warming.

According to a few sources you can quote mine anyway.
In order for this to be true as stated:


"Therefore, in order for this theory to be plausible, all four of the following requirements must be true.
  • Solar magnetic field must have a long-term positive trend.
  • Galactic cosmic ray flux on Earth must have a long-term negative trend.
  • Cosmic rays must successfully seed low-level clouds.
  • Low-level cloud cover must have a long-term negative trend.
Solar magnetic field must have a long-term positive trend.

Let's grant that in order to establish the influence of cosmic rays and solar magnetic flux, we need to show long-term trends. First, we have. I linked to articles which showed this. Second, when you point to sources which show that the trends don't correspond with magnetic fields/GCRs and then conclude this disproves the theory, you ALSO DISPROVE AGW THEORY by the same logic. Because the temperature trends and CO2 levels don't match up much of the time, including during the 20th and 21st centuries.

"solar magnetic field has not changed appreciably over the past three decades"

For roughly three decades after the early 20th century warming period, atmospheric CO2 content rose while temperatures either dropped or did not rise. So, even if we grant your quote above, then using it to conclude that solar magnetic flux is not a principal driver of climate change means CO2 isn't either and AGW theory is wrong. The temperature records over millennia are better correlated with solar magnetic fluctuations than with CO2. To dismiss this because you point to research showing that this correspondence isn't perfect is ridiculous. Because THAT'S TRUE FOR CO2 AS WELL.


In fact cosmic ray flux has lagged behind the global temperature change since approximately 1970 (Krivova 2003).

Interesting, given that atmospheric CO2 content has lagged behind temperature change for thousands of years.

"between 1970 and 1985 the cosmic ray flux, although still behaving similarly to the temperature, in fact lags it and cannot be the cause of its rise. Thus changes in the cosmic ray flux cannot be responsible for more than 15% of the temperature increase"

Well, let's extend that logic: between about 1940 and 1970, atmospheric CO2 content continued to rise. However, temperatures at first dropped sharply, rose a bit, then dropped again, and didn't start a steady trend until ~30 years after the initial drop. Ergo (by the above reasoning), CO2 cannot be responsible for temperature changes.

Cosmic ray flux on Earth has been monitored since the mid-20th century, and has shown no significant trend over that period.


Wrong. I've posted several scientific studies from the past 10 years, including very recent studies, which find the opposite. However, as you don't bother to read any real research and rely on quote mining whatever Google can provide you, you conclude the issue settled. None so blind as those that will not see.


Not so much as a cause, but as a possible contributing factor.
As a major contributing factor, along with other natural forces. Again, if you want to point to periods where temperatures don't correspond to GCR fluctuations as evidence that these aren't primary drivers of climate, then you are throwing out CO2 as well.


The global average surface temperature in 2011 was the ninth warmest since 1880, according to NASA scientists. The finding continues a trend in which nine of the 10 warmest years in the modern meteorological record have occurred since the year 2000.
Do you know what the word "trend" means? How about "average"? It doesn't matter if this past year was the warmest in 3,000,000 years. That's not a trend. It also doesn't matter if 9 of the 10 warmest years have occurred since 2000. In order for that to be a trend, they have to increase in a linear fashion. That didn't happen. I showed you graphs from GISS, HadCRU, etc. I gave you the actual "top ten warmest years" chart. But despite all this, you still seem incapable of realizing that a trend requires not just ten of the warmest years, but a general increase during those years. That's not what happened. The temperature remained high, but didn't increase. When you understand "trend" and "average," then go back to the charts and records and indicate a warming trend over the past ~10 years. If you can't do this, then feel free to "refute" the lack of a trend by quoting figures which have nothing to do with a trend.
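Here's the distinction in code, with invented anomaly values (purely illustrative, not GISS or HadCRU numbers): a decade can have a high average while its least-squares trend is essentially zero.

```python
import numpy as np

# Invented anomaly series to illustrate "high average" vs. "warming trend":
# both decades are warm on average, but only one has a positive slope.
years = np.arange(2001, 2011)
hot_but_flat = np.array([0.54, 0.56, 0.55, 0.53, 0.57, 0.54, 0.55, 0.53, 0.56, 0.55])
hot_and_rising = np.array([0.40, 0.43, 0.45, 0.48, 0.50, 0.53, 0.55, 0.58, 0.60, 0.63])

for label, series in [("hot but flat", hot_but_flat), ("hot and rising", hot_and_rising)]:
    slope = np.polyfit(years, series, 1)[0]   # least-squares trend, deg C per year
    print(f"{label}: average = {series.mean():.2f} C, trend = {slope:+.4f} C/yr")
```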
 

Trey of Diamonds

Well-Known Member
Leaked docs: Heartland Institute think tank pays climate contrarians very well

Yesterday, a series of documents that allegedly originated from the Heartland were leaked to a prominent climate blog. The documents reveal that most of the funding for its climate activities comes from a small range of very generous donors, and that big plans are afoot for 2012. If the Heartland has its way, it will fund the launch of a new website by meteorologist and climate skeptic Anthony Watts, and prepare a school curriculum intended to keep teachers from addressing climate science.
 

work in progress

Well-Known Member
I'll have to check up on this, but if memory serves, that was simply one area where good measurements could be taken, and so they were. It's also (in terms of time span) the longest record. But now there are others. And if we look at the annual mean changes of the global set, while there is definitely a pronounced upward trend, it does fluctuate even on a yearly basis.
I have enough awareness of statistics to know that it can turn into a quagmire very quickly based on methodology. That's why I expect the contrarians, such as those who had claimed global temps were on the decline (like Monckton), to prove their claims to other scientists, rather than making me spend hours poring through data and trying to critique it myself. It's similar to what happened when the Intelligent Design movement started 20 years ago. A few smart mathematicians claimed to have proven I.D. by pointing to evolutionary changes that were irreducibly complex. But they weren't able to prove their claims to other mathematicians who were familiar with information theory.

In this example, the numbers look quite conclusive that the trend is upward, and any variations are only temporary. The rate of increase in CO2 is increasing, regardless of any arguments about whether the percentage of increase is on the rise: 2.07 ppm per year, if we use the average for the last decade as an example. Acceleration of Atmospheric CO2 | CO2 Trend | Current CO2
Those numbers are from NOAA, and even if the annual ppm increase varies, the average increase is still over 2 ppm per year.
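Checking an average like that is simple arithmetic. Here's a sketch with placeholder ppm values (roughly the right ballpark for the 2000s, but not NOAA's actual figures):

```python
# Placeholder annual-mean CO2 values in ppm (illustrative, not NOAA's data).
ppm = [369.5, 371.1, 373.2, 375.8, 377.5, 379.8, 381.9, 383.8, 385.6, 387.4, 389.9]

# Year-over-year increases, then their mean: the "average ppm per year" figure.
increases = [b - a for a, b in zip(ppm, ppm[1:])]
print(f"average increase: {sum(increases) / len(increases):.2f} ppm/yr")  # ~2 ppm/yr
```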




Which paleoclimate record is most accurate isn't certain, nor is it certain which proxy source is best.
Offhand, I would say that if there is only one paleoclimate measuring system that correlates well with ice core data, that would be the one to use.


It's definitely going up, that's true. As to the rate, I think I recall a recent slowing, but I have to look into it. In any event, it wouldn't really matter unless the increase actually slowed to the point of stopping.
Unless I see some plausible theory behind a claim that CO2 could decline, I can't accept it as serious. It seems so ludicrous for carbon to actually be able to decrease at a time when we are pumping more into the atmosphere, and almost certainly setting off long-feared positive feedbacks in the Arctic. For CO2 to decline, the planetary carbon sequestration systems would have to increase the amount of carbon they can take out of the atmosphere. I don't see how this is even a possibility, considering that the trend has kept rising for over 150 years now.


Well, a lot of interest seems to be about why/how increases in emissions don't correspond regularly to increases in atmospheric content. We know that it isn't a linear correlation, but why it seems to fluctuate isn't fully understood. Neither is the relationship between CO2 levels and temperature.
I noticed from that hour-long lecture w. slides that I posted previously from geologist Richard Alley, that one of the points he stressed in the middle of his lecture was that past examples of divergence between CO2 and average temps were false...such as one often cited from the Ordovician Era, because only one sample was taken, and that sample was contaminated from improper storage. His theme was that carbon is the only correlating factor with global temperatures, not solar output, or magnetic fields, or cosmic rays...the only trend that matches the temp records has been CO2. It can lag temperature increase, as rising temps can release large amounts of sequestered carbon; but then CO2 becomes a driver of warmer temperatures until it is removed from the atmosphere.
 

LegionOnomaMoi

Veteran Member
Premium Member
I have enough awareness of statistics to know that it can turn into a quagmire very quickly based on methodology.
It can indeed.

It's similar to what happened when the Intelligent Design movement started 20 years ago. A few smart mathematicians claimed to have proven I.D. by pointing to evolutionary changes that were irreducibly complex. But they weren't able to prove their claims to other mathematicians who were familiar with information theory.
As far as data sets go, it's a bit different. There's no "us vs. them" in the way there is in climate science in general (both in the worlds of specialists and non-specialists). Both "deniers" and proponents of AGW disagree among one another about the right proxy sets, the right adjustments, the best statistical measures, etc. There's no consensus on many of the statistical procedures or data sets the way there is on AGW theory in general.

In this example, the numbers look quite conclusive that the trend is upward, and any variations are only temporary.
You're also talking about a fairly small time frame though. Which is where paleoclimatology becomes important:

Offhand, I would say that if there is only one paleoclimate measuring system that correlates well with ice core data, that would be the one to use.
The ice-core data is a proxy set. So are a lot of other things we find in the environment. There's no reason I can think of to conclude a priori that the ice-core data is the most robust.


It seems so ludicrous for carbon to actually be able to decrease at a time when we are pumping more into the atmosphere
Actually that's part of AGW theory (sort of). That is, AGW theory is a general model (with certain specifics widely debated) about climate. Climate, however, involves all sorts of things. AGW theory holds that these "subsystems" of, and external influences on, our atmosphere interact in very complex ways, which result in things like a lack of a warming trend which "should" be there, or a dangerous rise in temperature because of interactions between carbon dioxide and the central internal (i.e., atmospheric) components responsible for temperature fluctuations.

A great deal of the CO2 (roughly half) that we emit doesn't go into the atmosphere. Nor is there a straightforward correlation between emission amount and atmospheric content. We know where a lot of it goes (like the ocean), but we don't know all, and we are also not sure how the system as a whole adapts to changes in content. Take the Gaia hypothesis. This hypothesis holds that the earth gradually tended towards a state of self-regulation. Chaotic systems in general exhibit widely fluctuating behavior. They are aperiodic, they don't settle, they are unpredictable, etc. But they somehow stay within certain "bounds." I think a graphic depiction is best here, and as I went with a logistic graph before, I'll go with the more attractive fractal:
[fractal image]


It's certainly possible (it happens in our brains all the time) for a chaotic system to fluctuate beyond the "normal" regulated chaos. But there are mechanisms which do act in ways described in the Gaia hypothesis, which regulate the effects of changes, even large ones, in or on the climate. So it is certainly possible for the environment to adjust to the increases in CO2 and, through emergent behavior (e.g., increasing certain types of life or decreasing water vapor), absorb far more CO2. Most research suggests, however, that this is not the case, and if anything the opposite is true.
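And since I went with a logistic graph before, here is the same idea as a minimal code sketch: the logistic map at r = 3.9 never settles or repeats, yet every value stays within (0, 1). Bounded, regulated chaos.

```python
# Minimal sketch of "bounded chaos": the logistic map at r = 3.9 is aperiodic
# and sensitive to initial conditions, yet every iterate stays inside (0, 1).
def logistic_orbit(x0, r=3.9, n=50):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

orbit = logistic_orbit(0.2)
print(f"min = {min(orbit):.3f}, max = {max(orbit):.3f}")   # fluctuates, but bounded

# Two nearly identical starting points diverge quickly (sensitive dependence),
# which is why such systems are unpredictable even though they stay in range.
a, b = logistic_orbit(0.2), logistic_orbit(0.2000001)
print(f"separation after 50 steps: {abs(a[-1] - b[-1]):.3f}")
```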

I don't see how this is even a possibility, considering that the trend has kept rising for over 150 years now.

But it's risen for longer periods before (and to greater amounts), and also dropped for long periods. Additionally, the direct measurements only go back to the late 50s. Although records (both direct measurements and proxy records) are good, a more complete understanding is better.


I noticed from that hour-long lecture w. slides that I posted previously from geologist Richard Alley, that one of the points he stressed in the middle of his lecture was that past examples of divergence between CO2 and average temps were false...such as one often cited from the Ordovician Era, because only one sample was taken, and that sample was contaminated from improper storage. His theme was that carbon is the only correlating factor with global temperatures, not solar output, or magnetic fields, or cosmic rays...the only trend that matches the temp records has been CO2. It can lag temperature increase, as rising temps can release large amounts of sequestered carbon; but then CO2 becomes a driver of warmer temperatures until it is removed from the atmosphere.

That's not entirely accurate, but for the most part it is one theory. However, we're talking about correlation sets which involve "lags." In other words, they don't correlate (at least not temporally). So one can generate all sorts of arguments about the lag being just that (a lag), or argue that the increase in temperature caused the CO2 increase, or that something else caused both. That's the problem with using correlation. And if one isn't factoring in theories concerning lags and what causes them and other explanations for why this increase in carbon didn't correspond with that increase in temperature, or vice versa (and the same for GCRs), then solar activity is a better predictor of climate change than CO2. But again, that's why correlations are so difficult (especially given our data), and why there are "lies, damned lies, and statistics."
 

Trey of Diamonds

Well-Known Member
Leaked Docs From Heartland Institute Cause a Stir—but Is One a Fake?

Now, caveats out of the way, here's why I think that memo is probably fake:


1. All of the documents are high-quality PDFs generated from original electronic files . . . except for the "Climate Strategy" memo. (Hereinafter, "the memo"). That appears to have been printed out and scanned, though it may also have been faxed.


Either way, why? After they wrote up their Top Secret Here's All the Bad Stuff We're Gonna Do This Year memo, did the author hand it to his secretary and say "Now scan this in for the Board"? Or did he fax it across the hall to his buddy?
 

work in progress

Well-Known Member
As far as data sets go, it's a bit different. There's no "us vs. them" in the way there is in climate science in general (both in the worlds of specialists and non-specialists). Both "deniers" and proponents of AGW disagree among one another about the right proxy sets, the right adjustments, the best statistical measures, etc. There's no consensus on many of the statistical procedures or data sets the way there is on AGW theory in general.
Yes, except that some of the deniers, most notoriously Monckton, were carefully choosing the periods to measure that would give the appearance of a decrease in global average temperatures. But my point in comparing the math debates in climate science with the intelligent design arguments is that a crank (like William Dembski) who has sufficient expertise in mathematics and information theory can easily construct a mathematical system that can only really be challenged successfully by other mathematicians. That's where a sort of faith in the process of determining the consensus of expert opinion becomes necessary.

You're also talking about a fairly small time frame though. Which is where paleoclimatology becomes important:


The ice-core data is a proxy set. So are a lot of other things we find in the environment. There's no reason I can think of to conclude a priori that the ice-core data is the most robust.
I would still have more confidence when two systems that may each be subject to error can be correlated closely enough to give us some confidence that they are providing something close to what would have been gathered if there had been someone able to measure the atmosphere during that time.


Actually that's part of AGW theory (sort of). That is, AGW theory is a general model (with certain specifics widely debated) about climate. Climate, however, involves all sorts of things. AGW theory holds that these "subsystems" of, and external influences on, our atmosphere interact in very complex ways, which result in things like a lack of a warming trend which "should" be there, or a dangerous rise in temperature because of interactions between carbon dioxide and the central internal (i.e., atmospheric) components responsible for temperature fluctuations.

A great deal of the CO2 (roughly half) that we emit doesn't go into the atmosphere. Nor is there a straightforward correlation between emission amount and atmospheric content. We know where a lot of it goes (like the ocean), but we don't know all, and we are also not sure how the system as a whole adapts to changes in content. Take the Gaia hypothesis. This hypothesis holds that the earth gradually tended towards a state of self-regulation. Chaotic systems in general exhibit widely fluctuating behavior. They are aperiodic, they don't settle, they are unpredictable, etc. But they somehow stay within certain "bounds." I think a graphic depiction is best here, and as I went with a logistic graph before, I'll go with the more attractive fractal:
If there is one single reason why I am more partial to a Gaia theory - that the Earth is self-regulating...or at least attempting to be self-regulating, as opposed to chaotic models of climate (Peter D. Ward wrote a book about it recently called The Medea Hypothesis) - it's the point that James Lovelock frequently stresses for starting him along this thinking over 40 years ago: that while the Sun has grown increasingly stronger over time, most of Earth's recent history has seen ice ages more frequently than warmer, interglacial periods. This is especially true during the Pleistocene leading up to the recent warming of the Holocene 12,000 years ago. It would give the appearance that something is trying to work against the warming effects of the Sun, and it seems more likely that there is some emergent process at work increasing the sequestration of carbon to balance out the Sun's growing intensity. A chaotic, haphazard approach would not provide any likely means to explain why CO2 levels were forced down to such low levels (180 ppm) in more recent times, until humans started affecting climate, possibly right from the beginning of the Holocene.
 

LegionOnomaMoi

Veteran Member
Premium Member
Yes, except that some of the deniers, most notoriously Monckton, were carefully choosing the periods to measure that would give the appearance of a decrease in global average temperatures.

That's true. And what I find interesting is how the same data can be accurately represented (in that neither side is actually lying) by both sides and imply totally different things. You can see one example in this thread. It's certainly true that the last decade has seen most of the highest temperatures on record. And this is what AGW proponents point to. It is also true that the temperature hasn't risen. And this is what skeptics/deniers point to. Both are correct. Even when neither side is actually outright distorting the facts, just presenting them in a particular way can be misleading.



I would still have more confidence when two systems that may each be subject to error can be correlated closely enough to give us some confidence that they are providing something close to what would have been gathered if there had been someone able to measure the atmosphere during that time.

The more correlations the better. But that's the problem. The satellite data is the most accurate method for measuring temperatures we have. But it doesn't match the surface data in terms of climate theory (which predicts we should see more warming in the troposphere, not at the surface). Then there's the fact that none of the proxy records correlate very well with the entire direct temperature data. But why? Is it that our surface records are too inaccurate (i.e., beyond the given margin of error) because they don't adequately account for surface processes? Or are the statistical procedures used to come up with a global average flawed? Or are the proxies inadequate? And if not, which ones are best? That's why while we can be very confident (in my opinion) that humans are at least contributing to a warming trend, how much is a much more difficult question.


If there is one single reason why I am more partial to a Gaia theory - that the Earth is self-regulating...or at least attempting to be self-regulating, as opposed to chaotic models of climate
"Chaos" in the technical sense (dynamical systems) is necessary for self-regulating systems. It's how your brain works, for example.

Chaotic (or dynamical) systems aren't "chaos" in the sense the term is normally used. They are "chaotic" in that we can't predict their evolution easily (or at all) because they don't settle or even fluctuate in a steady way. But that doesn't mean they just chaotically fluctuate all over the place. Rather, many tend to behave irregularly in regular ways. That is, while they fluctuate in ways we can't predict, they do so within a certain range and exhibit certain patterns. Sometimes this involves self-regulation or emergent behavior. We know the climate is an extremely complicated dynamical system. We also know that to some extent this system tends to fluctuate aperiodically but within a certain range. It does self-regulate to some extent. The question is how, to what extent, in what ways, and what will cause it to transition to a totally new phase space?
 

work in progress

Well-Known Member
Yes, except that some of the deniers, most notoriously Monckton, were carefully choosing the periods to measure that would give the appearance of a decrease in global average temperatures.

That's true. And what I find interesting is how the same data can be accurately represented (in that neither side is actually lying) by both sides and imply totally different things. You can see one example in this thread. It's certainly true that the last decade has seen most of the highest temperatures on record. And this is what AGW proponents point to. It is also true that the temperature hasn't risen. And this is what skeptics/deniers point to. Both are correct. Even when neither side is actually outright distorting the facts, just presenting them in a particular way can be misleading.
Where is the data supporting no temperature increases? If you're talking about highs, I know that analysis of meteorological data has shown that record highs are occurring twice as often as record lows in the U.S.

As for average temperatures, the trend has been that the Arctic is warming at twice the rate of lower latitudes. The evidence for a warming Arctic is beyond any serious challenge, since the decline in summer sea ice is getting close to half of what it was 40 years ago, and the increase in open ocean in the Arctic is wreaking havoc with our weather here by busting up the jet stream and allowing cold winter air to move south...as we can see in Europe and Asia's case this year, and record winter temperatures in the far north. But it's stories like this one in my weekend paper which make any challenge of climate change absurd: Head of Canadian navy says climate change boosts need for bigger presence in Arctic
The opening up of the Arctic Ocean is presenting opportunities for oil companies and mining companies to go prospecting in an area that had formerly been locked in; and if anything of value can be developed, it presents huge security risks for Canada as the U.S. and Russia will try to stake claims just off our shores.

As an anecdote, I recall hearing some Inuit leaders over 30 years ago trying to find an audience down south, so they could tell them that everything was changing in the Far North. For example: the Inuit bands that lived along the coasts of Hudson Bay and the Arctic Ocean, and hunted seals, had dozens of words in their languages to describe various snow conditions, but had no word for lightning...because it was a weather phenomenon that had been unheard of until then. People who live off the land, as the Inuit did up until 30 to 50 years ago, had to carry a deep understanding of their environment in order to survive and make a living. They carried on long oral histories to use as a guide; but nothing for many generations back in time matched the weather changes that started happening in the Arctic about 30 years ago, and are starting to make their presence known to the rest of us now.

The more correlations the better. But that's the problem. The satellite data is the most accurate method for measuring temperatures we have. But it doesn't match the surface data in terms of climate theory (which predicts we should see more warming in the troposphere, not at the surface). Then there's the fact that none of the proxy records correlate very well with the entire direct temperature data. But why? Is it that our surface records are too inaccurate (i.e., beyond the given margin of error) because they don't adequately account for surface processes? Or are the statistical procedures used to come up with a global average flawed? Or are the proxies inadequate? And if not, which ones are best?
Which surface measurements are you talking about? The Argo Ocean Floats seem to have the ocean surface pretty well covered. Water is less susceptible to short-term fluctuations than air is, so I would expect that rising temperatures and melting sea ice would give us the long-term trend.
http://www.argo.ucsd.edu/

That's why while we can be very confident (in my opinion) that humans are at least contributing to a warming trend, how much is a much more difficult question.
In the final analysis, it doesn't matter if rising temperatures cause increased floods and droughts and make high-yield agriculture next to impossible. But some of the other causes mentioned frequently, like the Sun, were taken down by the continued temperature increases during a period of lower solar activity.


"Chaos" in the technical sense (dynamical systems) is necessary for self-regulating systems. It's how your brain works, for example.

Chaotic (or dynamical) systems aren't "chaos" in the sense the term is normally used. They are "chaotic" in that we can't predict their evolution easily (or at all) because they don't settle or even fluctuate in a steady way. But that doesn't mean they just chaotically fluctuate all over the place. Rather, many tend to behave irregularly in regular ways. That is, while they fluctuate in ways we can't predict, they do so within a certain range and exhibit certain patterns. Sometimes this involves self-regulation or emergent behavior. We know the climate is an extremely complicated dynamical system. We also know that to some extent this system tends to fluctuate aperiodically but within a certain range. It does self-regulate to some extent. The question is how, to what extent, in what ways, and what will cause it to transition to a totally new phase space?
I see. The Gaia models describe the climate changes as a chaotic transition to new equilibrium points. The problem for us today is that, as we push greenhouse gas levels to highs not reached since the early Cenozoic, we, along with many other animals that depend on tropical rainforests, colder oceans etc., will not likely survive the transition to a hotter earth with no ice.
 

Trey of Diamonds

Well-Known Member
Santorum: Global warming is politics, not science

STEUBENVILLE, Ohio (AP) — Rick Santorum says President Barack Obama is pushing a radical environmental agenda that unwisely limits energy production and turns its back on science.

Santorum told voters in eastern Ohio on Monday that science is on the side of those who want to aggressively produce more oil and natural gas in America. He said the notion of global warming is not climate science but "political science."

Santorum said Obama and his allies want to frighten people about new oil-exploration technologies so they can get your dollars and turn it over to politicians to win elections "so they can control your lives."

Ohio's GOP primary is March 6.

Santorum also planned several campaign appearances later Monday in Michigan. Voters there go to the polls on Feb. 28.
 

LegionOnomaMoi

Veteran Member
Premium Member
Where is the data supporting no temperature increases?

The global temp records put out by GISS, the Met Office/HadCRU, etc. The global temperature has remained high (and, as far as we can tell from our records of direct temperature measurements, at a record high) since ~1998. However, the trend of the last 10+ years has not seen any warming. That is, while the average global temperatures of individual years in the past ~15 years have all been quite high, the average temperature during this period hasn't changed (at least not in any appreciable way, given the margin of error).


If you're talking about highs, I know that analysis of meteorological data has shown that record highs are occurring twice as often as record lows in the U.S.
Record highs matter, but what I'm talking about (and AGW theory is talking about) are trends. Currently, the global average temperature has remained quite high for ~15 years, but there is no warming trend. I'll post this graph again, produced by the East Anglia Climatic Research Unit (Phil Jones is the director):
[Image: nhshgl.gif]



The bottom chart is the global average. Notice that in 1998 there is a huge red spike. Also notice that at around 2000 there is no upward trend, and in fact the temperature average is about what it has been since just before the 1998 spike. According to several recent papers by AGW proponents, this is because natural oscillations are "hiding" a warming trend we would see otherwise. In other words, were it not for even more extreme natural forcings, we would continue to see an upward trend from anthropogenic forcings. The length of this flat trend is roughly half of the period of warming humans are believed to be responsible for, ~1975 to ~1995.


As for average temperatures, the trend has been that the Arctic is warming at twice the rate of lower latitudes.
That's true, but the question is why. If you look back at the graph, you will see the greater increase in the Northern Hemisphere compared to the Southern. However, while the ~15 year flat trend isn't there, there is still no warming trend since ~2000. The temperatures continued to rise for several years, and then dropped again.


Which surface measurements are you talking about?

The combined sets which use temperature records from surface monitoring stations and instruments around the globe, both sea and land.

The Argo Ocean Floats seem to have the ocean surface pretty well covered.
The Argo group is only about a decade old.

Water is less susceptible to short-term fluctuations than air is, so I would expect that rising temperatures and melting sea ice would give us the long-term trend.

The problem is we don't have good long-term data for things like sea ice and even ocean temperatures. Nor is it an easy task. Also, there are several studies which find an increase in Arctic sea ice in several places during the ~30 year period of AGW warming (see, e.g., here, where the researchers found a steady increase during the entire AGW warming period), while others find a decrease in other areas during the same time.



The problem for us today is that, as we push greenhouse gas levels to highs not reached since the early Cenozoic, we, along with many other animals that depend on tropical rainforests, colder oceans etc., will not likely survive the transition to a hotter earth with no ice.
That's certainly a possibility. But I'm not convinced that CO2 levels haven't been higher in the past few thousand years. Same with temperatures. Of course, if the climate system can't adapt to continuing increases in CO2, or cannot sufficiently adapt, and the connection between CO2 and climate systems (e.g., the feedbacks) posited by AGW theory is correct, then it doesn't matter if the current atmospheric content was matched or is less than some point a few thousand years ago, as long as we continue to push it up. Eventually, it will be higher, and the result will be dangerous to catastrophic (again, provided there is a positive feedback parameter and other central tenets of AGW theory are also accurate).
 

work in progress

Well-Known Member
The global temp records put out by GISS, the Met Office/HadCRU, etc. The global temperature has remained high (and, as far as we can tell from our records of direct temperature measurements, at a record high) since ~1998. However, the trend of the last 10+ years has not seen any warming. That is, while the average global temperatures of individual years in the past ~15 years have all been quite high, the average temperature during this period hasn't changed (at least not in any appreciable way, given the margin of error).




Record highs matter, but what I'm talking about (and AGW theory is talking about) are trends. Currently, the global average temperature has remained quite high for ~15 years, but there is no warming trend. I'll post this graph again, produced by the East Anglia Climatic Research Unit (Phil Jones is the director):
[Image: nhshgl.gif]



The bottom chart is the global average. Notice that in 1998 there is a huge red spike. Also notice that at around 2000 there is no upward trend, and in fact the temperature average is about what it has been since just before the 1998 spike. According to several recent papers by AGW proponents, this is because natural oscillations are "hiding" a warming trend we would see otherwise. In other words, were it not for even more extreme natural forcings, we would continue to see an upward trend from anthropogenic forcings. The length of this flat trend is roughly half of the period of warming humans are believed to be responsible for, ~1975 to ~1995.




1998 also just happens to be during the peak of an El Niño, which I can remember at the time: those who said it was just natural forces at work claimed El Niño was causing the heat waves and Arctic ice melting, and everything would return to normal afterwards. So what is the excuse for years like 2010, which recorded high global average temperatures during a La Niña Pacific, along with low solar activity?



[Image: UAH_LT_1979_thru_Nov_10.gif]

Figure 1: University of Alabama, Huntsville (UAH) temperature chart from January 1979 to November 2010. This chart is shown with no trend lines so the viewer may make his own judgment.


If we just compared that brief peak in 1998 with any other reading, we could make a lame argument for cooling...since every other reading would be lower. But that 1998 high wasn't sustained; as the graph shows, using a 13-month running mean, the recent warming trend indicates an increasing amount of sustained heating in the atmosphere.

But, I recall you've raised objections before about using such short ranges for determining trends, so when it comes to air temperatures, averaging decades would be more accurate than using annual numbers, because of the effects of solar activity, volcanoes, El Ninos/La Ninas etc.
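
For what it's worth, the averaging you describe is easy to make concrete. Here's a minimal sketch (the numbers are invented, not any real record) of collapsing annual anomalies into decadal means so short-lived forcings largely cancel:

```python
# Minimal sketch with invented numbers: average annual anomalies into
# decadal means so short-lived forcings (ENSO, volcanoes, the ~11 year
# solar cycle) largely cancel out.
import numpy as np

years = np.arange(1970, 2010)
rng = np.random.default_rng(0)
anoms = 0.015 * (years - 1970) + rng.normal(0.0, 0.10, years.size)

for d in range(1970, 2010, 10):
    mask = (years >= d) & (years < d + 10)
    print(f"{d}s mean anomaly: {anoms[mask].mean():+.3f} deg C")
```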


That's true, but the question is why. If you look back at the graph, you will see the greater increase in the Northern Hemisphere compared to the Southern. However, while the ~15 year flat trend isn't there, there is still no warming trend since ~2000. The temperatures continue to rise for several years, and then drop again.
An obvious reason would be that the Northern Hemisphere has the bulk of the human population...and, more pertinent, the industrial nations, which have been burning coal and oil for the past 150 years. Also, something close to 90% of the world's ice lies in the Southern Hemisphere. The East Antarctic Ice Sheet alone contains something like 80% of the world's ice. So it seems obvious that the Antarctic will be the last place to melt. The ice has acted as a restraining influence on temperature rise. First, in the oceans: a large quantity of sea ice prevents temperatures from increasing with any significance until the energy absorbed by the oceans starts to melt the ice.

The rapid loss of sea ice in the Arctic is causing the Arctic Ocean to heat up at an increasing rate, and is beginning to affect our weather down here, as we've noticed with increasingly bizarre winter weather patterns caused by the disruption of the normal patterns of the Arctic Oscillation and North Atlantic Oscillation. The resulting chaos up there is starting to make its presence known in the lower latitudes in recent years.

But the key point is that the Greenland ice sheet and the Arctic ice cap have been the temperature regulators for the Northern Hemisphere. With the elimination of the sea ice within the next few decades....and most of the Greenland ice a short time afterwards, the Northern Hemisphere will have radical fluctuations in temperature and precipitation that have never been observed in human history. And, of course, that will also speed up the warming process down here as well, with the loss of that ice.

The problem is we don't have good long-term data for things like sea ice and even ocean temperatures. Nor is it an easy task. Also, there are several studies which find an increase in Arctic sea ice in several places during the ~30 year period of AGW warming (see, e.g., here, where the researchers found a steady increase during the entire AGW warming period), while others find a decrease in other areas during the same time.
Your link's abstract indicates that they are only observing one region (Baffin Island and Davis Inlet), and their readings could be skewed by sudden glacial melts, which can dump a lot of land-based ice into the water in a relatively short period of time. That would give a false appearance that the ice is increasing.

According to measurements of Arctic Ice minimum volume by the Polar Science Center --the trend is going in one direction, downward:
BPIOMASIceVolumeAnomalyCurrentV2.png

Determining ice volume is especially important for several reasons: for one, thin ice does not support many Arctic fauna, such as seals, walruses and polar bears; and two, thin ice is going to melt even faster.

That's certainly a possibility. But I'm not convinced that CO2 levels haven't been higher in the past few thousand years. Same with temperatures. Of course, if the climate system can't adapt to continuing increases in CO2, or cannot sufficiently adapt, and the connection between CO2 and climate systems (e.g., the feedbacks) posited by AGW theory is correct, then it doesn't matter whether the current atmospheric content was matched or exceeded at some point a few thousand years ago, as long as we continue to push it up. Eventually, it will be higher, and the result will be dangerous to catastrophic (again, provided there is a positive feedback parameter and other central tenets of AGW theory are also accurate).
Well, we've been over this a number of times already, so I'll just say that I'll go by the Tripati paleoclimate measurements until someone finds a problem with them, or finds something more accurate that does show higher CO2 levels in the recent past. The overall trend seems clear: CO2 was generally at very low levels, and ice ages were the typical climate condition, with what we have now being the interglacial anomalies, not what is supposed to be the norm.
 

LegionOnomaMoi

Veteran Member
Premium Member
UAH_LT_1979_thru_Nov_10.gif

Figure 1: University of Alabama, Huntsville (UAH) temperature chart from January 1979 to November 2010. This chart is shown with no trend lines so the viewer may make his own judgment.


Here's the latest graph, with both a regression line (a straight line which best represents the trend) at 0 and a nonlinear best fit line.
UAH_LT_1979_thru_January_2012.png

If we just compared that brief peak in 1998 with any other reading, we could make a lame argument for cooling...since every other reading would be lower. But that 1998 high wasn't sustained; as the graph shows, using a 13-month running mean, the recent warming trend indicates an increasing amount of sustained heating in the atmosphere.
If we look at your graph (which, by the way, Spencer produced), then we see the "running 13 month mean", as you say. That's what we can then use as a basis of comparison for the later years. At about 2000, the effect of the 1998 spike "wears off" and the temperature begins to climb back up. It hits a high in about 2002, and then remains pretty constant until about 2007, when it begins to drop steeply. It then picks back up after about a year, and continues to rise, but never reaches much beyond the 13-month average (that is, the amount of increase after the ~2007-2008 drop is equal to that decrease).


If we include the new data points, the temperatures drop again. Again, overall, there's no warming trend for the last ~15 years.
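
Since the "13 month running mean" keeps coming up, here is a minimal sketch of that smoothing; the monthly values are placeholders, not UAH data:

```python
# Minimal sketch of a centered 13-month running mean: each point
# becomes the average of itself and the six months on either side.
# The monthly anomalies below are random placeholders, not UAH data.
import numpy as np

rng = np.random.default_rng(1)
monthly = rng.normal(0.2, 0.15, 120)       # 10 years of fake anomalies

window = 13
kernel = np.ones(window) / window
smoothed = np.convolve(monthly, kernel, mode="valid")  # loses 6 at each end
print(len(monthly), "monthly points ->", len(smoothed), "smoothed points")
```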

But, I recall you've raised objections before about using such short ranges for determining trends, so when it comes to air temperatures, averaging decades would be more accurate than using annual numbers, because of the effects of solar activity, volcanoes, El Ninos/La Ninas etc.

That's the problem though. The entire period of observed warming is ~30 years. This is half that. The cooling before it was ~30 years. And the warming before that was ~30 years. It seems as if every time there has been a cooling trend since the effects of the industrial revolution began to take place, climate scientists have had no problem coming up with a reason why this or that natural event caused the cooling. Same for the early warming trend from 1910-1940, which is too early to be attributed to humans (CO2 was too low). This doesn't make the explanations wrong, but it's a bit odd that natural cycles are always around to explain trends we didn't expect or which don't fit well with the theory, while natural causes posited for the warming period attributed to human emissions are quickly rejected by the majority of the climate science community.

An obvious reason would be that the Northern Hemisphere has the bulk of the human population
But obvious reasons are also often just too easy. For one thing, AGW theory predicts that the surface warming should be less than that of the lower troposphere. We know that increases in human population (more buildings, land cultivation, and other surface processes) increase surface temperatures in ways completely unrelated to global warming. The more buildings and cities you have, the greater the bias in the surface record will be. What we are really interested in is what's happening in the lower atmosphere, but most of our records are from the ground/sea. Also, even if we stick to surface records in the Northern Hemisphere (see the HadCRU graph from my last post), we still don't see a warming trend. The temperature continues to go up a bit after 1998, and then drops a bit.



Your link's abstract indicates that they are only observing one region (Baffin Island and Davis Inlet), and their readings could be skewed by sudden glacial melts, which can dump a lot of land-based ice into the water in a relatively short period of time. That would give a false appearance that the ice is increasing.

Most of the studies which show melting also concern observations of certain areas.
And take a closer look at your graph:

BPIOMASIceVolumeAnomalyCurrentV2.png


This graph is a bit deceptive, or rather it doesn't quite tell the whole story. Notice the "0" point from which all anomalies are measured. It starts at 5,000 km^3 above this zero point. So far, so good. But how is this sea ice measured? Well, they used the information from NSIDC. Except the NSIDC specifically states that their data "isn't suitable for time-series, anomalies, or trend analyses." However, it is good for one thing (which the Polar Science Center uses it for): sea-ice concentration (where there is more or less ice). This data is then plugged into a model which incorporates individual measurements from various places into what (we hope) is an accurate measure of the Arctic sea ice as a whole. Unfortunately, however, as the Polar Science Center states, sea-ice volume can't be continuously measured. That's actually a big understatement: the sheer number of sea-ice locations is incredible. That's why it takes individual studies of certain areas. The way to get a chart like the one you have above is to plug these disparate readings into your model and hope it's accurate. The only other thing to do is just go by individual locations, which doesn't help much either.
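
To illustrate what an anomaly chart like this one is actually showing, here's a toy sketch (all numbers invented, units arbitrary) of subtracting a mean seasonal cycle from a volume series so that only the departures remain:

```python
# Toy sketch, all numbers invented: build an "anomaly" series by
# subtracting each month's long-term mean (the climatology) from the
# raw values, so the chart shows departures, not absolute volume.
import numpy as np

rng = np.random.default_rng(2)
n = 360                                            # 30 years of months
seasonal = 8 * np.cos(2 * np.pi * np.arange(n) / 12)
volume = 20 + seasonal - 0.01 * np.arange(n) + rng.normal(0, 1, n)

months = np.arange(n) % 12
climatology = np.array([volume[months == m].mean() for m in range(12)])
anomaly = volume - climatology[months]             # departure from baseline
print("first-year vs last-year mean anomaly:",
      round(anomaly[:12].mean(), 2), round(anomaly[-12:].mean(), 2))
```

The point being: where the "0" sits depends entirely on the baseline period chosen for the climatology.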

Then there's the issue (again) of why they are melting. The obvious answer is temperature increase. So why the steady increase in certain areas for 50 years or so? Who knows. And how much of the temperature increase is caused by human emissions? The generally accepted theory is, of course, most of it. But again, we haven't seen a warming trend even in the Northern Hemisphere for ~10 years, going by surface data (the most likely to be biased upwards).

Well, we've been over this a number of times already, so I'll just say that I'll go by the Tripati paleoclimate measurements until someone finds a problem with them,
Which Tripati (et al., I'm assuming) study? And why them? There are multiple paleoclimate data sets. Also, the bigger issue is whether or not CO2 correlates well enough with past temperatures, not the actual CO2 levels. There are plenty of studies which indicate that this is not the case, and even that (I believe I linked to one recent study) solar activity correlates with past records much better than CO2.


or finds something more accurate that does show higher CO2 levels in the recent past. The overall trend seems clear: CO2 was generally at very low levels, and ice ages were the typical climate condition, with what we have now being the interglacial anomalies, not what is supposed to be the norm.
 

work in progress

Well-Known Member

Here's the latest graph, with both a regression line (a straight line which best represents the trend) at 0 and a nonlinear best fit line.
UAH_LT_1979_thru_January_2012.png


If we look at your graph (which, by the way, Spencer produced), then we see the "running 13 month mean", as you say. That's what we can then use as a basis of comparison for the later years. At about 2000, the effect of the 1998 spike "wears off" and the temperature begins to climb back up. It hits a high in about 2002, and then remains pretty constant until about 2007, when it begins to drop steeply. It then picks back up after about a year, and continues to rise, but never reaches much beyond the 13-month average (that is, the amount of increase after the ~2007-2008 drop is equal to that decrease).


If we include the new data points, the temperatures drop again. Again, overall, there's no warming trend for the last ~15 years.
Your trend line doesn't exactly show a steep drop! And I am aware that the data comes from UAH, but the 30 year trend line produced here shows an upward trend, for what it's worth:
UAH-C1-screenshot.jpg

Figure 3: Trend lines showing the sudden jump in temperatures from the 1995 La Niña (green lines) to the 1998 El Niño (pink lines). Brown line indicates the overall increase in temperatures.
-----------------------------------
But this is just one method of measuring temperatures; what about ocean temps, sea level rise, the retreat of Arctic sea ice and the Greenland ice sheet? There are a number of methods for measuring global temperatures, and as mentioned previously, over 90% of the added heat has gone into the oceans. So why not look at what's happening to the oceans, since increased ocean temperatures eventually make an impact on our weather:
heat_content700m2000myr.png



That's the problem though. The entire period of observed warming is ~30 years. This is half that. The cooling before is ~30 years. And the warming before that was ~30 years. It seems as if every time there is a cooling trend since the effects of the industrial revolution began to take place, climate scientists have no problem coming up with a reason why this or that natural event caused the cooling. Same for the early warming trend from 1910-1940, which is too early to be attributed to humans (co2 was too low). This doesn't make the explanations wrong, but it's a bit odd that natural cycles are always around to explain trends we didn't expect or which don't fit well with the theory, but natural causes which are posited for the warming period attributed to human emissions are quickly rejected by the majority of the climate science community.
If it were just natural trends, with no effect from adding to greenhouse gas levels, we should be in a significant cooling trend right now, because of a prolonged La Nina in the Pacific and a long period of low solar activity....isn't this cited by skeptics as the cause of the "mini ice age" that hit Europe a few centuries back? But that is not what is happening, because natural influences are not the whole story: Extreme weather 2011: warmest La Nina year on record, around 10th warmest overall

But obvious reasons are also often just too easy. For one thing, AGW theory predicts that the surface warming should be less than that of the lower troposphere. We know that increases in human population (more buildings, land cultivation, and other surface processes) increase surface temperatures in ways completely unrelated to global warming. The more buildings and cities you have, the greater the bias in the surface record will be. What we are really interested in is what's happening in the lower atmosphere, but most of our records are from the ground/sea. Also, even if we stick to surface records in the Northern Hemisphere (see the HadCRU graph from my last post), we still don't see a warming trend. The temperature continues to go up a bit after 1998, and then drops a bit.
I'm not up on what AGW theory is or is not supposed to predict....whose theory are we talking about, anyway? I'm aware that increased greenhouse gas levels should trap heat in the lower atmosphere and lead to cooling in the stratosphere....which would be the opposite if the Sun were causing global warming....but I'm not familiar with this surface temps vs. troposphere temperatures issue, or why there should be a difference. If there is, then it's time to change the model.

Most of the studies which show melting also concern observations of certain areas.
And take a closer look at your graph:



This graph is a bit deceptive, or rather it doesn't quite tell the whole story. Notice the "0" point from which all anomalies are measured. It starts at 5,000 km^3 above this zero point. So far, so good. But how is this sea ice measured? Well, they used the information from NSIDC. Except the NSIDC specifically states that their data "isn't suitable for time-series, anomalies, or trend analyses." However, it is good for one thing (which the Polar Science Center uses it for): sea-ice concentration (where there is more or less ice). This data is then plugged into a model which incorporates individual measurements from various places into what (we hope) is an accurate measure of the Arctic sea ice as a whole. Unfortunately, however, as the Polar Science Center states, sea-ice volume can't be continuously measured. That's actually a big understatement: the sheer number of sea-ice locations is incredible.
No doubt measuring ice volume is more difficult than sea ice extent, but it is a better indicator of how fast the ice is melting. But, I would say the best indicator of the extent of Arctic sea ice loss is provided by shipping companies drawing up plans for how they are going to exploit newly opened waterways in the Arctic.
Then there's the issue (again) of why they are melting. The obvious answer is temperature increase. So why the steady increase in certain areas for 50 years or so? Who knows. And how much of the temperature increase is caused by human emissions? The generally accepted theory is, of course, most of it. But again, we haven't seen a warming trend even in the Northern Hemisphere for ~10 years, going by surface data (the most likely to be biased upwards).
Well, wouldn't an obvious reason be that the cause of sea ice melt has more to do with what's going on underneath the ice than what's happening above? Specifically, if the ocean currents coming from the south are increasing in temperature, this will break up the leading edge of the ice pack quicker during the summers, and delay freeze-up in the winter. Water temperatures aren't as susceptible to seasonal changes, so the long-term trend is easily 50 years, if not longer.

Which Tripati (et al., I'm assuming) study? And why them?
This one, from 2009 -- which I have posted several times in the past: Aradhna Tripati of UCLA - Last time carbon dioxide levels were this high: 15 million years ago, scientists report
"We are able, for the first time, to accurately reproduce the ice-core record for the last 800,000 years — the record of atmospheric C02 based on measurements of carbon dioxide in gas bubbles in ice," Tripati said. "This suggests that the technique we are using is valid.

"We then applied this technique to study the history of carbon dioxide from 800,000 years ago to 20 million years ago," she said. "We report evidence for a very close coupling between carbon dioxide levels and climate. When there is evidence for the growth of a large ice sheet on Antarctica or on Greenland or the growth of sea ice in the Arctic Ocean, we see evidence for a dramatic change in carbon dioxide levels over the last 20 million years.
--------------------------------------------------------------------------------------

Tripati's new chemical technique has an average uncertainty rate of only 14 parts per million.

"We can now have confidence in making statements about how carbon dioxide has varied throughout history," Tripati said.

In the last 20 million years, key features of the climate record include the sudden appearance of ice on Antarctica about 14 million years ago and a rise in sea level of approximately 75 to 120 feet.

"We have shown that this dramatic rise in sea level is associated with an increase in carbon dioxide levels of about 100 parts per million, a huge change," Tripati said. "This record is the first evidence that carbon dioxide may be linked with environmental changes, such as changes in the terrestrial ecosystem, distribution of ice, sea level and monsoon intensity."

Today, the Arctic Ocean is covered with frozen ice all year long, an ice cap that has been there for about 14 million years.

"Prior to that, there was no permanent sea ice cap in the Arctic," Tripati said.
 

LegionOnomaMoi

Veteran Member
Premium Member
Your trend line doesn't exactly show a steep drop!
There is no drop. The point is that there is no rise. The trend of the past 10+ years has been flat. Temperatures were high, but didn't increase. That doesn't mean they decreased, though. Let me be clear about what I'm saying about our direct temperature records. For the total surface records, we see a rise from ~1910 until ~1940. Then a drop and a period of low temperatures until ~1977. Then we see a warming from then until about ~1998. After that, the global temperature remains pretty much the same. Some years are a bit hotter, some a bit colder, but there is no warming trend and there is no cooling trend either.

And I am aware that the data comes from UAH, but the 30 year trend line produced here shows an upward trend, for what it's worth:
Your graph is a bit outdated; the one I provided was the most recent. And both graphs show most of the observed warming attributed to humans (the satellite data only go back to 1979, but the warming trend started a bit earlier). But there's a problem with the following:

Figure 3: Trend lines showing the sudden jump in temperatures from the 1995 La Niña (green lines) to the 1998 El Niño (pink lines). Brown line indicates the overall increase in temperatures.

This trend line is quite misleading, and not just because the graph is a bit dated. I'm not sure if you've taken an intro statistics course before, but just in case you haven't or don't remember it, that "brown line" is a typical regression line. So what does that mean?

Given any series of points on a coordinate plane, I can create a regression or "best fit" line. Take your graph: The Y-axis is temperature and the X-axis is years. So every point (x,y) on the graph corresponds to a particular year and a particular temperature represented by a single dot. What a regression line does is find (using something called a least squares approach) a function which generates a line so that for every x value (every year), the line will give us the best approximate y-value. In other words, it's a function f(x)=mx+b (which is the equation for a straight line) which takes ALL the data points as a whole and finds a single, straight line which will best approximate every y-value given any x-value.

So why does this matter? Well, for one thing, I can always generate a line, even when the scores show no trend whatsoever. However, the problem here (that is, the problem with that brown line) is that because it is linear (a straight line), it skews the real picture. If you look at the more recent satellite graph that I posted, you'll notice a wavy line instead of a straight one. The reason for this is that the trend isn't linear. It doesn't consistently go up or down. However, a linear regression line, like that brown line, CAN'T show this.

To summarize (and perhaps simplify): let's say (as is true) that the temperatures went up from 1979-1998, but then did not increase on average (i.e., no trend) through the point where your graph's brown line ends somewhere in 2010. That would mean the record showed almost 20 years of warming, followed by ~12 years with no warming or cooling trend. The problem with using a regression line here is that it can't curve. The warming trend was twice as long as the period without a trend, so a regression line will have to show a steady increase.

That's why it's misleading to use it. It gives the impression of a steady rise from 1979 to somewhere in 2010, but that's inaccurate. In reality (as you can see from the nonlinear regression line in my graph), the temperature trend after 1998 isn't there. There's a slight rise and a slight decline, but there is no real warming trend from ~1998 onwards.
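
To make the point concrete, here's a toy demonstration with synthetic numbers: a series that rises for ~20 years and then goes flat for ~12. A single least-squares line through the whole record still slopes upward, even though the final stretch has no trend at all:

```python
# Toy illustration with synthetic values: ~20 years of warming, then
# ~12 flat years. A single least-squares line fit to the whole record
# still slopes upward, masking the plateau.
import numpy as np

years = np.arange(1979, 2011).astype(float)
temps = np.where(years <= 1998,
                 0.02 * (years - 1979),    # steady rise to 1998
                 0.02 * (1998 - 1979))     # then a plateau

m_full = np.polyfit(years, temps, 1)[0]
m_late = np.polyfit(years[years > 1998], temps[years > 1998], 1)[0]

print(f"full-record slope: {m_full*10:+.3f} deg C/decade")  # clearly positive
print(f"post-1998 slope:   {m_late*10:+.3f} deg C/decade")  # essentially zero
```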

But, this is just one method of measuring temperatures; what about ocean temps,

The problem with other measurements is usually two-fold. First, all other measurement data are local. That is, they only measure the temperature around the instrument. So unless we put measuring devices on every square foot of the globe, averaging these sets isn't a simple matter: they aren't evenly distributed, they are subject to all kinds of unrelated local temperature changes, etc. The first great thing about satellites is that, while they (like all other instruments) can malfunction, they have excellent spatial resolution. They are the only devices capable of non-local global measurement (i.e., measurements which reflect not the temperature around the device but the temperature of the atmosphere).

The other big advantage is that AGW theory involves atmospheric warming (specifically of the lower atmosphere). In other words, the surface should warm less than the lower troposphere, where the "radiative blanket" of ever-thickening CO2 emits infrared energy. In fact, a big problem is that while the satellite data and the surface temperature data are pretty well correlated, they shouldn't be. The satellite data should show more warming (according to model predictions), but it doesn't (despite numerous corrections to the data set which have all pushed the trend up).
 

LegionOnomaMoi

Veteran Member
Premium Member
continued from previous post
sea level rise, the retreat of Arctic sea ice and the Greenland ice sheet?
These things are even more problematic. Not only are our measurements again limited to local readings (hence the several studies showing a steady increase in ice in this or that location), but the measurements themselves are much more difficult. At least with temperature data, all you need for a local reading is a good thermometer. But there's a lot of sea ice constantly shifting in location, and in order to measure actual volume decrease (or increase) you have to do tests on site. Finally, sea ice is affected by everything from temperature to quakes. So sea-ice measurements don't correlate that well with temperatures.

For example, here's a recent (2009) study on the Greenland ice sheet by Wake et al. (Annals of Glaciology), going back to 1866, which finds that 1) the loss of ice isn't exceptional and 2) it doesn't seem to have much to do with the warming:

Surface mass-balance changes of the Greenland ice sheet since 1866


So why not look at what's happening to the oceans, since increased temperatures eventually make an impact with what's happening to our weather:
In terms of temperature, there are a few reasons:

1) Our data sets are terrible. ARGO is a terrific improvement, but it's new. Before that, we simply have very spotty measurements.
2) This is especially problematic considering that the oceans don't reflect short-term variations because of their slow response (global circulation in the atmosphere occurs on a ~1 year time-scale, while the ocean's is almost a thousand times longer). Ocean variation caused by climate changes can take anywhere from several decades to centuries.
3) Finally, even though what we know of ocean temperatures seems to indicate that, for example, natural forcings (such as solar radiative forcing) can't be behind the ocean temperature increases, there are studies showing otherwise. Interestingly enough, they use the same logic which is the basis for AGW theory: some mechanism is magnifying the effect. For example, here's a paper from 2008 (Journal of Geophysical Research) which attempts to show that such a mechanism does exist (although what it is remains only hypothesized: GCRs):

Using the oceans as a calorimeter to quantify the solar radiative forcing

Personally, the above research seems far-fetched to me.



If it was just natural trends, and there was no effects of adding to greenhouse gas levels, we should be in a significant cooling trend right now

Well that's just it. What "should" we be seeing? So far, all of our predictions have been off. To me, that's the greatest significance of the current trend (which again, is not a cooling trend, it just isn't a warming trend either). Not that temperatures didn't rise, but that we didn't predict they wouldn't. We're counting on our knowledge of the climate to be accurate enough to predict what will happen in 2100, but we've had to explain the recent trend in hindsight. And again, it seems strange that every trend which doesn't fit AGW theory has a ready explanation as to what natural forcings caused it, but the same theories which posit that some, much, or most of the AGW period was actually natural are dismissed. Science is supposed to be about proving yourself wrong, not finding evidence to fit your theory. It seems as if so much research is conducted to make the data fit the theory, rather than really understand the climate. There's no doubt in my mind that humans are causing problems (not just with co2) for the global environment which need to be addressed. I'm just afraid that the current approach will result in missing important aspects of the climate which could then result in "solutions" that don't work.

I'm not up on what AGW theory is or is not supposed to predict....whose theory are we talking about, anyway? I'm aware that increased greenhouse gas levels should trap heat in the lower atmosphere and lead to cooling in the stratosphere....which would be the opposite if the Sun were causing global warming....but I'm not familiar with this surface temps vs. troposphere temperatures issue, or why there should be a difference. If there is, then it's time to change the model.
First, about the sun comment: it's not actually true. Again, I haven't come across any recent research that I can remember which suggests that the sun's infrared energy itself was the main driver of climate during the AGW period. Rather, the hypotheses concern an indirect effect (much like mainstream AGW and CO2): changes in the "protective magnetic shield" provided by the sun cause an increase in particles which change global cloud cover. Second, the sun's heat doesn't simply hit us and then gradually dissipate. The basis of AGW theory is the capacity of the lower troposphere to trap and then emit the sun's heat. GHGs aren't heating up the planet because they are hot themselves. According to AGW theory, the increase in GHG concentration (of which CO2 is both a part and a cause, through feedbacks) traps a greater amount of the sun's heat. But if the atmospheric GHG content were held steady and the sun's "heat" hitting the earth increased, we'd still see it in the same place: where GHGs are concentrated and can "trap" it.
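
The "trapping" bookkeeping can be made concrete with the standard one-layer grey-atmosphere toy model from textbooks. This is only an illustration (the emissivity values are rough assumptions, and it is nothing like a real climate model), but it shows why both a thicker "blanket" and a stronger sun warm the surface through the same mechanism:

```python
# Textbook one-layer "grey atmosphere" energy balance -- an
# illustration only, nothing like a real climate model. epsilon is
# the atmosphere's infrared emissivity, a crude stand-in for total
# greenhouse-gas content; the values used here are rough assumptions.
SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
ALBEDO = 0.30        # planetary albedo

def surface_temp(solar_const, epsilon):
    absorbed = solar_const * (1 - ALBEDO) / 4      # mean absorbed sunlight
    # Surface balance: sigma * T^4 * (1 - epsilon/2) = absorbed
    return (absorbed / (SIGMA * (1 - epsilon / 2))) ** 0.25

print(surface_temp(1361, 0.78))   # ~288 K, roughly today's surface
print(surface_temp(1361, 0.80))   # thicker "blanket" -> warmer surface
print(surface_temp(1375, 0.78))   # stronger sun -> warmer surface too
```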

Finally, about the surfaces. Again, it's well known that lots of things can affect the temperature at the surface level. For example, just look at the temperature record of a highly populated city compared with a sparsely populated and largely uncultivated town/village a few miles away. You'll see a tremendous difference. Concrete buildings, changes in vegetation from farming, and other surface processes affect local temperatures but not global temperatures (they aren't due to climate change). However, 1) most of our surface data comes from places which are heavily affected by these changes, and 2) the basis of AGW theory is an increase of lower-atmospheric heat, which then "trickles down" to the surface. In other words, the AGW models predict greater warming in the lower troposphere, where the satellites give us temperature data, not at the surface. Yet the surface data shows either equal or greater warming than the satellite data, which is counter to AGW theory.
 

shawn001

Well-Known Member
Satellites have their own issues in making measurements, which are not being discussed here.


Grant Foster¹ and Stefan Rahmstorf²

¹ Tempo Analytics, 303 Campbell Road, Garland, ME 04939, USA
² Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam, Germany

Received 27 September 2011
Accepted for publication 16 November 2011
Published 6 December 2011
Online at stacks.iop.org/ERL/6/044022

Abstract
We analyze five prominent time series of global temperature (over land and ocean) for their common time interval since 1979: three surface temperature records (from NASA/GISS, NOAA/NCDC and HadCRU) and two lower-troposphere (LT) temperature records based on satellite microwave sensors (from RSS and UAH). All five series show consistent global warming trends ranging from 0.014 to 0.018 K yr⁻¹. When the data are adjusted to remove the estimated impact of known factors on short-term temperature variations (El Niño/southern oscillation, volcanic aerosols and solar variability), the global warming signal becomes even more evident as noise is reduced. Lower-troposphere temperature responds more strongly to El Niño/southern oscillation and to volcanic forcing than surface temperature data. The adjusted data show warming at very similar rates to the unadjusted data, with smaller probable errors, and the warming rate is steady over the whole time interval. In all adjusted series, the two hottest years are 2009 and 2010.


global_adjusted_temp.png




http://iopscience.iop.org/1748-9326/6/4/044022/pdf/1748-9326_6_4_044022.pdf
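
For anyone curious what "adjusted to remove the estimated impact of known factors" means in practice, here is a minimal sketch of that kind of multiple regression. The index series are random placeholders, not the actual ENSO, aerosol or solar data Foster and Rahmstorf use:

```python
# Minimal sketch of a Foster & Rahmstorf-style adjustment: regress
# temperature on ENSO, volcanic and solar indices plus a linear trend,
# then subtract the fitted natural terms. The indices here are random
# placeholders, NOT the real data the paper uses.
import numpy as np

rng = np.random.default_rng(3)
n = 384                                    # monthly points, ~32 years
t = np.arange(n) / 12.0                    # time in years
enso, volc, solar = rng.normal(size=(3, n))
temp = (0.016 * t + 0.10 * enso - 0.05 * volc + 0.02 * solar
        + rng.normal(0, 0.08, n))          # synthetic "observed" series

X = np.column_stack([np.ones(n), t, enso, volc, solar])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)

adjusted = temp - X[:, 2:] @ beta[2:]      # strip the natural components
print(f"recovered trend: {beta[1]:.4f} K/yr")  # close to the 0.016 put in
```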


Work in progress, I think you're on the right path with the oceans heating up and becoming more acidic from absorbing more carbon, which creates carbonic acid.

Carbon release to atmosphere 10 times faster than in the past

"Rather than the 20,000 years of the PETM which is long enough for ecological systems to adapt, carbon is now being released into the atmosphere at a rate 10 times faster," said Kump. "It is possible that this is faster than ecosystems can adapt."

Astrobiology Magazine



If the oceans weren't sucking up the CO2, how much CO2 would be in the atmosphere now?


Ocean Acidification: Some Organisms Already Experiencing Ocean Acidification Levels Not Predicted to Be Reached Until 2100
ScienceDaily (Dec. 22, 2011) — A group of 19 scientists from five research organizations have conducted the broadest field study of ocean acidification to date using sensors developed at Scripps Institution of Oceanography, UC San Diego.



Global Land and Ocean Temperature Index

Climate Change: Key Indicators

Also Work in Progress

Move the slider on the interactive

The time series at right shows the progression of changing global surface temperatures from 1884 to 2010. Dark blue indicates areas cooler than average. Dark red indicates areas warmer than average.

Climate Change: Key Indicators


The global temperature is up 1.5 degrees F since 1880.
 