
# Global warming in the news


51.
edited February 2013

Jim, make sure "Markdown+Itex" is checked and use this format:

~~~~
[text](URL)
~~~~

Example (click Source to see what I did):

[link to image](http://greenbusinesswatch.org/blog/wp-content/uploads/2012/12/how-skeptics-view-global-warming.gif)

And see the Help link next to "Markdown+Itex" for further instructions.
52.
edited February 2013

Thanks Graham. I've got Md+Itex selected and looked at Andrew's cheat sheet, but I didn't find any mention of images.

Corrected above at #50.
53.

So, seeing that picture, I can't help inventing more work (for myself). I'll bet there are papers and blogs with the same cherry-picking from the usual suspects for at least some of these "flat" periods. Normally I ignore these people, but this seems like a graphic argument worth making.
54.

Nad, it looks like the R code first applies a 12-month smoothing filter to the data before computing differences. That is, the value for each month is replaced by the average of that month, the preceding 5 months, and the following 6 months. Then the smoothed data are differenced (i.e. differences between successive smoothed months are computed). All the rest of the code in the `diff12()` function, including the rounding, is technical detail to convert between year-month date formats.
55.

Nathan wrote:

> It looks like the R code first applies a 12-month smoothing filter to the data before computing differences. That is, the value for each month is replaced by the average of that month, the preceding 5 months, and the following 6 months. Then the smoothed data are differenced (i.e. differences between successive smoothed months are computed). All the rest of the code in the diff12() function, including the rounding, are technical details to convert between year-month date formats.

Nathan, I placed the answer to your comment in the [thread for the discussion about GHG concentrations and temperature](http://forum.azimuthproject.org/discussion/1178/temperature-vs-ghg-concentrations/).
56.
edited March 2013

John wrote:

> Oh! That was the problem: I’d sent an invitation to david.tanzer@gmail.com! Now I’ve sent one to you… it says the invitation was sent by email, so you should look for it… but I see something on G+ that says you’ve been invited, and it shows your picture, so that’s reassuring.

Cool. I accepted the invite, and now can run as Azimuth.

The very concrete is again taking most of my time. The contractor purchased a replacement joist, but it has... knots! The engineer had said it must be knot-free. Lots of back and forth, with photos, questions, and irritation. Now the engineer is saying that knots are not critical if they are in the "compression zone" of the beam, rather than the "tension zone." From what I could gather, the compression zone is the top half of the beam, and the tension zone is the bottom half. I saw an illustration of a curved beam under load, which showed, naturally, the top curve as being shortened, and the bottom curve as being lengthened. Then I tried to imagine the response of a knot under these different stress conditions. Well, my understanding is very vague about this. Can the stress tensor field help us to understand it? (What about a psychological stress tensor field?) The email chain feels like a trial, with evidence, theories, and diplomacy.

Well the good news is that today, after I pointed out inconsistencies in the testimony of the engineer re: knots, he has clearly recommended that we abandon the use of natural wood, and instead use an engineered wood product called LVL. (This is for one short "header beam," in front of a chimney, which supports two full-length joists that cannot be supported by the brick wall due to the presence of a chimney.)

Theoretical musing: Is there a massive differential equation, for the stresses at all points in the structure of a house, whose solutions would show the breakdown points? (Does the "stress-strain" equation do the job?)

I have been working on the next blog article for Petri Net programming (it's not on the Azimuth site yet), which I'll post when things settle down. For Google+, it will take me some time to clear my mind before I can get into a real-time discussion mode.

57.
edited March 2013

nad wrote:

> Sorry, I have no idea why the comment is invalid XML.

You included a lot of R code, which is full of > and < symbols, which play a fundamental role in HTML. I partially fixed your comment by wrapping the code with the Markdown symbol for 'code', namely a left quotation mark. However, now the line breaks are gone, so the code is hard to read. I haven't posted comments containing programs here on the Forum, so I don't know the best solution.
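The failure mode John describes (raw `<` and `>` being read as the start of markup) is why the comment came out as invalid XML. A small Python illustration of the effect (just a demonstration of the idea; the forum software itself does not use this code):

```python
import html

# A line of R code full of angle brackets, like the ones in nad's comment.
code = "if (!is.null(wfl)) x <- filter(x, rep(1, wfl)/wfl)"

# Unescaped, "<-" looks like the start of an HTML tag. Escaping replaces
# the angle brackets with character entities so the markup stays valid.
escaped = html.escape(code)
print(escaped)  # if (!is.null(wfl)) x &lt;- filter(x, rep(1, wfl)/wfl)
```

Wrapping code in a code span or fence makes the forum do this escaping for you, which is why Graham's four-tilde trick below works.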
58.
edited March 2013

Nad wrote:

> Sorry but this explanation is not enough for me: “the mean of a quantity for 12 months, minus the mean for the preceding 12 month period”

I imagine something like this. We take a function of time $f(t)$ and average it over the preceding 12 months, for example

$$F(t) = \frac{1}{12} \int_{t-12}^{t} f(s) \, d s$$

Then we compute

$$F(t) - F(t-12)$$

Of course in doing this with measured data the integrals need to be approximated somehow; I don't know how they did that.
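For monthly data the integral reduces to a 12-term mean, so (with months indexed by integers $t$, which is my notational assumption, not something from the article) the average becomes

$$F(t) \approx \frac{1}{12} \sum_{k=0}^{11} f(t-k)$$

and $F(t) - F(t-12)$ is then just the difference of two such 12-month means.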

59.

I mentioned in [the other thread](http://forum.azimuthproject.org/discussion/1178/temperature-vs-ghg-concentrations/) what was done in the R code. They do a 12-month average, centered on the current month (not backward as in your integral), and then take straight differences between each filtered month and the corresponding month in the following year, as in your formula above. The integrals are just sums, since it's discrete data.
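The procedure described here, a centered 12-month moving average followed by differences against the corresponding month one year later, can be sketched in Python (a minimal illustration of the idea only; the function name, edge handling, and defaults are my assumptions, not the actual R code):

```python
def smooth_then_diff(x, window=12, lag=12):
    # Centered moving average: each month becomes the mean of itself, the
    # preceding window//2 - 1 months and the following window//2 months
    # (for window=12: 5 before and 6 after, as described earlier).
    left, right = window // 2 - 1, window // 2
    smoothed = [None] * len(x)
    for i in range(left, len(x) - right):
        win = x[i - left : i + right + 1]
        smoothed[i] = sum(win) / len(win)
    # Difference each smoothed month against the month `lag` steps earlier;
    # None marks months where the window ran off the ends of the record.
    out = []
    for i in range(lag, len(smoothed)):
        a, b = smoothed[i], smoothed[i - lag]
        out.append(a - b if a is not None and b is not None else None)
    return out
```

As a sanity check, applied to a linearly rising monthly series every defined output equals the rise accumulated over 12 months.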
60.
edited March 2013

> You included a lot of R code, which is full of > and < symbols which play a fundamental role in HTML. I partially fixed your comment by wrapping the code with the Markdown symbol for 'code', namely a left quotation mark.

Thanks John for trying to fix the HTML.

> I imagine something like this...

As you can see in the fixed post I had a similar first guess, but as Nathan said, it seems their averaging is different. That specific average is computed with the R built-in function `filter`, and it seems to be [done](http://www.realclimate.org/images//HSS2012.txt) in this line of the routine `diff12`:

~~~~
if (!is.null(wfl)) x <- filter(x, rep(1, wfl)/wfl)
~~~~
61.

Okay. It's good to clarify these issues, even though I wasn't very concerned with them, since they didn't seem to be the main point. I was more interested in how 1) slowly rising CO<sub>2</sub> causes slowly rising temperatures, but 2) short-term fluctuations in temperature come before short-term fluctuations in CO<sub>2</sub>. This is what the RealClimate article explained.
62.

The best way I know to include code is to put it between sets of 4 tildes. Click on Source to see what I did here:

~~~~
code
more code
~~~~

not code
63.

Thanks, Graham! I'll do that for Nad's comment. For a little bit of code backwards quotes are good, but for

~~~~
lots
of lines
of codes
~~~~

it's 4 tildes.
64.
edited May 2013

Added a section for general news articles, and included this recent item:

* May 11, 2013. [Heat-trapping gas passes milestone, raising fears](http://www.nytimes.com/2013/05/11/science/earth/carbon-dioxide-level-passes-long-feared-milestone.html), New York Times. The milestone is 400 ppm carbon dioxide. As of May 12, there are 700 comments on the article.

Also included this item,

* Nov 18, 2010. [GOP Rep. Bob Inglis Slams His Party On Climate Change (VIDEO)](http://www.huffingtonpost.com/2010/11/18/bob-inglis-climate-change-denial_n_785404.html), Huffington Post.

which I found through an old forum post by John.

Once again, I know that this is no representative selection from the world news, but it's what I have at hand. All contributions to make it well rounded are encouraged.
65.
edited March 2014

Added:

* Mar 29, 2014. As Seas Rise, Millions Cling to Borrowed Time and Dying Land, New York Times. "As the world's top scientists meet in Yokohama, Japan, this week, at the top of the agenda is the prediction that global sea levels could rise by as much as three feet by 2100. ... Climate scientists have concluded that widespread burning of fossil fuels is releasing heat-trapping gases that are warming the planet." Online version: [Borrowed Time on Disappearing Land](http://www.nytimes.com/2014/03/29/world/asia/facing-rising-seas-bangladesh-confronts-the-consequences-of-climate-change.html)

This is not "news" to us, but it is sociologically significant. Even in the homeland of climate change denial, with all the funding given to reach a pre-determined conclusion, science has the last word when it comes to the assessment of reality. The effects of global warming are now palpably evident -- the rising sea levels -- and so climate change "skepticism," if it is still maintained, can only be either hypocrisy, or philosophical skepticism about the existence of reality.
66.
[UK Parliament slams the BBC for succumbing to false balance on climate](http://arstechnica.com/science/2014/04/uk-parliament-slams-the-bbc-for-succumbing-to-false-balance-on-climate/#p3)
67.

Thanks for the news! I would like to get some guest posts about the new IPCC report. Steve Easterbrook has a very nice post about the 'physical science basis' part, which came out last year:

* Steve Easterbrook, [What Does the New IPCC Report Say About Climate Change?](http://www.easterbrook.ca/steve/2013/10/what-does-the-new-ipcc-report-say-about-climate-change/), _Serendipity_, 8 October 2013.

I've been meaning to get him to let me copy that to Azimuth. But the other parts, which came out more recently, are perhaps even more interesting!
68.

Jim said:

> Here's a great picture of denier interpolation. Perhaps it should be called "Lindzen's ruler"?

I have a question. Who would you consider the most highly-regarded and well-credentialed AGW skeptic? I was thinking of the one who is most intimidating to other climate scientists. Would you agree that it is [Richard Lindzen](https://en.wikipedia.org/wiki/Richard_Lindzen)?

He has actually contributed to IPCC reports and acts as a kind of bully against "climate alarmism".

---

It looks as if another step in Lindzen's ruler is occurring, thanks to the El Niño.

![ruler](http://1.bp.blogspot.com/-rRPfFl2DQw0/Vk03ci-V68I/AAAAAAAALJY/8SRy3hPk3v4/s400/global%2Bmean%2Bsurface%2Btemperature%2Bhadcrut4%2Band%2Bmodel.png)
69.

It could be Lindzen, or it could be Judith Curry.

70.
edited November 2015

> Who would you consider the most highly-regarded and well-credentialed AGW skeptic?

Freeman Dyson? He may no longer be that active, but he is certainly a highly respected scientist, and apparently did some serious work on climate from the 70s on.
71.
edited December 2015

I don't think Dyson did any serious work on climate science. At least I haven't seen it. He's certainly well-credentialed in many other ways, though.

(He came to give a talk at my college when I was an undergrad at Princeton. I was very intimidated, probably more than most of the students, since I knew what he'd done!)

72.
edited December 2015

I did say "apparently" :). Wikipedia claims that

> Around 1979, Dyson worked with the Institute for Energy Analysis on climate studies. This group, under the direction of Alvin Weinberg, pioneered multidisciplinary climate studies, including a strong biology group. Also during the 1970s, he worked on climate studies conducted by the JASON defense advisory group.

They cite a NY Times [article](http://www.nytimes.com/2009/03/29/magazine/29Dyson-t.html) that I have not actually read as the source.
73.
edited December 2015

> Who would you consider the most highly-regarded and well-credentialed AGW skeptic?

Well, of course global warming depends on how you define "global warming." There are of course spots on Earth where there is no warming, or even cooling. In the US, for instance, there are spots where it seems there was no warming. The question is how to count those against the ones which were/are warming.

The integral (= sum of values) of the anomaly curve over the respective measurement time is 0, by the definition of an anomaly (monthly deviation from the monthly average). So if I divide the time interval into two segments, the integral over one part will have a certain value and the integral over the other part will have the negative of that value, because the integrals over both parts have to add up to 0.

In the images below I computed the difference between the integral over the time interval before endyearofmeasurement-20 and the integral after it. So negative values mean that it was cooler in the last 20 years of measurement, and likewise positive values mean that there was a warming in the last 20 years of the respective measurement. The size of the integral says something about the steepness, or "how strong" the warming or cooling was. Here I used the CRUTEM 4 collection of temperature measurements from 4634 stations around the world. I plotted all anomalies with differences smaller than -300. There were 13 stations where that difference was smaller than -300; most of that data seems rather unusable, but in the last curve, in Australia, temperature anomalies went visibly down:

![anomalies smaller than minus 300](http://www.randform.org/blog/wp-content/2015/12/13anomsmallerminus300.jpg)

(Time of measurement indicated by the orange line.) There were 1267 stations where the difference was bigger than 300 (Oslo Blindern alone had about 600, as you can see at the number beneath the name of the station). Here is a screenshot of some of them, clearly showing a warming:

![anomalies bigger than 300](http://www.randform.org/blog/wp-content/2015/12/1264anombigger300.jpg)

It is unclear how many of the 1267 "steep warming" stations are unusable, but this small glimpse into the temperature file alone gives a rather strong indication that "some sort of global" warming is rather visible.
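The split-integral statistic described above can be sketched as follows (a hypothetical reconstruction of the idea, not nad's actual code; the function name and the 20-year tail measured in months are my assumptions):

```python
def split_integral_diff(anomalies, tail=20 * 12):
    # Sum of the monthly anomaly series over the last `tail` months minus
    # the sum over everything before that. Because the anomalies sum to
    # roughly zero by construction, the two parts are roughly negatives of
    # each other, so a large positive value indicates warming concentrated
    # in the final 20 years of the record.
    head = anomalies[:-tail]
    last = anomalies[-tail:]
    return sum(last) - sum(head)

# A toy series: 20 years of -1 anomalies followed by 20 years of +1.
toy = [-1.0] * 240 + [1.0] * 240
print(split_integral_diff(toy))  # 480.0
```

On this toy series the two halves integrate to -240 and +240, so the statistic is 480, a clear "warming in the last 20 years" signal in the sense used above.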