Quasi-biennial oscillation

In her blog, Nadja Kutz wrote:

This concerns a discussion on Azimuth. I found that the temperature anomaly curve, which describes the global combined land [CRUTEM4] and marine [sea surface temperature (SST)] temperature anomalies (an anomaly is a deviation from a mean temperature) over time (HADCRUT4-GL), has a two-year periodicity (for more details click here). The dots in the image above are meant to show why I think so. The dark line drawn over the jagged anomaly curve is the mean curve. The grey strips are one year wide. A dot highlights a peak (or at least an upward bump) in the mean curve. More precisely, there are:

18 red dots, which mark peaks within a grey 2-year interval

5 yellow dots, which mark peaks outside a grey 2-year interval (two yellow peaks are rather close together)

1 uncolored dot, which marks not a real peak, but just a bump

4 blue dots, which mark small peaks within troughs

One sees that the red and yellow dots cover more or less all the peaks in the curve (the blue dots take care of the minor peaks, and there is just one bump which is not a full peak). The fact that the majority of the red and yellow dots are red means that there is a peak every 2 years, with a certain imprecision indicated by the width of the interval.

Upon writing this post I saw that I forgot one red dot. Can you spot where?

Especially after doing this visualization, this periodicity appears to me so visible that I think it should be a widely known phenomenon; however, at Azimuth nobody has heard of it yet. If it's not a bug, then I could imagine that it is at least partially due to differences in the solar irradiance for the northern and southern hemispheres, but this is so far just a wild guess and would need further investigation, which would cost me a lot of (unpaid) time and brain. So if you know what this phenomenon is called, then please drop a line. If it's not a bug, then this phenomenon appears to me to be an important fact, which may, among other things, enter into the explanation of El Niño.

I just learned about this phenomenon, which could be the answer:

This is a time–height plot of monthly-mean, zonal-mean equatorial zonal wind (u) in m/s between about 20 and 35 km (22 mi) altitude above sea level over a ten-year period. Positive values denote westerly winds, and the contour line is at 0 m/s.

Comments

  • 1.

    I put a stubby article on the wiki:

    • Quasi-biennial oscillation

  • 2.
    nad
    edited June 2014

    I just learned about this phenomenon, which could be the answer:

    Quasi-biennial oscillation

    The quasi-biennial oscillation (QBO) is a quasiperiodic oscillation of the equatorial zonal wind between easterlies and westerlies in the tropical stratosphere with a mean period of 28 to 29 months.

    There is no Wikipedia entry for "equatorial zonal wind"; anyway, it seems to be a wind above the trade winds, which are responsible for El Niño.

    Too bad that the measurements in those FU data sets which you provided cover only a rather brief period. On this diagram they actually look almost triennial! Thanks to Wikipedia user Pierre cb, but this time–height plot from FU Berlin was unfortunately not overly instructive. That is, it would be more instructive to have a look at the overall currents together with their placements. In which direction do these circulations go? FU Berlin has some diagrams (http://strat-www.met.fu-berlin.de/products/cdrom/html/section5.html), but what's displayed there? I don't know; are there other measurements? The QBO data set (http://www.pa.op.dlr.de/CCMVal/Forcings/WMO2010/qbo_ccmval2/qbo_index.html) is a closed link at FU Berlin. I think something should be done about this.

    Anyway, John, thanks for the link and for the attention to these oscillations. This QBO is an indication that my observation is not a bug. However, I don't see at the moment that the QBO provides an explanation.

  • 3.

    I guess the URLs are posing the XML problem. Is it the underline? The markdown+itex help below doesn't work.

  • 4.

    The problem was that you included an extra </a>, which I removed.

  • 5.
    edited June 2014

    FU Berlin has some diagrams, but what's displayed there?

    Do you mean these? (http://strat-www.met.fu-berlin.de/products/cdrom/fig/fig30.gif)

    You can see what these are by reading the webpage.

    These charts display monthly mean zonal winds as a function of time (horizontal axis) and height (vertical axis) at these locations: Canton Island, 3°S/172°W (Jan 1953 - Aug 1967), Gan/Maledive Islands, 1°S/73°E (Sep 1967 - Dec 1975) and Singapore, 1°N/104°E (since Jan 1976). The contour lines are at 10 m/s intervals, and westerlies (= winds blowing east) are shaded, so the unshaded regions are winds blowing west.

    It looks like these winds occur at quite high altitudes, since the lowest altitude marked is 18 kilometers.

    This data is from

    • Naujokat, B., 1986: An update of the observed quasi-biennial oscillation of the stratospheric winds over the tropics. J. Atmos. Sci., 43, 1873-1877.

    I can get the abstract (http://adsabs.harvard.edu/abs/1986JAtS...43.1873N), but when I try to get the article I get an error message, even when I have cookies enabled.

  • 6.
    edited June 2014

    However I don’t see at the moment that the QBO provides an explanation.

    I don't understand much about climate science, so I can't say how the QBO explains the biennial variation you're seeing... but I predict that it will turn out to be the explanation.

  • 7.
    nad
    edited June 2014

    Do you mean these?

    You can see what these are by reading the webpage.

    These charts display monthly mean zonal winds as a function of time (horizontal axis) and height (vertical axis) at these locations: Canton Island, 3°S/172°W (Jan 1953 - Aug 1967), Gan/Maledive Islands, 1°S/73°E (Sep 1967 - Dec 1975) and Singapore, 1°N/104°E (since Jan 1976). The contour lines are at 10 m/s intervals, and westerlies (= winds blowing east) are shaded, so the unshaded regions are winds blowing west.

    I meant that "what's displayed there?" in a more general sense. That is, I have trouble extracting (at least for me) useful information from those plots. If I understand correctly, mean wind speeds at three rather differently geolocated stations are indicated via contour plots (where I can hardly read some of the numbers, which seem to indicate those speeds; moreover, they look as if they sit on the contour lines rather than within the contours, and then there are images without numbers), so in principle those plots seem to want to say which mean wind speed (over three rather different places!) occurred at which height at a given time. Should I infer from this that the wind speed at a certain height is about the same at the Maledives, Canton and Singapore? Because otherwise this plot would appear to me rather absurd.

    I don’t understand much about climate science, so I can’t say how the QBO explains the biennial variation you’re seeing… but I predict that it will turn out to be the explanation.

    I don't understand much about climate science either, but I think the QBO will not be the explanation. First, the temperature oscillation seems to occur every two years; for the QBO I am not so sure, since the above time interval is rather short and, as said, it looks as if the QBO oscillation takes more than two years. Secondly, if they are correlated, then the (major) cause will, I think (as already said), be somewhere else, like some planetary cause. I have, though, no idea which cycle has a 2-year frequency; however, at this stage I wouldn't want to exclude that it is the usual annual solar cycle, "blurred" for example by some complicated interaction with methane, methanogens and methanotrophs (we had the discussion about those species in conjunction with climate earlier, and you had just made a Google+ post about that).

  • 8.

    The wind speeds were measured at Canton Island, 3°S/172°W from Jan 1953 to Aug 1967, at the Gan/Maledive Islands, 1°S/73°E from Sep 1967 to Dec 1975, and at Singapore, 1°N/104°E after Jan 1976. You can look for discontinuities in 1967 and 1976 if you want... I don't see any... so I guess winds at these extremely high (stratospheric) altitudes are not extremely variable as a function of longitude, if we fix a given latitude. This seems reasonable to me, since there isn't much "weather" in the stratosphere.

    Anyway, separate from this issue, I think it would be great for you to write an Azimuth blog post about the 2-year periodicity you seem to have discovered. If the periodicity is really 24 months instead of 28-29, then it's probably not the QBO. I don't know any 2-year periodic phenomenon in the Earth's climate, but if you ask maybe some expert will show up and tell us what's happening.

  • 9.

    If you want to write a short blog post on Azimuth, you could just include a nice graph, explain how you got the data to create that graph, and ask what's going on. After all, this 2-year periodicity is interesting quite separately from the original question of whether greenhouse gas concentrations lead or lag temperature rises.

    This image is too small for me to see well, but otherwise it's nice: http://www.randform.org/blog/wp-content/2014/06/TempAnom2Year450.jpg (linked from http://www.randform.org/blog/?p=5572)

    I think it would be even nicer if you left out the colored dots.

    I had sent you the image in the normal way, as I always do, but I have lately had quite some problems with my email, and I haven't yet figured out what to do about it, so frankly I don't know now how to send you the image. I am currently thinking about whether I should upload it somewhere else, and where, so that it can be linked to.

    You had included the image in the email itself, instead of creating an "attachment". If you can send it as an attachment, or put it anywhere and tell me the URL, that would be great.

  • 10.

    These charts display monthly mean zonal winds

    ...

    You can look for discontinuities in 1967 and 1976 if you want…

    So there should be one value per month, which means some interpolation was going on, and that may cover discontinuities.

    Anyway, separate from this issue, I think it would be great for you to write an Azimuth blog post about the 2-year periodicity you seem to have discovered.

    I already wrote one on our blog (http://www.randform.org/blog/?p=5572); one could copy this if you want.

  • 11.
    edited June 2014

    If you can give me a bigger version of that graph, I'll do it. I'll probably rewrite the article a bit and show it to you (and everyone).

  • 12.
    nad
    edited June 2014

    I think it would be even nicer if you left out the colored dots.

    Yes, because you know what's meant by "a peak of the mean curve falls into every grey interval".

    That is, the perception of nice or not nice depends on the audience.

    Since there are eventually also some not so science-language-trained readers, I tried to explain that sentence with the dots. That is, a lot of readers won't be sure what is meant by "mean curve" and "peak"; some might even not know what is meant by the grey time interval and what it means for a peak to fall into an interval. I am not sure whether my explication makes it more understandable for a large share of those who do not understand, but at least I try. If I had more time for this, I would eventually do it in other ways. But apart from that, with the dots you have an easily visible quantification of how well the peaks fall into the intervals.

    Anyway, as said, this periodicity appears to me as something that should be widely known, and I expect that some comment on such a blog post could provide a link to the lecture notes of a first-year course in "major components in understanding global warming", or what do I know, if people would bother to comment at all. They might, because you have some voice in the science community.
    And frankly, if it weren't the case that no Azimuthers so far seem to have heard about it, I would still have left it as a side comment in what I had written. I mean, you don't want to pester other people with trivialities. These are very busy people, who in the US even seem to receive death threats (http://www.bloomberg.com/news/2012-09-10/climate-scientists-face-organized-harassment-in-u-s-.html), and in Australia? It seems:

    CLAIMS that some of Australia's leading climate change scientists were subjected to death threats as part of a vicious and unrelenting email campaign have been debunked by the Privacy Commissioner.

    That is, in Australia scientists were subject to "only" (http://blogs.telegraph.co.uk/news/jamesdelingpole/100155441/lying-climate-scientists-lie-again-about-death-threats-this-time/) what counts as "the danger to life or physical safety in this case to be only a possibility, not a real chance".

    This all sounds quite nerve- and time-consuming.

  • 13.
    edited June 2014

    Yes, because you know what's meant by "a peak of the mean curve falls into every grey interval".

    Actually it's because the picture is so small that I can barely see those dots: they just make it harder for me to see what's going on. If the picture were 4 times bigger the dots might look good. (It would be too wide for the Azimuth blog, but we could do the old "Click to enlarge" trick).

    Anyway, as said, this periodicity appears to me as something that should be widely known, and I expect that some comment on such a blog post could provide a link to the lecture notes of a first-year course in "major components in understanding global warming", or what do I know, if people would bother to comment at all.

    I think this would be good. I've never heard of this 2-year-periodic phenomenon. I'm far from an expert on climate science, but I think I know most of the material in that first-year course. So if this effect really exists, it's something we should talk about: it's not something that everyone knows about.

    I hope someone will take this temperature data, perhaps subtract off the linear trend, and then run it through a Fourier transform or windowed Fourier transform and see what frequencies have a lot of power. If I get better at using R, I could do this pretty quickly. I think my student Blake could already do it quickly.
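
    A minimal R sketch of this pipeline, run on a toy stand-in series rather than the real data (the synthetic signal and all names here are illustrative assumptions, not the actual HadCRUT4 workflow):

     # Toy monthly series: annual cycle + weak linear trend + noise
     # (984 months, the count appearing in the R code quoted in comment 28).
     n <- 984
     x <- sin(2 * pi * seq_len(n) / 12) + 0.001 * seq_len(n) + rnorm(n, sd = 0.3)

     d    <- resid(lm(x ~ seq_len(n)))   # subtract the linear trend
     mod  <- Mod(fft(d - mean(d)))       # modulus of the Fourier transform
     freq <- (seq_len(n) - 1) / n        # frequency, in cycles per month

     # Plot the first half of the spectrum; the peak should sit near 1/12.
     plot(freq[2:(n %/% 2)], mod[2:(n %/% 2)], type = "l",
          xlab = "frequency (1/months)", ylab = "|FFT|")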

  • 14.
    edited June 2014

    You got your temperature data from here, right?

    • http://www.cru.uea.ac.uk/cru/data/temperature/HadCRUT4-gl.dat

    How does this file work? It starts like this:

     1850 -0.690 -0.279 -0.728 -0.565 -0.322 -0.215 -0.130 -0.234 -0.439 -0.455 -0.191 -0.265 -0.374
     1850     22     20     18     19     18     20     22     22     23     23     24     25
     1851 -0.295 -0.346 -0.468 -0.441 -0.304 -0.189 -0.216 -0.159 -0.111 -0.054 -0.021 -0.056 -0.219
     1851     23     22     20     21     20     20     21     23     18     20     18     20
     1852 -0.301 -0.455 -0.492 -0.561 -0.206 -0.040 -0.013 -0.205 -0.132 -0.218 -0.193  0.092 -0.223
     1852     23     22     22     23     23     23     23     23     21     21     22     25
     1853 -0.169 -0.324 -0.312 -0.346 -0.272 -0.176 -0.060 -0.148 -0.404 -0.363 -0.247 -0.424 -0.268
     1853     25     26     26     24     25     27     26     29     28     29     26     27
     1854 -0.352 -0.274 -0.271 -0.346 -0.228 -0.209 -0.230 -0.167 -0.114 -0.193 -0.366 -0.229 -0.243
    

    What are the big numbers like 25 and what are the little numbers like -0.352?

  • 15.
    edited June 2014

    The little numbers are 12 monthly temperature anomalies, followed by an annual temperature anomaly, relative to 1961-1990. The big numbers are the corresponding data coverage as a percentage of the Earth's surface area.

  • 16.

    Thanks, Nathan!

  • 17.
    edited June 2014

    As practice doing basic stuff with R, I put the HadCRUT4-gl monthly global temperature anomalies into a comma-separated-value file here:

    • http://math.ucr.edu/home/baez/ecological/HadCRUT4-gl/HadCRUT4_global_monthly_temperatures_1850-2014.csv

    So, this is mainly a list of 1968 = 12 × 164 numbers, which are monthly temperature anomalies from 1850 to 2013.

    I will now enjoy doing some stuff with these numbers.
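
    A hedged R sketch of such a conversion, assuming the alternating anomaly/coverage row layout shown in comment 14 (the file and column names are illustrative, not necessarily the ones actually used):

     # Keep the anomaly rows (the odd ones), drop the year and the trailing
     # annual mean, and write one row per month. An incomplete final year
     # would yield NA entries here.
     raw     <- readLines("HadCRUT4-gl.dat")
     rows    <- strsplit(trimws(raw[seq(1, length(raw), by = 2)]), "\\s+")
     fields  <- lapply(rows, as.numeric)
     years   <- vapply(fields, `[`, numeric(1), 1)
     monthly <- t(vapply(fields, function(f) f[2:13], numeric(12)))
     df <- data.frame(year    = rep(years, each = 12),
                      month   = rep(1:12, times = length(years)),
                      anomaly = as.vector(t(monthly)))
     write.csv(df, "HadCRUT4_monthly.csv", row.names = FALSE)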

  • 18.
    edited June 2014

    Here's what they look like, as plotted by Alok Tiwari (http://math.ucr.edu/home/baez/ecological/HadCRUT4-gl/HadCRUT4_global_monthly_temperatures_1850-2014.png):

    I wonder if the seemingly more jagged nature of the left part of the graph is due to more spotty coverage of world temperatures.

  • 19.
    nad
    edited June 2014

    As practice doing basic stuff with R, I put the HadCRUT4-gl monthly global temperature anomalies into a comma-separated-value file here

    I just sent you a very unchecked, just-hacked-in Fourier version of those temperature mean values, however done in JavaScript. It displays an annual and a biannual peak. Maybe you can check whether you get the same.

  • 20.
    nad
    edited June 2014

    By the way, the four values for 2014, 0.507 0.304 0.544 0.641, are already quite high. It would be interesting to see the May value; do you know when this is posted?

  • 21.

    By the way, concerning your R investigations:

    There is a fringe conference (http://2014.okfestival.org/okfestival-fringe-events/) to the OKFestival called csv conf (http://csvconf.com/); one of the talks deals with an R package that might be useful for you:

    In this talk I will demo a new R package, testdat, that provides a suite of functions that allow users to unit test tabular data, much like unit testing for code. Our package allows researchers to write expectations, as one would do with code, and quickly identify cryptic issues, especially when reading large numbers of files. We also describe the major functionality of testdat along with a few use-cases and related tools. (full abstract)

    Both events are rather expensive, though. Too expensive for me.

  • 22.

    Yes, I think the large early variance is observation error; they publish error bars, so you can check this.

  • 23.

    I wrote:

    I just sent you a very unchecked, just-hacked-in Fourier version of those temperature mean values, however done in JavaScript. It displays an annual and a biannual peak. Maybe you can check whether you get the same.

    I am not sure whether I expressed myself understandably. What I am saying here is that the Fourier transform of the mean of the temperature anomalies in the HADCRUT4 anomaly set mentioned here displays a periodicity of 1 and/or 2 years (this is a discrete transform), if there is no bug (which there rather could be, as it is not thoroughly checked). This would confirm, in a sort of mathematical way, what I wrote above about what one can see in the visualization.

  • 24.
    The QBO appears more related to ENSO. I have been thinking that ENSO is a fluid-dynamic oscillation while the QBO is an aerodynamic oscillation, possibly with the same root periodic forcing function.
    The fundamental period of the QBO appears very close to the aliased or folded period of the synodic or sidereal lunar month, as I describe here:
    http://contextearth.com/2014/06/17/the-qbom/
  • 25.

    Nad wrote:

    I just sent you a very unchecked, just-hacked-in Fourier version of those temperature mean values, however done in JavaScript. It displays an annual and a biannual peak. Maybe you can check whether you get the same.

    I put the material you sent here:

    • http://math.ucr.edu/home/baez/ecological/nad/

    You can link to these if you want.

    I changed the picture file from .jpeg to .jpg. It's this: http://math.ucr.edu/home/baez/ecological/nad/Fourierscreenshot.jpg

    I'm worried that the large signal with period 2 years is just part of a general spike at high frequencies. If you only show the Fourier transform for periods that are integer multiples of years, a spike at high frequencies will show up as a spike at periods 1 and 2. Since HadCRUT gives monthly temperatures, we are able to study its Fourier transform at any period that's an integer multiple of months.

  • 26.

    Blake Pollard did some more Fourier transforms of HadCRUT4-gl 1948-1979 global temperature anomalies after removing the linear trend. You can find these files and related ones at:

    • http://math.ucr.edu/home/baez/ecological/HadCRUT4-gl/

    and they're explained in the README file (http://math.ucr.edu/home/baez/ecological/HadCRUT4-gl/README.txt). Blake's are these:

    5) TempAnomaliesPlotsWithAndWithoutTrend.jpg is a plot of the temperature anomalies showing a linear trend line, and a plot of the "detrended" temperature anomalies, that is, with the linear trend subtracted.

    6) DetrendedNotSmoothedFFTModulus.jpg is a plot of the modulus of the Fourier transform of the detrended temperature anomalies.

    7) ZoomedInDetrendedNotSmoothedFFTModulus.jpg is a zoomed-in version of the image in 6.

    8) DetrendedSmoothedFFTModulus.jpg is a plot of the modulus of the Fourier transform of a smoothed version of the detrended temperature anomalies. Smoothing tends to damp the Fourier transform at high frequencies.

    9) ZoomedInDetrendedSmoothedFFTModulus.jpg is a zoomed-in version of the image in 8.

    Blake Pollard writes:

    If I am interpreting the results correctly, that first big peak after the low-frequency stuff is at about .0834 in units of 1/months. This corresponds to a 11.99 month period, so a year. The next peak is at about .1646 1/months or at about a 6 month period.

    There is too much going on at low-frequencies. If you look at the 'ZoomedInDetrendedNotSmoothedFFTModulus', you'll see a plot zoomed into low frequencies. A two year period would be around .042. There are some things going on there but nothing compelling. The two big peaks above .4 right next to each other are at a frequency of .023 1/month or 43.25 month period.

  • 27.
    nad
    edited June 2014

    Thanks for looking at this.

    A two year period would be around .042.

    I don't know what this FFT modulus algorithm does exactly, but if I go by what is written on the x-axis, and if I assume that the values on the x-axis are all equidistant, then there is a peak around 0.042, and it is slightly higher than the 6-month one, even if it is not as big as the annual one. In particular, to the left of the 0.05 value there are five distinguishable peaks, which are at around (I have to use a ruler on the screen...) 0.047, 0.042, 0.036 and 0.031, with 0.036 being the largest of that group; that would point to a period of 28 months rather than 24 (i.e. 2 years), which could point to the QBO that WebHubTelescope is favouring. And then the fifth is the biggest, at 0.028, which points to a 3-year periodicity. The next big peak, at around 0.023, corresponds to around 3.7 years, which could be interpreted as rather belonging to a 4-year periodicity; measuring at higher and lower frequencies probably makes not so much sense?

    By the way, why doesn't the FFT modulus start at zero?

  • 28.
    edited June 2014

    Hello John and hello everyone

    I was bored today, so I decided to read some code here; I hope that is OK.

    R code: http://math.ucr.edu/home/baez/ecological/HadCRUT4-gl/BiennialOscillation.R

    Q1: Where are these libraries used in the code?

     library("Rwave")
     library("WaveletCo")

    Q2: 'lm' is a linear regressor and I wonder why it was used? Most of the data is seriously nonlinear. Is it supposed to serve as a rough trend?

     lintemp <- lm(temp$x ~ temp$X)
     x <- resid(lintemp)

    Q3: Why fit a spline to the residuals of the linear regression, and then take the FFT? Is the spline used for smoothing?

     smoothtemp <- spline(temp$X, resid(lintemp), 984/2)
     smoothft <- fft(smoothtemp$y - mean(smoothtemp$y))

    I hope you do not mind me reading code; I do that all the time, it's the best way to learn.

    Dara

  • 29.
    edited June 2014

    Hello John and hello everyone

    QBO Anomalies Wavelet Analysis: http://files.lossofgenerality.com/qboANOMALIESpdf.pdf

    I did what the Pollard fellow was attempting to do in his R code (though I found no wavelet calls), but mine is completely based on wavelets, of two kinds, i.e. Gabor and DGaussian.

    As you can see, the data is noisy and there are multiple trends present. To handle that, one has to use high degrees for Gabor and DGaussian, correspondingly 30 and 80.

    I plotted the actual wavelets so you could see how they work; they look like fast-fading vibrations, shaped that way to capture the minute frequencies in the noisy data.

    Then I ran the wavelets both on the RAW signal and on the denoised signal. The results are consistent in all cases. DGaussian gives striped bands, but the analysis is similar in results.

    You get frequency bands with periods of 30, 38-50 and 70+ months. Ignoring the 70+, we have some idea about the periodicity. (CORRECTION: The 38-50 month periods are more apparent closer to 2013.)
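
    Not the Mathematica code used here, but a rough base-R sketch of the Gabor-wavelet idea (the envelope width, scales and test signal are illustrative assumptions):

     # Gabor (Morlet-style) wavelet power of a signal x at a given period:
     # a complex sinusoid under a Gaussian envelope, slid along the series.
     gabor_power <- function(x, period, width = 2) {
       t <- seq(-3 * period, 3 * period)
       w <- exp(-(t / (width * period))^2) * exp(2i * pi * t / period)
       n <- length(x)
       sapply(seq_len(n), function(i) {
         idx <- i + t
         ok  <- idx >= 1 & idx <= n
         Mod(sum(x[idx[ok]] * w[ok]))    # |inner product with shifted wavelet|
       })
     }

     # Toy check: a 30-month cycle in noise lights up at period = 30.
     x <- sin(2 * pi * seq_len(600) / 30) + rnorm(600, sd = 0.5)
     plot(gabor_power(x, period = 30), type = "l", ylab = "wavelet power")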

    Dara

  • 30.

    Hello Blake and Dara,

    Dara wrote:

    Q2: ’lm’ is a linear regressor and I wonder why it was used? Most of the data is seriously nonlinear. Is it supposed to serve as a rough trend?

    I am just guessing; he may have wanted to get rid of the Fourier coefficients that are due to a linear trend. I think, though, that this doesn't affect the peaks much per se, but more or less just their height, in a continuous way.

    Blake and Dara:

    I currently don't have access to a computer algebra system. I found R rather repelling, and apart from this I had already extracted the HADCRUT4 temperature values with JavaScript and had done some calculations with them. So I have now tried to implement a fast Fourier transform in my own code: I didn't want to make it necessary to download math.js, and I found it useful for checking in a bit more detail how usable pure JavaScript is for math. As written previously, I had found peaks at the biannual and the annual periodicities with this; however, there was an error, so currently I can't really see the annual trend anymore, but the biannual one is still there. At this stage I still don't fully trust the code, so if you find the time to run a crosscheck, that would be interesting to see. I am still very convinced that there is a biannual trend, because of the visualization, but it could be that it is only roughly biannual, like e.g. the QBO; Blake's analysis seems to indicate this. I didn't subtract the linear trend as Blake did, and I applied the FFT to the annual mean of the temperature values. Moreover, the biannual peak is more visible with the temperature values starting around the 1950s, as these are more precise. The biannual peak is even better visible in the diff12 values (also visually, by the way, I find). The definitions of the annual mean (called "filter") and diff12 are here: http://www.daytar.de/art/co2ch4TempViz/index.html

  • 31.
    edited June 2014

    Hello John

    1. The problem with using 'lm' is not this case. I am developing code that will run on a server over millions of such pieces of data once the GPM and other satellites start issuing hourly data. Therefore we need to deploy algorithms and computations that are general enough. In this particular case of a 1-D well-behaved signal with reasonable noise, the lm option is all right; if you move to 2D and higher data (grid data for El Niño), lm won't work, and if the noise churns up, lm-based code might give confusing results.

    2. The outputs I just posted for the QBO anomalies show frequency bands with periods of roughly 24+ months.

    3. John, do you want to use Mathematica?

    4. John, do you want me to program the FFT in Mathematica and check the results against R?

    Dara

  • 32.

    Dear John

    If you would like to compare curves, please see a sample of machine learning techniques I used for comparing stock prices (1-D time-series). Look at the plots and you will get the idea:

    Knn, Trend, Noise & Correlation: http://files.lossofgenerality.com/mTrust2.pdf

    This is how it works:

    1. Break down the signal into Trend + Noise.
    2. Treat the time-series as a vector in a normed vector space of arrays, or in a function space (it does not matter which way you look at it).
    3. Choose a metric in the normed vector space of #2; e.g. I chose Correlation Distance (see p. 2).
    4. Run the Knn algorithm for nearest neighbors, i.e. find the n nearest vectors to a chosen vector (recall that the vectors are curves).
    5. If you do not know how to choose a single vector, then do K-Means clustering (see p. 7).

    I did experiments with the Pepsi stock price curve; review the results, it is fun to look at the plots.
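
    A small R sketch of steps 2-4 above on toy data (not the actual code behind the PDF; all names are illustrative):

     # Correlation distance between two curves: d(x, y) = 1 - cor(x, y).
     cor_dist <- function(x, y) 1 - cor(x, y)

     # Indices of the k rows of `series` nearest to `target` in this metric.
     knn_curves <- function(target, series, k = 3) {
       d <- apply(series, 1, cor_dist, y = target)
       order(d)[1:k]
     }

     # Toy usage: 20 random-walk "price curves"; find the 3 most similar
     # to the first one among the rest.
     set.seed(1)
     series <- t(replicate(20, cumsum(rnorm(100))))
     knn_curves(series[1, ], series[-1, ], k = 3)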

    I looked at Graham's code for correlation and covariance matrices; might I suggest that we do the same code, but combine it with K-nn and K-Means clustering?

    BTW, I cooked up a lot of these ideas with a dear friend called Michael Thorek; I just wanted to give him credit for his talents & contributions.

    Dara

  • 33.
    nad
    edited June 2014

    John,

    Could you please ask Dara (with Mathematica) and Blake (with R), if they have enough spare time, to do an FFT of the mean of the HADCRUT4 temperature anomalies, for all years and for all years starting in 1950 (including 1950), and of the corresponding diff12 values? I think it makes sense to have a look at the spectrum of the temperature anomalies, even if this has already been done elsewhere. It is good to compare with different software packages; this is rather important information. One should also look at other temperature anomalies. There could still be a bug in the HADCRUT4 anomaly data.
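
    A hedged R sketch of the requested quantities, assuming the CSV from comment 17 has an anomaly column and taking the "filter"/diff12 definitions from the page linked in comment 30 (both are assumptions):

     # Monthly anomalies, all years and from 1950 on.
     x   <- read.csv("HadCRUT4_global_monthly_temperatures_1850-2014.csv")$anomaly
     x50 <- x[((1950 - 1850) * 12 + 1):length(x)]

     # 12-month running mean ("filter") and 12-month difference (diff12).
     annual_mean <- stats::filter(x, rep(1 / 12, 12), sides = 2)
     diff12      <- diff(x, lag = 12)

     # Spectra of the diff12 series, and of the post-1950 values alone.
     spec   <- Mod(fft(diff12 - mean(diff12)))
     spec50 <- Mod(fft(x50 - mean(x50)))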

  • 34.

    Oh, I figured out why Pollard used a spline in his code; I got the idea from my huskies, who have two different-sized leashes, while I try to manage walking them.

    Let's say we have CO2 data of length 5000 numbers, and for the same period methane data of only 200 numbers. In order to compute covariance and correlation, the vectors need to have the same length (for dot products to make sense).

    Therefore what you do is: spline the methane data, and then re-discretize the spline curve, but this time with 5000 subdivisions of the interpolation parameter. Now you can do any operation that requires dot products or summation over the same index range.
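
    A toy R sketch of that resampling step (the lengths and series are made up):

     co2     <- cumsum(rnorm(5000))   # stand-in for a 5000-point CO2 series
     methane <- cumsum(rnorm(200))    # stand-in for a 200-point methane series

     # Spline the short series, re-discretize it at 5000 points, correlate.
     methane5000 <- spline(seq_along(methane), methane, n = 5000)$y
     cor(co2, methane5000)            # the dot product now makes sense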

    D

  • 35.
    edited June 2014

    Hello John and hello everyone

    With regards to Pollard's code and analysis, Pollard wrote (posted by John in this thread):

    If I am interpreting the results correctly, that first big peak after the low-frequency stuff is at about .0834 in units of 1/months. This corresponds to a 11.99 month period, so a year. The next peak is at about .1646 1/months or at about a 6 month period.

    I read his code carefully and re-coded it in Mathematica and got similar plots; if John wishes, I will publish them here. But there is an odd issue with Pollard's analysis above:

    He is visually estimating the PERIODICITY OF PERIODICITY, not the PERIODICITY of the original data.

    The plots he presents are, according to his R code:

    1. x-axis: frequency = (month index)/N, where N = total number of months
    2. y-axis: amplitude/N, i.e. the modulus of the complex number returned by the FFT divided by N, so you get an idea of how much the frequency in #1 contributes to the original data/signal.

    So if you try to measure the distance between the amplitude peaks along the x-axis, you estimate the periodicity with which high-amplitude frequencies re-appear.

    And that is not the periodicity of the original temperature anomaly data.

    His R code is fine insofar as it finds the frequencies of a somewhat denoised signal, but his analysis seems flawed, or I could be misunderstanding his English.
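    For reference, here is a small R sketch of the axis convention described in the list above, using a synthetic monthly signal (all names are illustrative):

    ```r
    # Synthetic monthly signal: an annual cycle plus noise.
    n <- 600                             # total number of months
    x <- sin(2 * pi * (1:n) / 12) + rnorm(n, sd = 0.3)

    f <- (0:(n - 1)) / n                 # x-axis: frequency in 1/months
    a <- Mod(fft(x)) / n                 # y-axis: FFT modulus, scaled by n

    # The position of a single peak on the frequency axis gives the
    # period of the original data: period (months) = 1 / frequency.
    k <- which.max(a[2:(n %/% 2)]) + 1   # skip the zero-frequency term
    1 / f[k]                             # about 12 months for this signal
    ```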

    Dara

  • 36.
    edited June 2014

    Blake Pollard (one of my grad students) is certainly not looking for "periodicity in periodicity". He's just looking for peaks in the modulus of the Fourier transform of the detrended monthly HadCRUT4-gl temperature data. He's not claiming that these peaks occur in a periodic manner!

    He found peaks at frequencies of .0834 and .1646 in units of 1/months. These correspond to periods of 11.99 months and 6.07 months, so these are things we'd expect from an annual cycle. (An annual cycle that's not a sine wave will have "overtones" - peaks at frequencies that are integer multiples of 1/year.)
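    A quick way to see such overtones is with a deliberately non-sinusoidal synthetic annual cycle (an illustration in R, not the HadCRUT4 analysis itself):

    ```r
    # A sawtooth-like annual cycle: period 12 months, far from a sine wave.
    n <- 1200
    x <- ((1:n) %% 12) / 12

    a <- Mod(fft(x)) / n
    f <- (0:(n - 1)) / n

    # The largest peaks sit at the fundamental and its overtones.
    idx <- order(a[2:(n %/% 2)], decreasing = TRUE)[1:3] + 1
    f[idx]       # about 0.083, 0.167, 0.250 in 1/months
    1 / f[idx]   # periods of about 12, 6, 4 months
    ```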

  • 37.
    edited June 2014

    I will try to get Blake to visit and answer these questions, but in the meantime I will answer them:

    Q1: Where are these libraries used in the code?

    library(“Rwave”)

    library(“WaveletCo”)

    Maybe not at all. I asked Blake to do a Fourier transform or windowed Fourier transform of the detrended monthly HadCRUT4-gl temperature data. He considered doing a windowed Fourier transform (Gabor transform), but he hasn't done it yet.

    Q2: ’lm’ is a linear regressor and I wonder why it was used? Most of the data is seriously nonlinear. Is it supposed to serve as a rough trend? lintemp <- lm(tempx ~ tempX); x <- resid(lintemp)

    I asked him to subtract the linear trend. Perhaps that was dumb, but I wanted him to do something simple. Fancy curve-fitting will tend to affect higher-frequency components of the Fourier transform, so then one has to be more careful that one isn't destroying the data one is trying to analyze! For example, some of the deviations from a linear trend could be due to the Atlantic Multidecadal Oscillation.... while others are due to changes in human carbon emissions.
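    The detrending step under discussion is just this pattern in R (a sketch with a placeholder series, not Blake's exact script):

    ```r
    # Placeholder monthly series: linear trend plus an annual cycle.
    month <- 1:600
    temp  <- 0.002 * month + sin(2 * pi * month / 12) + rnorm(600, sd = 0.2)

    # Fit a straight line and keep only the residuals: the detrended series.
    fit       <- lm(temp ~ month)
    detrended <- resid(fit)

    # Everything downstream (FFT, smoothing, ...) acts on `detrended`.
    plot(month, detrended, type = "l")
    ```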

    Q3: Why fit a spline to the residuals of the linear regression, and then take the FFT?

    I don't think smoothing the temperature data with a spline before taking the Fourier transform is a good idea. It seems like an undisciplined way of damping out the Fourier transform at high frequencies. I don't know why that would be a good thing to do.

    Let’s say we have CO${}_2$ data of length 5000 numbers, and for the same period Methane data of only 200 numbers. In order to do covariance and correlation the vectors need to have same length (for dot products to make sense).

    That might (or might not) be a good idea, but Blake was not studying several different time series, just one: the detrended monthly HadCRUT4-gl temperature data from [here](http://math.ucr.edu/home/baez/ecological/HadCRUT4-gl/).

  • 38.
    edited June 2014

    Nad wrote:

    Could you please ask Dara (with Mathematica) and Blake (with R), if they have enough spare time, to do an FFT of the mean of the HadCRUT4 temperature anomalies for all years, for all years starting in 1950 (including 1950), and of the corresponding diff12 values?

    What do you mean by "do an FFT of the mean of the temperature anomalies for all years"? Do you mean 1) get the file [HadCRUT4-gl](http://www.cru.uea.ac.uk/cru/data/temperature/HadCRUT4-gl.dat), 2) take the mean temperature anomaly for each year (which is actually the last number in each odd-numbered row), and then 3) take the Fourier transform of the resulting series of numbers?

    Dara is here, so you can ask him to do things... I can ask Blake to do something like this when I'm sure I understand what you want. Maybe even I could do it!

    I think it makes sense to have a look at the spectrum of the temperature anomalies, even if this has already been done elsewhere.

    Blake already did this starting from the monthly data, and I think the monthly data is much better than the annual data, for a reason I've [already mentioned](http://forum.azimuthproject.org/discussion/1375/quasibiennial-oscillation/?Focus=11181#Comment_11181): if we use annual data, a discrete-time Fourier transform can only detect periodicities at integer multiples of years. So, you might see a 2-year cycle when it's really a 1.8-year cycle.
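    Here is a small R demonstration of the kind of distortion annual sampling can produce (the 1.8-year cycle is synthetic):

    ```r
    # A pure 1.8-year cycle, sampled once per year for 200 years.
    t <- 1:200                      # years
    x <- sin(2 * pi * t / 1.8)

    a <- Mod(fft(x)) / length(x)
    f <- (0:199) / 200              # frequency in cycles per year

    k <- which.max(a[2:100]) + 1
    1 / f[k]                        # apparent period: a bit above 2 years,
                                    # even though the true period is 1.8
    ```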

    The main way I think Blake might improve his work so far is to not "detrend" the data: not subtract a linear trend. It was my idea to subtract that linear trend.

    There could still be a bug in the HadCRUT4 anomaly data.

    I doubt there's a serious "bug" in this data, because this is one of the most [carefully scrutinized](https://en.wikipedia.org/wiki/HadCRUT#History_of_CRU_Climate_Data) pieces of data in the world - or at least, in climate science. But it would be fun to compare things with the other biggies, like [GISTEMP](http://data.giss.nasa.gov/gistemp/) and [Berkeley Earth](http://berkeleyearth.org/land-and-ocean-data).

  • 39.
    edited June 2014

    Nad wrote:

    By the way why doesn’t the FFT Modulus start at zero?

    It does! I hope you know that "modulus" means "absolute value" and "FFT" means "fast Fourier transform". So, it starts at zero.

    I think you should be asking "why is 0 above the horizontal axis in these graphs?"

    <img src = "http://math.ucr.edu/home/baez/ecological/HadCRUT4-gl/DetrendedNotSmoothedFFTModulus.jpg" alt = "FFT modulus of the detrended, unsmoothed HadCRUT4-gl data"/>
  • 40.

    I'll try to clarify some things.

    John's answers were spot on here.

    Q1: Where are these libraries used in the code? library(“Rwave”) library(“WaveletCo”)

    I didn't use those to create the posted images.

    Q2: ’lm’ is a linear regressor and I wonder why it was used? Most of the data is seriously nonlinear. Is it supposed to serve as a rough trend? lintemp <- lm(tempx ~ tempX); x <- resid(lintemp)

    That was to detrend: the residuals are the original data minus the linear trend. It might not be the best idea, especially because it looks like the linear trend really only picks up about halfway through the time series. The global coverage of the data also increases as time goes on, which relates to what John mentioned earlier about the jaggedness of the earlier data.

    Q3: Why spline?

    A cubic spline was a cheap way of smoothing the data, not a way of reconciling vectors of different lengths. I guess nad had done some smoothing and found a peak at a period of 2 years, and I wanted to see if smoothing somehow made the 2-year peak stand out more. It didn't change much. I think R has other spline-based smoothing with more parameters to play around with. The blessing and the curse of using a package-rich language such as R is that you can do lots of different things quickly, but since you don't write the implementations you really don't know what is happening to the data. You can read the code for the functions, but that is like trying to learn something really complicated from someone else's notes instead of figuring it out for yourself.
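    That smoothing step looks roughly like this in R (smooth.spline is one spline-based smoother with a tuning parameter; the series here is a placeholder, not the HadCRUT4 data):

    ```r
    # Placeholder noisy monthly series.
    month <- 1:600
    x <- sin(2 * pi * month / 24) + rnorm(600, sd = 0.5)

    # Cubic smoothing spline; `spar` (between 0 and 1) controls how
    # aggressively the high-frequency wiggles are damped.
    sm <- smooth.spline(month, x, spar = 0.6)

    plot(month, x, type = "l", col = "grey")
    lines(sm, lwd = 2)
    ```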

    The 'desmoothed' plots don't use any splines.

    I was just trying to do some quick analysis to see if I could reproduce evidence for a 2-year cycle. I also tried using windowed Fourier transforms, but I need to think about the parameters (window, frequency resolution) to get something that makes sense. I think wavelets are a good idea. I also think something like splitting the data into a trend, oscillation, and noise is a good idea and shouldn't be too hard to do.

    I think the real thing to do is to think about what it is we are trying to explain/understand about the data and then try to figure out what kind of analysis is appropriate for that.

  • 41.

    So if you try to measure the distance between the amplitude peaks along the x-axis, you estimate the periodicity with which high-amplitude frequencies re-appear.

    It is true that high-frequency peaks re-appear in time, similar to peaks in the auto-correlation function. I think the relative amplitude of the peaks can tell you something, but I'm not an expert on this stuff yet.

  • 42.

    Thank you, John and Mr. Pollard. I wrote some code to match Pollard's computations, but in Mathematica, plus some more material for Nad.

    However, I do not know how to post images here, nor how to post code.

    I will post pdf and jpg links from my ftp server later today.

    John, these are important topics: how to smooth the data and how to remove frequencies from, or add frequencies to, the signal in order to preprocess it for forecasting and correlation.

    It will become quite difficult when we need to handle the GRIDs, which are 2D and 3D signals. Fortunately the FFT and wavelet libraries do support multidimensional data, but it is hard to code correctly.

    Will be back shortly; please SHOOT HOLES into my arguments and code :)

    D

  • 43.
    edited June 2014

    Hello John

    As Nad asked, I wrote Mathematica code similar to Pollard's R script, save for the spline usage:

    PDF

    [QBO Anomaly FFT Filtering](http://files.lossofgenerality.com/qboFFTPDF.pdf)

    CDF

    [QBO Anomaly FFT Filtering](http://mathematica.lossofgenerality.com/2014/06/28/qbo-anomaly-fft-filtering/) (it might take a few seconds to load...)

    I added some code at the beginning to show how a Gaussian convolution can be used to build a filter that denoises the signal via the FFT. This is how classical image-processing and signal-processing applications worked.

    I then wrote a little FFT BANDPASS FILTER which starts from 0 and moves towards 1, filtering frequencies as it goes; you can see that around 0.03 a nice denoised curve shows up, whose peaks you can count to obtain a periodicity of about 40 months. The same range was found by the wavelet code I posted earlier ([LINK](http://forum.azimuthproject.org/discussion/1367/darya-shaydas-el-nino-visualization/?Focus=11146#Comment_11146)).

    The latter figure of 30-40 months is a range, and there are other ranges depending on how the signal is denoised. If you study the signal you notice that it has several trends, so estimates of the periodicity need to take the multi-trend ranges into consideration and issue a more sophisticated estimate rather than a single number.

    Dara

  • 44.

    I took the dynamic step function from 0 to 1 and multiplied it by FFT(qbo), where qbo is the raw temperature anomaly series, then did an inverse FFT on the resulting product and used only the real part (Re) to reconstruct a denoised signal. This whole thing is a bandpass filter that lets the investigator generate a huge number of denoised signals and pick the one whose shape is most visually appropriate for understanding the periodicity of the raw signal.

    I suspect you could then count the peaks in Month~Temperature space to get the periodicity, with no need to count peaks in Frequency~Amplitude space.
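    In R the reconstruction described above looks roughly like this (a sketch, not Dara's Mathematica code; qbo stands in for the raw anomaly series and `cutoff` for the step-function threshold):

    ```r
    # Placeholder raw anomaly series.
    n   <- 600
    qbo <- sin(2 * pi * (1:n) / 40) + rnorm(n, sd = 0.5)

    spec <- fft(qbo)
    f    <- (0:(n - 1)) / n

    # Step-function filter: keep only frequencies below `cutoff`,
    # mirrored at the top end since the FFT of real data is symmetric.
    cutoff <- 0.03
    keep   <- (f < cutoff) | (f > 1 - cutoff)

    # Inverse FFT of the masked spectrum; take the real part to
    # reconstruct a denoised signal, then count peaks in month space.
    denoised <- Re(fft(spec * keep, inverse = TRUE)) / n

    plot(qbo, type = "l", col = "grey")
    lines(denoised, lwd = 2)
    ```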

  • 45.
    nad
    edited June 2014

    John wrote [here](http://forum.azimuthproject.org/discussion/1375/quasibiennial-oscillation/?Focus=11181#Comment_11181):

    I’m worried that the large signal with period 2 years is just part of a general spike at high frequencies. If you only show the Fourier transform for periods that are integer multiples of years, a spike at high frequencies will show up as a spike at periods 1 and 2. Since HadCRUT gives monthly temperatures, we are able to study its Fourier transform at any period that’s an integer multiple of months.

    As I said, it was hacked in, and this diagram was not intended for publication, but more or less just an update about what I am currently doing. As I wrote to you, I would have liked the visualization with the dots to be put into the wiki, not that intermediate diagram. In particular, as I had explained in the email, the diagram is supposed to show only the real part of the discrete Fourier transform, i.e. it might be that not all peaks appear, and that those which do are not as visible as with the full transform. The reason for displaying only the real part was to get an image quickly without installing a package. I have meanwhile started to implement the imaginary part as well. Moreover, after having implemented the code in a hurry, I was so tired that I mixed up high and low frequencies. That is, I think the annual and biannual peaks are not even in the diagram (I currently think they should be at N/12 and N/24, but I don't remember the N I used in the diagram), and I had already corrected that in an email to you. Please ask me before you publish email content from me to you.

    John wrote [here](http://forum.azimuthproject.org/discussion/1375/quasibiennial-oscillation/?Focus=11220#Comment_11220):

    I hope you know that “modulus” means “absolute value” and “FFT” means “fast Fourier transform”. So, it starts at zero.

    I did know that, more or less; in particular I knew that FFT means fast Fourier transform, but I crosschecked on Wikipedia. I still haven't looked much at the various algorithms, though, and I currently don't feel like doing so.... I often crosscheck math terms with Wikipedia, and unfortunately they are not always correct and/or the same as I think they should be. In particular, I told you that for the [discrete Fourier transform](http://en.wikipedia.org/wiki/FFT), which is what the FFT computes, I actually remembered a normalization factor of $\frac{1}{\sqrt{N}}$, which Wikipedia doesn't use. So I am a bit unsure about this. It doesn't affect the behaviour of the peaks, though.

    So, it starts at zero.

    No it doesn't: it looks very unlikely that the zeroth term is zero; rather, it might get too big, which may be why it has been clipped.
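    This is easy to check numerically: the zeroth FFT coefficient is simply the sum of the series, so for anomaly data with a nonzero mean it can dwarf every other term (a one-line check in R, with x a placeholder series):

    ```r
    x <- rnorm(100, mean = 0.5)   # anomaly-like data with a nonzero mean
    fft(x)[1]                     # the zero-frequency term ...
    sum(x)                        # ... equals the plain sum of the data
    ```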

    Dara is here, so you can ask him to do things… I can ask Blake to do something like this when I’m sure I understand what you want. Maybe even I could do it!

    Dara didn't initially respond to my question [here](http://forum.azimuthproject.org/discussion/1375/quasibiennial-oscillation/?Focus=11204#Comment_11204). Moreover, I thought that Blake might not be reading the forum anymore, since you had posted his calculations.

    What do you mean by "do an FFT of the mean of the temperature anomalies for all years"? Do you mean 1) get the file HadCRUT4-gl, 2) take the mean temperature anomaly for each year (which is actually the last number in each odd-numbered row), and then 3) take the Fourier transform of the resulting series of numbers?

    1) I posted a link [here](http://forum.azimuthproject.org/discussion/1375/quasibiennial-oscillation/?Focus=11204#Comment_11204) explaining what I mean by a mean. It is, I think, what is called a running mean, which, if I remember correctly, was dubbed "filter" in the Humlum et al paper:

    $$filter(q_i) := \frac{1}{12}\sum_{j=0}^{11} q_{i-5+j}$$ That is, I do no spline but this type of averaging.

    2) So it is this running mean I was referring to, and not the annual mean:

    $$\langle q \rangle = \frac{1}{12}\sum_{j=0}^{11} q_j, \qquad q_j \;\text{being a January value}$$ 3) I wanted to see the Fourier transform of this running mean and of the diff12 values.
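    For concreteness, both quantities can be computed in R along these lines (a sketch based on the formulas above; q is a placeholder monthly series):

    ```r
    # Placeholder monthly anomaly series.
    q <- sin(2 * pi * (1:600) / 24) + rnorm(600, sd = 0.3)

    # Centered 12-month running mean, matching filter(q_i) above:
    # the window runs from q[i-5] to q[i+6].
    running_mean <- stats::filter(q, rep(1 / 12, 12), sides = 2)

    # diff12: each month minus the same month one year earlier.
    diff12 <- diff(q, lag = 12)

    # Spectra of both (dropping the NAs at the ends of the running mean):
    rm_clean <- running_mean[!is.na(running_mean)]
    spec_rm  <- Mod(fft(rm_clean)) / length(rm_clean)
    spec_d12 <- Mod(fft(diff12))   / length(diff12)
    ```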

    Dara [wrote](http://forum.azimuthproject.org/discussion/1375/quasibiennial-oscillation/?Focus=11231#Comment_11231):

    As Nad asked, I wrote Mathematica code similar to Pollard's R script, save for the spline usage

    Thanks Dara, I looked at the file, but I have trouble understanding what you were doing there; is QBO the HADCRUT4 temperature anomaly data (i.e. the values in every second row, without the last one, of the HADCRUT4 file)?

    My husband is here for the weekend, so eventually I may get to use his Mathematica to make a comparison myself, but I am not sure. I currently feel that I have had enough of this. I still think, though, that what looks like a 2-year cycle might be important.

  • 46.

    Yes, Nad, it is the every-second-row data. If you use the CDF file in the browser, in the middle of the file you will see the interactive bandpass filter, which clarifies everything.

    Then you could question the counting of the peaks and so on.

  • 47.

    You also do not need Mathematica to view the CDF file, just the free CDF Player.

    The wavelet analysis I posted earlier shows that there is a frequency band in the data, towards the present, which has a 2-year cycle; the FFT for the entire period shows a 40-month cycle (if I count the peaks).

    Because the analysis assumes waves in the data, we could have several waves of different frequencies adding up to the original data, i.e. you could have both 2-year and 40-month cycles at the same time, varying through the life of the data.

    FFT is deficient in that regard, since it assumes infinite support for the waves (the entire time span of the data), whereas wavelets assume compact support (localization in time). Therefore the latter model the data better.

    Furthermore, wavelets and FFT both show multiple trends in the data. This needs special thinking: the data is not simply TREND + NOISE; the trend breaks into more trends. This perhaps reflects much more complex dynamics than we can grasp from a cursory visualization of the data.
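    A crude way to see the localization point without any wavelet package (a base-R sketch; a proper version would use the wavelet tools mentioned earlier):

    ```r
    # First half of the record: a 24-month cycle; second half: a 40-month cycle.
    x <- c(sin(2 * pi * (1:600) / 24), sin(2 * pi * (1:600) / 40))

    peak_period <- function(y) {
      a <- Mod(fft(y))
      n <- length(y)
      k <- which.max(a[2:(n %/% 2)]) + 1
      n / (k - 1)                  # dominant period, in months
    }

    peak_period(x)                 # one global answer for the whole record
    peak_period(x[1:600])          # about 24 months
    peak_period(x[601:1200])       # about 40 months: localized in time
    ```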

    D

  • 48.

    The CDF file is [20 MB](http://www.cru.uea.ac.uk/cru/data/temperature/), which is quite a lot for our current internet connection. Therefore I didn't download it, only [HadCRUT4-gl.dat](http://www.cru.uea.ac.uk/cru/data/temperature/HadCRUT4-gl.dat). By "in the middle of the file", do you mean your [pdf](http://files.lossofgenerality.com/qboFFTPDF.pdf)? It seems you multiply the frequencies by some modified Gaussian and then do the Fourier transform backwards. That's some kind of filter - I am not sure whether it would be called a [band-pass filter](http://en.wikipedia.org/wiki/Band-pass_filter), since I didn't understand your cutting procedure. But apart from this, I don't understand why using a band-pass filter would clarify everything.

  • 49.

    Nad

    As John has mentioned, what I am more than willing to do is code and use some of the interactive technologies to investigate equations, data, or the dynamical behaviour of some part of this research's data. Please don't be shy to ask; I love this part of trying to make sense of data.

    What I do not like to do is interpret the results and make a theory of what happens in nature, because I would need to look at large amounts of data under many example conditions to gain experience, which I currently lack.

    Dara

  • 50.

    Nad

    The CDF I am referring to is here:

    [http://mathematica.lossofgenerality.com/2014/06/28/qbo-anomaly-fft-filtering/](http://mathematica.lossofgenerality.com/2014/06/28/qbo-anomaly-fft-filtering/)

    It is only 400 KB.

    For the pdf, please go to page 9 to see what I mean by the bandpass filter.

    Ignore the Gaussian convolution; I was just trying to show methods used by old-style investigations (kinda showing off ...).

    D
