
# Blog - El Niño project (part 4)

I'm starting a blog article summarizing Graham's attempts to replicate the work of Ludescher et al. This should include details of how to download the necessary data, what R programs Graham used or wrote, known ways in which his work differs from that of Ludescher et al, and a side-by-side comparison of the graphs.

There's nothing here yet, so I'll say when there is.

1.

It might be more informative to display that data as scatterplots of signal vs nino-index N months later.

2.
edited July 2014

True! This blog article has very limited goals - nothing about evaluating the algorithm that Ludescher et al propose, merely saying how Graham replicated it, and how close his replication comes to the original. But we eventually want to create our own better algorithm, and then what you describe might be interesting.

(I don't think there's anything special about "6 months" in Ludescher's approach: they simply say that if S exceeds a certain threshold there will be an El Niño in the next calendar year. That's a rather loose kind of prediction.)
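For concreteness, the prediction rule as described could be sketched like this in R (a hypothetical helper, not code from the paper; the threshold is left as a parameter):

```r
# Sketch of the rule described above: if the signal S exceeds the
# threshold theta at some point during a year, predict an El Nino
# event for the following calendar year.
predict.elnino.years <- function(S, years, theta) {
  sort(unique(years[S > theta])) + 1
}

# Toy example: signal values tagged with the year they occur in.
S     <- c(1.9, 2.1, 3.0, 2.0, 1.8, 2.9)
years <- c(1972, 1972, 1972, 1973, 1974, 1975)
predict.elnino.years(S, years, theta = 2.8)  # 1973 1976
```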

3.

Graham: I'd like to put on GitHub a version of netcdf-convertor.R that takes their settings as defaults, instead of Scotland 1957-1958. This is not to denigrate the importance of Scotland! I just want to make it really easy for people to understand how you're replicating Ludescher et al.

I have put such a version here:

* netcdf-converter-ludescher.R: http://math.ucr.edu/home/baez/ecological/el_nino/netcdf-convertor-ludescher.R

but maybe you can check that I haven't messed it up. I've tried to change just this part:

###############################################################################
###############################################################################

# You should be able to use this by editing this section only.
# The defaults here are those used to replicate the paper by
# Ludescher et al, 2013.

# Choose your working directory for R here:

setwd("C:/YourWorkingDirectory")

# Choose your latitude and longitude range

lat.range <- 24:50
lon.range <- 48:116

# Supply the years

firstyear <- 1950
lastyear <- 1979

# Supply the output name as a text string. paste0() concatenates strings,
# which you may find handy:

outputfilename <- paste0("Pacific-", firstyear, "-", lastyear, ".txt")

###############################################################################
###############################################################################

#                  Explanation

# 1. Use setwd() to set the working directory to the one
# containing the .nc files such as air.sig995.1951.nc.
# Example:
# setwd("C:/Users/Work/AAA/Programming/ProgramOutput/Nino")

# 2. Supply the latitude and longitude range.  The NOAA data is
# every 2.5 degrees. The ranges are supplied as the number of steps of this
# size. For latitude, 1 means North Pole, 73 means South Pole. For longitude,
# 1 means 0 degrees East, 37 is 90E, 73 is 180, 109 is 90W or 270E, 144 is 2.5W.

# These roughly cover Scotland.
# lat.range <- 13:14
# lon.range <- 142:143

# This is the area used by Ludescher et al, 2013. It is 27x69 points,
# which are then subsampled to 9 by 23.
#lat.range <- 24:50
#lon.range <- 48:116

# 3. Supply the years.  These are the years for Ludescher et al:
# firstyear <- 1950
# lastyear <- 1979

# 4. Supply the output name as a text string. paste0() concatenates strings
# which you may find handy:
# outputfilename <- paste0("Pacific-", firstyear, "-", lastyear, ".txt")

###############################################################################
###############################################################################

#                      Example of output

# S024E048 S024E049 S024E050 S024E051 S024E052 S024E053 S024E054 [etc.]
# Y1950P001 277.85 279.8 281.95 282.77 283.7 285.57 287.37 288.2 [etc.]

# There is one row for each day, and 365 days in each year (leap days are
# omitted). In each row, you have temperatures in Kelvin for each grid
# point in a rectangle.


The main downside is that the example output - what you get with these parameters - consists of very long lines.
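As a sanity check on the index convention in the Explanation section, here is a small hypothetical helper (not part of netcdf-convertor.R) that converts grid indices to degrees, assuming index 1 is 90N / 0E with steps of 2.5 degrees:

```r
# Convert NOAA grid indices to coordinates: 2.5-degree steps,
# latitude index 1 = North Pole (90N), longitude index 1 = 0E.
lat.of.index <- function(i) 90 - (i - 1) * 2.5
lon.of.index <- function(j) (j - 1) * 2.5

lat.of.index(c(1, 73))    # 90 -90    (North Pole to South Pole)
lon.of.index(c(37, 144))  # 90 357.5  (90E and 2.5W)
lat.of.index(c(24, 50))   # edges of the latitude band above
lon.of.index(c(48, 116))  # edges of the longitude band above
```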

4.
edited July 2014

Graham - remember this business?

> The angle brackets are defined as a running average over the 365 days, so this quantity involves data going back twice as long: 730 days. Furthermore, the ‘link strength’ involves the above expression where $\tau$ goes up to 200 days.

> So, taking their definitions at face value, Ludescher et al could not actually compute their ‘link strength’ until 930 days after the surface temperature data first starts at the beginning of 1948. That would be late 1950. But their graph of the link strength starts at the beginning of 1950!

Your program ludescher.R (https://github.com/azimuth-project/el-nino/blob/master/R/grj/ludescher.R) wants to eat data from a file Pacific-1950-1979.txt. So I guess you're using temperature data that starts in 1950, rather than 1948.

That goes along with what Ludescher et al say: they're using "atmospheric temperature data for the 1950–2011 period." But even if they didn't fall into the "running average of running averages" trap, with data starting in 1950 their graph could only start 565 days later! Right?

That would be around 1952. And your graph does start in 1952, right?

So, my theory is that despite what they say, they actually use data starting in 1948! That would let their graph start in 1950.

I suppose all this is more annoying than important. But I'm running ludescher.R now, so this stuff is on my mind.
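The day counts in the argument above, assuming 365-day years with leap days omitted (as in the converted data files):

```r
# Running average of running averages: 2 x 365 days of history,
# plus tau going up to 200 days in the link-strength formula.
days.full <- 2 * 365 + 200  # 930 days after the data starts
# A single 365-day running average plus tau:
days.min  <- 365 + 200      # 565 days after the data starts
c(days.full, days.min)      # 930 565
```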

5.

I've been using 1950-1979, but not because I thought that's what Ludescher et al did; rather, I didn't care much about exactly what they did. For the blog article, it may be best to use exactly what they did, which I think is the 33 years 1948-1980.

6.

Okay, I'll do that. Where is the file nino34-anoms.txt?

7.

I would like to make the default versions of ludescher.R and netcdf-convertor.R mimic Ludescher et al's work as closely as possible, e.g. using data from 1948 to 1980. It's not ultimately very important, but it will look nicer if we can replicate them well before moving on.

8.
edited July 2014

> Where is the file nino34-anoms.txt?

The URL is the first line in the file. Which is handy if you have the file, but not so good for you, sorry! Here is the link:

http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/ensostuff/detrend.nino34.ascii.txt

It begins:

YR   MON  TOTAL ClimAdjust ANOM
1950   1   24.84   26.26   -1.42
1950   2   25.22   26.53   -1.31
1950   3   26.04   27.09   -1.04


... and because I added a first line, I added the skip parameter in

find.nino.plotting.info <- function(firstyear, lastyear, miny, maxy) {
  nini <- read.table("nino34-anoms.txt", skip=1, header=TRUE)

You could either add a first line to your copy of nino34-anoms.txt, or remove ", skip=1".
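The effect of skip=1 can be seen on a throwaway file with a URL as its first line:

```r
# Write a small file in the same shape as nino34-anoms.txt: a URL line,
# then a header, then data. skip=1 makes read.table ignore the URL line
# so that header=TRUE picks up the real column names.
tmp <- tempfile()
writeLines(c("http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/ensostuff/detrend.nino34.ascii.txt",
             "YR MON TOTAL ClimAdjust ANOM",
             "1950 1 24.84 26.26 -1.42"), tmp)
nini <- read.table(tmp, skip = 1, header = TRUE)
nini$ANOM  # -1.42
```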

9.
edited July 2014

test comment. I've been getting errors when posting comments. Can't reproduce them when I try. Something about invalid comment in red letters.

10.
edited July 2014

The file nino34-anoms.txt (with URL in first line) is now at

https://github.com/azimuth-project/el-nino/blob/master/R/grj/nino34-anoms.txt

11.

Thanks!

I would like to change ludescher.R and netcdf-convertor.R as mentioned earlier and put the changed versions on github. I bet I can figure out how to do it, but what's the etiquette of this? Should I somehow create my own directory and put these versions there?

12.

I am no expert on gitiquette, but a directory alongside grj and tanzer in R seems the obvious thing to do.

13.
edited July 2014

John and Graham

I took the data set above for anomalies:

* El Nino 34 Anomalies: Wavelet Analysis, http://files.lossofgenerality.com/nino34_nomalyPDF.pdf

John, please review the 2D plots under Scalograms. They render the decomposition of the 1-D signal into a set of components indexed e.g. {0,0,1}; the index has no meaning except to specify how deep the component sits in the decomposition tree.

Then I added the AMPLITUDE for the said decomposition index. This is similar to the amplitudes in decompositions into waves in quantum mechanics, so you should feel at home developing theories and computations based on the wavelets - except that wavelets have compact support.

The y-axis of the scalograms gives the period in months, and the x-axis gives time in months.

You can visually estimate the periods by matching the blotches of colour on the scalograms to the y-axis.

In reality these decompositions are done thousands of times per night for incoming data from satellites and research organizations. We have automated the process: you can decompose data of any size and dimension in this way and generate the documentation/computations in real time.

Dara

14.
edited July 2014

Hi, Darya! Thanks! I've copied your comment to a new thread; this thread is all about

* [[Blog - El Niño project (part 4)]]

The idea is that in these "Blog" threads we discuss and correct blog articles that are being written on the wiki.

15.
edited July 2014

Graham: how did you first find this Niño 3.4 data, and why did you choose to work with it?

* http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/ensostuff/detrend.nino34.ascii.txt

I can't find it here:

* Monthly atmospheric and SST indices, NCEP-NOAA, http://www.cpc.ncep.noaa.gov/data/indices/

a website that lists a variety of indices including Niño 3.4. I'm curious because the data you're using may have been "detrended" in some way, and Ludescher et al don't seem to say where they got their Niño 3.4 data.

16.

Let me compare Graham's Niño 3.4 data

* http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/ensostuff/detrend.nino34.ascii.txt

to this:

* http://www.cpc.ncep.noaa.gov/data/indices/ersst3b.nino.mth.81-10.ascii, linked to from http://www.cpc.ncep.noaa.gov/data/indices/ under "Monthly ERSST.V3B (1981-2010 base period) Niño 1+2 (0-10°South)(90°West-80°West) Niño 3 (5°North-5°South)(150°West-90°West) Niño 4 (5°North-5°South)(160°East-150°West) Niño 3.4 (5°North-5°South)(170-120°West)".

The second has more data: besides Niño 3.4 it has Niño 1+2, 3, and 4. But all I care about is whether their Niño 3.4 agrees!

17.
edited July 2014

The short answer is that they don't agree. The first one, which Graham used, starts like this:

YR   MON  TOTAL ClimAdjust ANOM
1950   1   24.84   26.26   -1.42
1950   2   25.22   26.53   -1.31
1950   3   26.04   27.09   -1.04


The second starts like this:

 YR   MON  NINO1+2  ANOM   NINO3    ANOM   NINO4    ANOM   NINO3.4  ANOM
1950   1   23.11   -1.57   23.74   -2.03   27.03   -1.29   24.83   -1.84
1950   2   24.20   -1.91   24.92   -1.57   27.15   -1.02   25.20   -1.63
1950   3   25.37   -1.13   26.33   -0.91   27.06   -1.23   26.03   -1.30


TOTAL in the first gives

24.84, 25.22, 26.04


while NINO3.4 in the second gives

24.83, 25.20, 26.03


Those are very slightly different - no big deal, though worth understanding.

But if we look at ANOM for both, the difference is much bigger. The first has

-1.42,  -1.31, -1.04


while the second has

-1.84, -1.63, -1.30


These are significantly different!
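Plugging in the values quoted above:

```r
# TOTAL (first file) vs NINO3.4 (second file): tiny differences.
total  <- c(24.84, 25.22, 26.04)
nino34 <- c(24.83, 25.20, 26.03)
round(total - nino34, 2)  # 0.01 0.02 0.01

# ANOM columns: much bigger differences.
anom1 <- c(-1.42, -1.31, -1.04)
anom2 <- c(-1.84, -1.63, -1.30)
round(anom1 - anom2, 2)   # 0.42 0.32 0.26
```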

18.
edited July 2014

There's a third source of Niño 3.4 data here:

* Niño 3.4 SST Index calculated from the HadISST1, http://www.esrl.noaa.gov/psd/gcos_wgsp/Timeseries/Data/nino34.long.data

Let me compare it. The obvious difference is that this one goes back to 1870! But let's compare the first three months of 1950 to the two previous sources.

1950    25.53   25.26   26.18   26.86   26.56   26.82   26.14   26.24   25.65   25.90   25.27   25.55


So, the first three months give

25.53, 25.26, 26.18


which are rather different from what we saw before - about 0.7 degrees higher in the first month! Also this chart only lists temperatures, not "anomalies".

MORAL: when someone says "Niño 3.4", sometimes we need to ask: "Which Niño 3.4?"

I'm curious how Graham found his Niño 3.4, and why he chose that one.

19.

I'm afraid I can't remember how I found the Niño 3.4 index that I used. I hadn't learned your MORAL. It seems very close to what Ludescher et al used, but I am only looking at graphs, not actual numbers.

20.
edited July 2014

Okay. You mentioned small discrepancies, and this could be a source. It probably doesn't matter much. But sometimes being nitpicky has taught me interesting lessons, and this time it's that MORAL: people don't agree on what Niño 3.4 is.

The frustrating thing is that I haven't found any website that links to the Niño 3.4 index that you used! For the purposes of this blog article I'm willing to work with any reputable source of Niño 3.4, but saying "we found it on a NOAA ftp site" seems unsatisfying. What if someone's kid put it there?

21.

The discrepancies I was talking about earlier were in the calculated signal S(t), not the Niño 3.4 index.

22.

Okay.

23.
edited July 2014

I managed to find how I found it. There's a link to the data from http://www.cpc.noaa.gov/products/analysis_monitoring/ensostuff/ONI_change.shtml

(Just noticed ONI_change in the URL.)

24.
edited July 2014

Thanks! Great!

I had just been searching under "ClimAdjust", a term in the file you used... and that led me to a conversation (http://forum.arctic-sea-ice.net/index.php?topic=730.910;wap2) in which someone said this:

I've been using Mark's marine weather forecasts for over 15 years now I've found  them to very accurate & reliable. Nonetheless, I agree that the values cited by Mark are a bit too low but only SLIGHTLY. The historical values of the ONI computed by NOAA's CPC have slightly changed due to an update in climatology (I can't say for sure but I believe this is why the values cited by Mark were low). Below is the monthly Nino 3.4 index for 1997/1998 using the new strategy. For a detailed description of changes to the ONI you can check out the following link.

http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/ensostuff/ONI_change.shtml

Monthly Niño-3.4 index

 YR   MON  TOTAL ClimAdjust ANOM
1997   1   26.09   26.68   -0.58
1997   2   26.47   26.84   -0.38
1997   3   27.15   27.34   -0.19

...


I checked and noticed that this data matched "your" file...

This is like detective work.

25.
edited July 2014

Okay, good! When I first compared "Graham's" data (from the US National Weather Service) to the NCEP-NOAA data, I said:

> But if we look at ANOM for both, the difference is much bigger.

This is now explained: the US National Weather Service is computing the anomaly in a way that takes global warming into account: it's the temperature minus the average temperature on that day in a 30-year moving window!

I'm putting notes on this stuff here:

* ENSO - Data, http://www.azimuthproject.org/azimuth/show/ENSO#Data

so we don't go insane as we learn about more and more kinds of climate data.

Okay, back to writing the blog article... which means, running a version of Graham's software that's adjusted to match Ludescher et al as closely as I can manage.
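A minimal sketch of that kind of anomaly calculation (toy code, not the CPC's actual procedure, which uses centred base periods updated every five years): subtract from each monthly value the mean of the same calendar month over a trailing 30-year window.

```r
# anomaly[i] = temp[i] minus the mean of the same calendar month
# over the trailing 30-year window ending at year[i].
moving.anomaly <- function(temp, year, month, window = 30) {
  sapply(seq_along(temp), function(i) {
    base <- month == month[i] & year > year[i] - window & year <= year[i]
    temp[i] - mean(temp[base])
  })
}

# With constant temperatures the anomaly is zero everywhere:
moving.anomaly(rep(25, 40), 1951:1990, rep(1, 40))
```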

26.

I need to remember to mention how you take the temperatures and "subsample" them on a 23 × 9 grid. I didn't say anything about that in the last article.

27.

I take means over 3x3 squares from the NOAA data. Is that clear?
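A minimal sketch of that subsampling (hypothetical code, not Graham's actual implementation):

```r
# Subsample a grid by taking the mean over each 3x3 block.
block.means.3x3 <- function(m) {
  stopifnot(nrow(m) %% 3 == 0, ncol(m) %% 3 == 0)
  nr <- nrow(m) %/% 3
  nc <- ncol(m) %/% 3
  out <- matrix(NA_real_, nr, nc)
  for (i in 1:nr)
    for (j in 1:nc)
      out[i, j] <- mean(m[(3*i - 2):(3*i), (3*j - 2):(3*j)])
  out
}

# The 27 x 69 Pacific grid becomes 9 x 23:
dim(block.means.3x3(matrix(rnorm(27 * 69), 27, 69)))  # 9 23
```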

28.
edited July 2014

Yes, I think so - I was mainly just reminding myself to talk about it.

Tonight I rewrote ludescher.R so it uses data from 1948 to 1980 instead of 1950 to 1979. At first I ran into a bug. I had changed your mention of Pacific-1950-1979.txt to Pacific-1948-1980.txt, and changed

firstyear <- 1952
lastyear <- 1979


to

firstyear <- 1950
lastyear <- 1980


but the program didn't work until I also changed

w <- seq(from = (firstyear-1950)*365, to=dim(SAvals.3D.3x3)[1], by=step)


to

w <- seq(from = (firstyear-1948)*365, to=dim(SAvals.3D.3x3)[1], by=step)


I hope this was the right thing to do. I guess it was.

If this is right, I'm ready to write the blog article. I got this graph:

* http://math.ucr.edu/home/baez/ecological/el_nino/ludescher_replication.jpg

Note the slick-looking $\theta$ and ñ. A pretence to sophistication on my part, gleaned from a book on R.

29.

Yes, you were right to change 1950 to 1948 where you did.

> Note the slick-looking θ and ñ.

And the degree sign! And you've been playing with colours too!

30.
edited July 2014

I didn't play with the colors... sorry, not that sophisticated yet. They just came out looking different than yours, I guess.

31.
edited July 2014

Okay, this blog article is ready for comment and criticism:

* [[Blog - El Nino project (part 4)]]

At the end I have some feeble guesses about why our value of $S$ doesn't match Ludescher et al's. Unless you think these are really stupid, it would be nice to save discussion until I publish the blog article. Chatting about such minutiae on the blog is actually a good way to get people interested. I know that's weird, but it's true: many people like to talk about nitpicky details!

32.

Looks good.

> Today I’ll explain this stuff to people who know their way around computers. But I’m not one of those people! So, next time I’ll explain the nitty-gritty details in a way that may be helpful to people more like me.

I think that's a good approach, so good I was going to suggest it myself.

> smaller squares that are 1.5° × 1.5° in size:

should be 2.5 not 1.5

> 9 × 69

23 by 69

Same typo twice in both cases I think.

33.

Thanks for catching these typos!

23? That's not divisible by 3. You must mean 27, since we've got

lat.range <- 24:50
lon.range <- 48:116

34.
edited July 2014

Okay, I polished it up a bit more and posted it here:

* El Niño project (part 4), Azimuth Blog: http://johncarlosbaez.wordpress.com/2014/07/08/el-nino-project-part-4/

Next, some comic relief as I explain how I started using R in El Nino project (part 5).

After that I may post an article on covariances and correlations, featuring some of the images Graham and David generated. Also, some wavelet / principal component analysis images of the sort Dara and Daniel are creating.

Then maybe some general articles on El Niño prediction and/or climate networks - reviewing the literature as I get up to speed on it.
