For the Universal ENSO Proxy (UEP) data, I am finding that a tool such as Eureqa has a hard time generating fits with a correlation coefficient much above 0.4.
And any solution that nears 0.5 carries a very high complexity factor. My understanding is that the proxy records contain significant noise, and whatever lies in the remaining 0.5 correlation fraction verges on unfittable white noise.
The amount of noise can be estimated by looking at the variation in the data sets that go into the UEP. They do a good job of averaging across the ensemble reconstructions, but the standard deviation is high -- about the same value as the mean excursion.
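To make that spread-versus-excursion comparison concrete, here is a minimal sketch of the kind of check I mean. The ensemble here is synthetic stand-in data (an ENSO-like sinusoid plus noise), not the actual UEP reconstructions, and all names are illustrative:

```python
import numpy as np

# Hypothetical ensemble of proxy reconstructions: each row is one
# reconstruction, each column a yearly value (illustrative stand-in data).
rng = np.random.default_rng(0)
n_members, n_years = 10, 150
years = np.arange(n_years)
signal = np.sin(2 * np.pi * years / 45.0)   # stand-in multi-year ENSO-like cycle
ensemble = signal + rng.normal(scale=1.0, size=(n_members, n_years))

mean_series = ensemble.mean(axis=0)      # ensemble average (the UEP-style composite)
spread = ensemble.std(axis=0, ddof=1)    # member-to-member standard deviation

# Compare the typical member-to-member spread against the typical
# excursion of the averaged series about its own mean.
mean_excursion = np.abs(mean_series - mean_series.mean()).mean()
typical_spread = spread.mean()
print(typical_spread / mean_excursion)   # a ratio near or above 1 says noise ~ signal
```

When that ratio is around 1 or higher, as it appears to be for the UEP inputs, roughly half the variance is noise -- which lines up with a fit ceiling near 0.5 correlation.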
Dara, good luck seeing what you can do, as this is a real signal processing challenge -- extracting meaning out of what looks like randomness. If it were easy, someone would probably have done it already :)
But remember that we have a somewhat solid baseline to compare against (both the SOI and the ENSO proxy data) and some physical rationale for what may be happening. I would recommend not seeding any prior solutions, to see what an unbiased differential evolution strategy produces.
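As a sketch of the unseeded approach, assuming SciPy's `differential_evolution` as the optimizer: with no initial guess supplied, the default population initialization samples the bounds without bias toward any prior solution. The toy series and the sinusoidal candidate model below are illustrative stand-ins, not the actual UEP fit:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy stand-in for the proxy time series (the real UEP data would go here).
rng = np.random.default_rng(1)
t = np.arange(200, dtype=float)
y = np.sin(2 * np.pi * t / 45.0) + rng.normal(scale=1.0, size=t.size)

def neg_corr(params):
    """Negative correlation between a candidate sinusoid and the data."""
    period, phase = params
    model = np.sin(2 * np.pi * t / period + phase)
    return -np.corrcoef(model, y)[0, 1]

# No x0 and no seeded population: the search starts from an unbiased
# sampling of the parameter bounds.
result = differential_evolution(
    neg_corr,
    bounds=[(10.0, 100.0), (0.0, 2 * np.pi)],
    seed=2,
)
print(-result.fun)   # best correlation coefficient found
```

With noise comparable to the signal, the best achievable correlation here tops out well below 1 even when the model form is exactly right -- the same ceiling effect seen with the Eureqa fits.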