I came across a reference to a technique called Slow Feature Analysis (SFA).
> Figure (from the SFA paper): Input signal of the simple example described in the text. The left and center panels show the two individual input components, x1(t) and x2(t). The right panel shows the joint 2D trajectory x(t) = (x1(t), x2(t)).
It is an unsupervised learning technique that extracts slowly varying patterns from signals that otherwise look noisy or highly erratic.
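The linear version of the idea is compact enough to sketch: whiten the input, then find the projection whose time derivative has the smallest variance. This is a minimal illustration, not any particular published implementation, and the demo signals are made up:

```python
import numpy as np

def slow_feature_analysis(x, n_features=1):
    """Minimal linear SFA.

    Finds projections of the (T, d) input x whose time derivative
    has the smallest variance, i.e. the 'slowest' directions.
    """
    x = x - x.mean(axis=0)
    # Whiten the input so it has unit covariance
    cov = np.cov(x, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    whiten = evecs / np.sqrt(evals)     # columns scaled by 1/sqrt(eigenvalue)
    z = x @ whiten
    # Approximate the time derivative with finite differences
    zdot = np.diff(z, axis=0)
    # Slowest directions = eigenvectors of the derivative covariance
    # with the smallest eigenvalues (eigh returns them in ascending order)
    dcov = np.cov(zdot, rowvar=False)
    _, devecs = np.linalg.eigh(dcov)
    w = whiten @ devecs[:, :n_features]
    return x @ w
```

Fed a mixture of a slow sinusoid and a fast one, this recovers the slow component (up to sign and scale) even though neither raw input channel shows it cleanly.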
I have a feeling this is related to what I am doing with my ENSO analysis. The key step I find in fitting the ENSO model is to modulate the original signal with a biennially peaked periodic function. This emphasizes the forcing function in a way that is compatible with how one would formulate the sloshing physics. In other words, sloshing requires a Mathieu-equation formulation, and that builds a time-varying modulation directly into the differential equation.
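For concreteness, the Mathieu equation is f''(t) + [a - 2q cos(2t)] f(t) = 0, where the cosine term is the periodic (e.g. biennial) modulation of the restoring coefficient. The sketch below integrates it with a plain RK4 stepper; the parameter values are purely illustrative and are not the values from my ENSO fit:

```python
import numpy as np

def integrate_mathieu(a, q, f0, v0, t):
    """Integrate the Mathieu equation f'' + [a - 2 q cos(2 t)] f = 0
    with classical RK4. t is an array of time points; f0, v0 are the
    initial value and initial derivative. Returns f at each t."""
    def accel(ti, fi):
        # Time-varying restoring coefficient: the defining Mathieu feature
        return -(a - 2 * q * np.cos(2 * ti)) * fi

    f = np.empty_like(t)
    f[0], v = f0, v0
    for i in range(len(t) - 1):
        h = t[i + 1] - t[i]
        fi = f[i]
        k1f, k1v = v, accel(t[i], fi)
        k2f, k2v = v + h / 2 * k1v, accel(t[i] + h / 2, fi + h / 2 * k1f)
        k3f, k3v = v + h / 2 * k2v, accel(t[i] + h / 2, fi + h / 2 * k2f)
        k4f, k4v = v + h * k3v, accel(t[i] + h, fi + h * k3f)
        f[i + 1] = fi + h / 6 * (k1f + 2 * k2f + 2 * k3f + k4f)
        v = v + h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return f
```

With q = 0 the modulation vanishes and the equation reduces to a plain harmonic oscillator, which is a handy sanity check; turning q up produces the parametric bands that make sloshing behave so differently from a fixed-frequency oscillator.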
As it happens, the QBO analysis requires a similar approach. With QBO, the key is to model the acceleration of the wind rather than its velocity, because acceleration is the quantity that appears in the atmospheric flow physics.
No one in climate science is doing these kinds of transformations -- they all seem to do exactly what everyone else is doing, and so they get stuck in a lock-step dead end.
I recently watched the congressional EPA hearings -- "Make the EPA great again". The one witness who was essentially schooling the Republican know-nothings was Rush Holt, PhD, now CEO of the AAAS and formerly a physicist and congressman from New Jersey. My favorite bit of wisdom he imparted was that science isn't going to make progress by looking at the same data over and over again in the same way, but by *"approaching the problem with a new perspective"*.
Watch it here, starting at the 101-minute (1:41) mark of the hearing.
We have to look at the data in new ways -- that's science.