I have a basic question about moments of time series (say, the average temperature at some place).
Is there a mathematically rigorous way to define time-dependent moments of a time series? For example, I've heard the claim that "since 1987 the average temperature of place X has become one degree higher", but I have trouble comparing the time average before year $T$ to the time average after year $T$, because the result depends on the choice of $T$. (In the case of this claim, I suppose $T = 1987$ is the split that yields the maximal temperature difference.)
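To make the dependence on $T$ concrete, here is a minimal sketch with synthetic data (not real temperatures; `mean_shift` is just a name I made up for this illustration). For a series with a smooth nonlinear trend, the estimated before/after difference changes with the split year:

```python
import numpy as np

def mean_shift(values, times, T):
    """Difference: mean of the series at times >= T minus mean at times < T."""
    values = np.asarray(values, dtype=float)
    times = np.asarray(times)
    return values[times >= T].mean() - values[times < T].mean()

# Hypothetical series with a smooth quadratic trend and no jump anywhere.
years = np.arange(1950, 2020)
temps = 0.0005 * (years - 1950) ** 2

# The estimated "shift" varies with the split year T, even without any abrupt change:
for T in (1970, 1987, 2000):
    print(T, round(mean_shift(temps, years, T), 3))
```

So the before/after comparison on its own does not distinguish a genuine step change at $T$ from a slow trend, and scanning over $T$ to maximize the difference makes the estimate look larger than it should.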
I've read that for climate models one can use the ensemble mean: one runs an ensemble of model simulations (and assumes ergodicity to relate the ensemble average to the time average) and takes moments with respect to this ensemble. In that setting one can observe that the ensemble moments exhibit time-dependent behaviour.
But I'm wondering whether something similar is possible for just one time series. Is there a mathematically rigorous and meaningful way to examine whether some moments of a given time series are time-dependent?
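One standard approach for a single series (not necessarily the answer you're after, but a common starting point) is to estimate moments locally, under an assumed local-stationarity hypothesis: the series is treated as roughly stationary within each short window, and the windowed moments trace out time-varying estimates. A minimal sketch:

```python
import numpy as np

def rolling_moments(values, window):
    """Rolling mean and variance over a sliding window, as crude estimates
    of time-varying first and second moments. Assumes the series is
    approximately stationary within each window (local stationarity)."""
    values = np.asarray(values, dtype=float)
    means, variances = [], []
    for i in range(values.size - window + 1):
        chunk = values[i:i + window]
        means.append(chunk.mean())
        variances.append(chunk.var())
    return np.array(means), np.array(variances)
```

Whether the variation in these windowed estimates is "real" time dependence or just sampling noise is then a statistical question (the window length trades bias against variance), which is presumably where a rigorous formulation would have to come in.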