Consider an empirical "random variable," like the temperature T2000 at the North Pole on Jan 1, 2000.
To what extent can this truly be modeled as a random variable, in the technical sense of probability theory? For that we need a sample space S consisting of "experimental outcomes," a sigma-algebra of events (subsets of S), and a probability measure on S; a random variable is then a measurable function on S.
So what's the probability space underlying our variable T2000? Would S consist of all "conceivable" histories of the world, and T2000 the function which picks out the temperature at that point in space and time? But this would be a purely fictional construction -- who's to say what's in S and what's not -- and even more artificial would be the assignment of a probability measure to the events in S.
Yet without an underlying probability space, there's no way we could speak of, say, the variance of T2000.
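To make the dependence concrete: once you *do* posit a probability space, the variance falls right out. Here is a minimal toy sketch (the histories and temperatures are entirely made up, and S is shrunk to three outcomes so the power set can serve as the sigma-algebra):

```python
# Toy stand-in for the construction above: S is a tiny finite set of
# hypothetical "world histories", P is a probability measure on S
# (the sigma-algebra is the power set, so every function is measurable),
# and T2000 is a function S -> R giving the temperature in each history.

# Sample space: three hypothetical histories of the world.
S = ["cold_history", "mild_history", "warm_history"]

# Probability measure on S (invented weights summing to 1).
P = {"cold_history": 0.2, "mild_history": 0.5, "warm_history": 0.3}

# T2000 as a function on S: North Pole temperature (degrees C) in each history.
T2000 = {"cold_history": -35.0, "mild_history": -30.0, "warm_history": -25.0}

mean = sum(P[w] * T2000[w] for w in S)                # E[T2000]
var = sum(P[w] * (T2000[w] - mean) ** 2 for w in S)   # Var(T2000)

print(mean, var)  # -29.5 12.25
```

The point being: every number here (the mean, the variance) is an artifact of the choice of S and P, which is exactly the arbitrariness the question is about.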