> It's confusing!
It's not _very_ confusing, because he is deliberately using the nonstandard notation \\(a . f\\) to mean "first do \\(a\\), then \\(f\\)". That's just a variant of the computer scientist's \\(a ; f\\). It only gets _very_ confusing when people use \\(a \circ f\\) to mean "first do \\(a\\), then \\(f\\)", because 99.5% of mathematicians use it to mean "first do \\(f\\), then \\(a\\)".
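The two conventions can be sketched in a few lines of Python (the helper names `compose` and `then` are mine, chosen just for illustration):

```python
def compose(f, a):
    """Standard mathematical order: (f ∘ a)(x) = f(a(x)) — first a, then f."""
    return lambda x: f(a(x))

def then(a, f):
    """Diagrammatic order (a ; f): first a, then f — same pipeline, read left to right."""
    return lambda x: f(a(x))

double = lambda x: 2 * x
inc = lambda x: x + 1

print(compose(inc, double)(3))  # inc(double(3)) = 7
print(then(double, inc)(3))     # also inc(double(3)) = 7, but written in execution order
print(compose(double, inc)(3))  # double(inc(3)) = 8 — swapping the arguments changes the result
```

The point of the diagrammatic order is exactly that the source text reads in the same direction the data flows.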
I'll admit to having been one of the 0.5%. But that was when I was young and didn't choose my battles wisely!