Recently Jacob Biamonte, Mauro Faccin and I have been working a bit more on this stuff.
We found that in the special case where the infinitesimal generator of the Markov process happens to be a Dirichlet operator, we regain the usual quantum version of Noether's theorem. The proof goes something like this:

Assume $H$ is an infinitesimal stochastic generator, i.e. a real square matrix with nonnegative off-diagonal entries and columns that sum to zero,
and $|\psi(t)\rangle$ is a normalized probability vector obeying the master equation
\[
\partial_t |\psi(t)\rangle = H |\psi(t)\rangle.
\]
$H$ generates a continuous-time Markovian process $e^{Ht}$ which maps the space of normalized probability vectors to itself:
$|\psi(t)\rangle = e^{Ht} |\psi(0)\rangle$.
Furthermore let $O$ be a real diagonal matrix representing an arbitrary observable.
The expected value of this observable is
\[
\langle O\rangle_{\psi} = \sum_{i \in X} \langle i | O |\psi\rangle = \langle \hat{O} | \psi \rangle,
\]
where $|\hat{O}\rangle$ is the vector of diagonal entries of $O$.
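This setup is easy to check numerically. The sketch below uses NumPy and SciPy with a made-up 3-state symmetric generator and observable: it evolves a probability vector under the master equation and evaluates $\langle O\rangle_{\psi} = \langle \hat{O} | \psi \rangle$.

```python
import numpy as np
from scipy.linalg import expm

# A made-up 3-state symmetric infinitesimal stochastic generator:
# nonnegative off-diagonal rates, each column summing to zero.
H = np.array([[-1.0,  1.0,  0.0],
              [ 1.0, -2.0,  1.0],
              [ 0.0,  1.0, -1.0]])

O = np.diag([1.0, 2.0, 3.0])       # an arbitrary diagonal observable
O_hat = np.diag(O)                 # |O-hat>, the vector of diagonal entries

psi0 = np.array([0.5, 0.3, 0.2])   # normalized initial probability vector

def expect(t):
    """<O>_{psi(t)} = <O-hat| e^{Ht} |psi(0)>."""
    return O_hat @ expm(H * t) @ psi0
```

Note that $e^{Ht}$ preserves normalization precisely because the columns of $H$ sum to zero.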

### Noether's theorem for symmetric Markov processes

If $H$ is an infinitesimal stochastic generator with $H = H^T$,
and $O$ is an observable, we have
\[
[O, H] = 0 \quad \iff \quad \partial_t \langle O \rangle_{\psi(t)} = 0
\quad \forall \: |\psi(0) \rangle, \forall t \ge 0.
\]
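Before the proof, here is a quick numerical illustration of both directions, using a hypothetical 4-state symmetric generator whose transition graph has two connected components: an observable that is constant on each component commutes with $H$ and has a constant expected value, while one that varies within a component does neither.

```python
import numpy as np
from scipy.linalg import expm

# Made-up symmetric generator whose transition graph has two
# connected components, {0,1} and {2,3}.
H = np.array([[-1.,  1.,  0.,  0.],
              [ 1., -1.,  0.,  0.],
              [ 0.,  0., -2.,  2.],
              [ 0.,  0.,  2., -2.]])

psi0 = np.array([0.7, 0.1, 0.1, 0.1])

def d_expect(O_diag, t):
    """d/dt <O>_{psi(t)} = <O-hat| H e^{Ht} |psi(0)>."""
    return O_diag @ H @ expm(H * t) @ psi0

O_sym = np.array([5., 5., 3., 3.])   # constant on each component
O_gen = np.array([1., 2., 3., 4.])   # varies within a component
```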
Proof.

Since $H$ is symmetric we may use a spectral decomposition
$H = \sum_k E_k |\upsilon_k \rangle \langle \upsilon_k |$.

As in John's proof, the $\implies$ direction is easy to show. In the other direction, we have
\[
\partial_t \langle O \rangle_{\psi(t)}
= \langle \hat{O} | H e^{Ht} |\psi(0) \rangle = 0 \quad \forall \: |\psi(0) \rangle, \forall t \ge 0
\]
\[
\iff \quad
\langle \hat{O}| H e^{Ht} = 0 \quad \forall t \ge 0
\]
\[
\iff \quad
\sum_k \langle \hat{O} | \upsilon_k \rangle E_k e^{t E_k} \langle \upsilon_k| = 0 \quad \forall t \ge 0
\]
\[
\iff \quad
\langle \hat{O} | \upsilon_k \rangle E_k e^{t E_k} = 0 \quad \forall t \ge 0
\]
\[
\iff \quad
|\hat{O} \rangle \in \mathrm{span}\{|\upsilon_k \rangle : E_k = 0\} = \ker H,
\]
where the third equivalence is due to $\{ |\upsilon_k \rangle \}_k$ being a linearly independent set of vectors.
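The final equivalence can be checked with an explicit eigendecomposition: for an observable whose diagonal vector lies in $\ker H$, the coefficients $\langle \hat{O} | \upsilon_k \rangle E_k$ all vanish. A sketch with an illustrative two-component generator and observable:

```python
import numpy as np

# Made-up symmetric generator with two components, {0,1} and {2,3}
H = np.array([[-1.,  1.,  0.,  0.],
              [ 1., -1.,  0.,  0.],
              [ 0.,  0., -2.,  2.],
              [ 0.,  0.,  2., -2.]])

E, U = np.linalg.eigh(H)             # spectral decomposition: H = U diag(E) U^T

O_hat = np.array([5., 5., 3., 3.])   # diagonal vector of a conserved observable
coeffs = U.T @ O_hat                 # <O-hat | v_k> for each eigenvector
```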


For any infinitesimal stochastic generator $H$ whose transition graph consists of $m$ connected components, we may
reorder (permute) the states of the system so that $H$ becomes block diagonal with $m$ blocks.
Now it is easy to see that the kernel of $H$ is spanned by $m$ eigenvectors, one for each block.
Since $H$ is also symmetric, the elements of each such vector can be chosen to be ones within the block and zeros outside it.
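For instance, with a made-up two-component symmetric generator, the component indicator vectors indeed span the kernel:

```python
import numpy as np

# Block-diagonal symmetric generator: two components, {0,1} and {2,3}
H = np.array([[-1.,  1.,  0.,  0.],
              [ 1., -1.,  0.,  0.],
              [ 0.,  0., -2.,  2.],
              [ 0.,  0.,  2., -2.]])

ones_1 = np.array([1., 1., 0., 0.])  # indicator of component {0,1}
ones_2 = np.array([0., 0., 1., 1.])  # indicator of component {2,3}
```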

Consequently $|\hat{O} \rangle \in \ker H$ implies that $O$ is constant on each connected component, so we can choose the eigenbasis of $O$ to be $\{|\upsilon_k \rangle\}_k$,
which implies $[O, H] = 0$.

Alternatively, $|\hat{O} \rangle \in \ker H$ implies
\[
|\hat{O^2} \rangle \in \ker H
\quad \iff \quad \ldots \quad \iff \quad
\partial_t \langle O^2 \rangle_{\psi(t)} = 0 \quad \forall \: |\psi(0)\rangle, \forall t \ge 0,
\]
where we have used
the above sequence of equivalences backwards.
Now, using John's original proof, we obtain $[O, H] = 0$.
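The first step of this alternative argument is immediate to verify numerically: squaring a diagonal observable squares its diagonal vector entrywise, and a vector that is constant on each component stays so after squaring. A sketch with a made-up two-component generator:

```python
import numpy as np

# Made-up two-component symmetric generator (illustrative rates)
H = np.array([[-1.,  1.,  0.,  0.],
              [ 1., -1.,  0.,  0.],
              [ 0.,  0., -2.,  2.],
              [ 0.,  0.,  2., -2.]])

O_hat = np.array([5., 5., 3., 3.])   # |O-hat> in ker H
O2_hat = O_hat ** 2                  # diagonal vector of O^2, taken entrywise
```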

As an additional observation, we find that the only observables $O$ that remain invariant under a symmetric $H$ are
of the very limited type described above: the observable must take the same value in every state of a connected component, as Tomi mentioned in post #2.