
The reverse of a continuous time Markov process

I wanted to break this up into parts, talking about one question at a time, and hopefully get others to discuss it with me. So here's perhaps the first question we would want to ask when considering the type of 'stochastic mechanics' we've been talking about on Azimuth.

When is the reverse of a continuous time stochastic process also a valid continuous time stochastic process?

Every directed graph has an adjacency matrix $A$. This is defined by fixing a labelling of the nodes of the graph and an ordering of that labelling; permuting the ordering lifts to a simultaneous permutation of the rows and columns of $A$.

To define a stochastic process on a directed graph (with non-negative edge weights), we consider the combinatorial Laplacian. We will consider 'walks' on this graph.

This is defined in terms of the generator

$$ L = A - D $$ where

$$ D_{jj} := \sum_i A_{ij} $$ and so the generator satisfies

$$ L_{ij} \geq 0 \quad (i \neq j) $$ and, for all $j$,

$$ \sum_i L_{ij} = 0 $$ The stochastic propagator is then defined for all times $t\in \mathbb{R}^+$ as

$$ U_{ij} = \exp(t L)_{ij} $$ and for $\psi(0)$ an initial vector of probabilities of finding a walker on each node at time $t=0$, we want to determine the probability of finding the walker at a node $k$ at a later time $t$. Here $\psi_j(0)$ is the $j$th entry of $\psi(0)$, namely the probability of finding the walker at the $j$th node, and $\sum_j \psi_j(0) = 1$.
The probability of finding the walker at node $k$ is then given by the $k$th component of $\psi(t)$, where

$$ \psi(t) = \exp(t L)\psi(0) $$ Now we want to reverse all of the arrows in our graph, which corresponds to taking the transpose of $A$. So the question is:

  • Provided that $L$ is a valid stochastic generator, when is $L^\top$ one as well?
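As a quick numerical illustration of the construction above, here is a minimal sketch in Python (assuming NumPy and SciPy are available; the 3-node graph and its rates are made up for illustration):

```python
import numpy as np
from scipy.linalg import expm

# Weighted adjacency matrix of a small directed graph (made-up rates).
# Convention matching the column-sum rule above: A[i, j] is the rate for j -> i.
A = np.array([[0.0, 2.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 3.0, 0.0]])

# D is diagonal with D[j, j] = sum_i A[i, j], so each column of L = A - D sums to zero.
D = np.diag(A.sum(axis=0))
L = A - D

assert np.allclose(L.sum(axis=0), 0)          # columns sum to zero
assert np.all(L - np.diag(np.diag(L)) >= 0)   # off-diagonal entries are nonnegative

# Evolve an initial probability vector: psi(t) = exp(tL) psi(0).
psi0 = np.array([1.0, 0.0, 0.0])
psi_t = expm(0.7 * L) @ psi0
print(psi_t, psi_t.sum())                 # entries nonnegative, total probability 1

# For this (unbalanced) graph, L.T is not a valid generator:
print(np.allclose(L.T.sum(axis=0), 0))    # False: columns of L.T don't sum to zero
```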

To understand this, we should return to the defining properties of what it means to be a stochastic generator.
We have that

$$ L^\top = A^\top - D $$ Here, since $D$ is diagonal, the transpose does nothing to it. So the question becomes: does $L^\top$ still satisfy the defining conditions, making it a valid stochastic generator? In general it does not. However, there are some cases where it does, and these are interesting. The elementary case is of course $A=A^\top$, but we can say a bit more than that. It turns out to be necessary and sufficient that, for all $j$,

$$ \sum_i A_{ij} = \sum_i A_{ji} $$ that is, at each node the total incoming edge weight equals the total outgoing edge weight. There are a few interesting cases of this. Here's one we talked a bit about before.

[Figure: a balanced graph]

These graphs are called balanced.
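Here is a small numerical check of the balance condition, again just a sketch with a made-up example: a directed 3-cycle with unit rates is balanced even though $A \neq A^\top$, and its generator's transpose is again infinitesimal stochastic.

```python
import numpy as np

def is_inf_stochastic(L):
    """Off-diagonal entries nonnegative and every column sums to zero."""
    off_diag = L - np.diag(np.diag(L))
    return bool(np.all(off_diag >= 0)) and bool(np.allclose(L.sum(axis=0), 0))

# Directed 3-cycle with unit rates: A[i, j] = 1 if there is an edge j -> i.
A = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]], dtype=float)

balanced = np.allclose(A.sum(axis=0), A.sum(axis=1))   # sum_i A_ij = sum_i A_ji
L = A - np.diag(A.sum(axis=0))

print(balanced)                 # True, even though A is not symmetric
print(is_inf_stochastic(L))     # True
print(is_inf_stochastic(L.T))   # True, as the balance condition predicts
```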

Comments

  • 1.

    This is interesting... make me think about it hard sometime!

    My student Blake Pollard is supposed to generalize a bunch of [Brendan Fong's work on electrical circuits](http://math.ucr.edu/home/baez/Brendan_Fong_Transfer_Report.pdf) to Markov processes - basically generalizing certain stuff from Dirichlet operators to infinitesimal stochastic operators. When we get into this, I'll become really really interested in your question here.

    For now I just suggest writing down the conditions that say $L$ is infinitesimal stochastic, writing down the conditions that say $L^\top$ is infinitesimal stochastic, and fiddling around with all these equations to find the nicest way to express all of them! I guess they say $L$ is a matrix where the off-diagonal terms are nonnegative and all rows sum to zero and all columns sum to zero. Anything more to say than that?

  • 2.

    I guess these are equivalent conditions for a square real matrix $L$:

    • $L$ and $L^\top$ are infinitesimal stochastic.

    • the off-diagonal terms of $L$ are nonnegative, and each row and column of $L$ sums to zero.

    • the off-diagonal terms of $L$ are nonnegative, each row sums to zero, and the sum of the off-diagonal entries of the $i$th row equals the sum of the off-diagonal entries of the $i$th column.

  • 3.
    edited January 2014

    These equivalent conditions are interesting and helpful for my understanding, thanks John.

    Now I think we are actually back right where you started. Consider the definition of a doubly stochastic propagator

    $$ \sum_i U_{ij} = \sum_j U_{ij} = 1 $$ We can, by Stone's theorem, consider this as a snapshot of a continuous time Markov process. The question is whether this condition on propagators implies that the generator generates a process for which it holds at all times.

    $$ \sum_i [e^{t L}]_{ij} = \sum_j [e^{t L}]_{ij} = 1 $$ Differentiating both conditions in $t$ gives

    $$ \partial_t \sum_i [e^{t L}]_{ij} = \partial_t \sum_j [e^{t L}]_{ij} = 0 $$ We are interested in $L$ which is constant in $t$, so it's enough to consider $t=0$, yielding

    $$ \sum_i L_{ij} = \sum_i L_{ji} = 0 $$ So we're back at the doubly stochastic condition, which in fact John mentioned in his blog [comments](http://johncarlosbaez.wordpress.com/2012/08/24/more-second-laws-of-thermodynamics/), as Ville Bergholm pointed out to me.
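    A quick numerical sanity check of this chain of reasoning, just a sketch: the generator below is the directed 3-cycle, whose rows and columns both sum to zero, and its propagator comes out doubly stochastic.

```python
import numpy as np
from scipy.linalg import expm

# A generator whose rows AND columns sum to zero (directed 3-cycle).
L = np.array([[-1.0,  0.0,  1.0],
              [ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0]])

U = expm(0.9 * L)
print(np.allclose(U.sum(axis=0), 1))   # columns of the propagator sum to 1
print(np.allclose(U.sum(axis=1), 1))   # rows sum to 1 too: U is doubly stochastic
```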

    > My student Blake Pollard is supposed to generalize a bunch of Brendan Fong’s work on electrical circuits to Markov processes - basically generalizing certain stuff from Dirichlet operators to infinitesimal stochastic operators.

    I think I understand something about what you're saying regarding how electrical networks enter the game. Just as has already been discussed [here](http://math.ucr.edu/home/baez/networks/networks_16.html) in the network theory series, graphs with positively weighted edges can be thought of as networks of conductances. These give Dirichlet operators, i.e. they generate both valid stochastic and valid quantum processes. Let's take one of these conductance networks.

    In this connected network of conductances, select two nodes and hold them at voltages $v_+$ and $v_-$. Under suitable conditions this gives rise to currents flowing on the edges of the graph. At each node the currents flowing in and out must sum to zero (counting the current through the voltage source as flowing along its own edge). So, orienting each edge in the direction of positive current flow, this scenario gives rise to a balanced graph. In other words, to find a balanced graph you need resistors and a voltage source, and then you need to know the current on each edge.

  • 4.
    edited January 2014

    Working out what I posted yesterday, I arrive at a few statements that relate electrical networks and balanced graphs. I'm not fully satisfied with this yet.

    Consider each edge of a connected planar graph to be a voltage source in series with a resistor.

    For such a contraption, a solution for the voltage $\psi_i$ at each node of the graph exists, and the currents flowing on the edges between the nodes can also be determined.

    We then define the current matrix $I$ as

    $$ I_{ij} := G_{ij}(\psi_i - \psi_j) $$ for all $\psi_i \geq \psi_j$ and zero otherwise.

    The reciprocal of resistance is conductance, and here we define $G_{ij}$ to be the conductance between nodes $i$ and $j$, which is symmetric. We note that $I_{ii} = 0$ independently of $G$. We can then promote $G$ to $\widetilde{G}$, a Dirichlet operator (adding the diagonal entries needed to make its rows and columns sum to zero), which appears in the equation defining $I$:

    $$ I_{ij} := \widetilde{G}_{ij}(\psi_i - \psi_j) $$ for all $\psi_i \geq \psi_j$ and zero otherwise.

    The current matrix $I$ represents a balanced graph and so gives rise to $\widetilde{I}$, a stochastic generator whose transpose is also a valid stochastic generator. Moreover, every planar graph with voltage sources in series with a resistor on its edges gives rise to the adjacency matrix of a balanced graph, formed by considering the current matrix. To flip the directions of all the currents, you just change the polarity of all the sources (that is, exchange plus and minus).
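    Here is a numerical sketch of this construction, under simplifying assumptions made purely for illustration: a single ideal voltage source held across two terminal nodes (rather than a source on every edge), plain resistors with made-up conductances, and the source current routed back along its own edge so that the loop closes.

```python
import numpy as np

# Symmetric conductance matrix of a small connected graph (made-up values).
G = np.array([[0.0, 1.0, 2.0, 0.0],
              [1.0, 0.0, 1.0, 1.0],
              [2.0, 1.0, 0.0, 2.0],
              [0.0, 1.0, 2.0, 0.0]])
n = G.shape[0]
Lap = np.diag(G.sum(axis=1)) - G          # weighted graph Laplacian

# Hold node 0 at v_+ and node 3 at v_-, then solve Kirchhoff's current law
# at the interior nodes for the remaining voltages psi.
v_plus, v_minus = 1.0, 0.0
psi = np.zeros(n)
psi[0], psi[3] = v_plus, v_minus
interior = [1, 2]
psi[interior] = np.linalg.solve(Lap[np.ix_(interior, interior)],
                                -Lap[np.ix_(interior, [0, 3])] @ psi[[0, 3]])

# Current matrix: I[i, j] = G[i, j] * (psi_i - psi_j) when psi_i >= psi_j, else 0.
diff = psi[:, None] - psi[None, :]
I = np.where(diff >= 0, G * diff, 0.0)

# Close the loop through the source: route the net current leaving node 0
# back into it from node 3.
I[3, 0] = I[0].sum() - I[:, 0].sum()

# Balance check: at every node, outgoing current equals incoming current.
print(np.allclose(I.sum(axis=1), I.sum(axis=0)))   # True
```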

  • 5.

    The network theory series has talked a lot about Dirichlet operators. We have now started talking about balanced graphs, in the context of reversing a Markov process. Dirichlet operators are interesting because they generate both a valid stochastic and a valid quantum process. How do these balanced graphs relate to what we already know? I think I'm starting to understand this a bit better, so I want to write down a few things and see what others think.

    Given a valid stochastic generator $L$, if $L=L^\top$ then $L$ is called a Dirichlet operator. Such an operator generates a continuous time stochastic process

    $$ e^{tL} $$ as well as a continuous time quantum process

    $$ e^{-itL} $$ Translating from one generator to another is done through multiplication by $-i$.

    > We call a stochastic process 'time-reversible' whenever
    > $$ \langle i| e^{tL}| j \rangle - \langle j| e^{tL}| i\rangle = 0 $$ for all times $t\in \mathbb{R}^+$ and all basis states $i$, $j$, defined as particle occupancy sites on the underlying network graph.

    Now you might expect that reversing time means we would want to change the sign of $t$ in one of the equations above. However, for stochastic processes this won't work. Our definition essentially asks whether there is any difference between the transition probabilities if you start at $i$ and move to $j$, or start at $j$ and move to $i$.
    Actually, we will explain why we call this time-reversibility of a stochastic process when we talk about quantum processes below.

    Returning to the definition, it's easy to see that if $L=L^\top$ then the above equations are always satisfied. In fact, that's all you get, since

    $$ \partial_t \left((e^{tL})_{ij} - (e^{tL})_{ji}\right) = 0 $$ must also hold for all $t$; setting $t=0$, we infer that $L=L^\top$.
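    A small numerical check of this (a sketch with made-up generators; the helper `reversible` just tests the defining condition at one sample time):

```python
import numpy as np
from scipy.linalg import expm

def reversible(L, t=0.8):
    """Check <i|e^{tL}|j> = <j|e^{tL}|i> for all i, j at one sample time."""
    U = expm(t * L)
    return bool(np.allclose(U, U.T))

# Symmetric (Dirichlet) generator: off-diagonal nonnegative, rows/columns sum to zero.
L_sym = np.array([[-3.0,  1.0,  2.0],
                  [ 1.0, -2.0,  1.0],
                  [ 2.0,  1.0, -3.0]])

# Balanced but non-symmetric generator (directed 3-cycle).
L_cyc = np.array([[-1.0,  0.0,  1.0],
                  [ 1.0, -1.0,  0.0],
                  [ 0.0,  1.0, -1.0]])

print(reversible(L_sym))   # True:  L = L^T, so e^{tL} is symmetric for every t
print(reversible(L_cyc))   # False: doubly stochastic, yet not time-reversible
```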

    Then I think that the following conditions are equivalent

    • A stochastic generator $L$ generates a time-reversible process
    • $L$ is a Dirichlet operator

    How does this all relate to quantum mechanics and electrical circuits? I guess in terms of quantum mechanics it's saying that's all you get:
    if your stochastic process is time-reversal symmetric, its generator also generates a quantum process, and that's the end of the story. Or is it?

    As far as circuits go, we already know that Dirichlet operators can be viewed as graphs whose edges are resistors.
    John in fact showed that if you assign a potential to each node, then the expected value of the Hamiltonian with respect to the node potential state is proportional to the electric power dissipated by such a network.

  • 6.

    To figure out what else we can say, let's consider quantum time-reversal symmetry.

    > Time reversibility of transition probabilities in quantum mechanics occurs whenever
    > $$ |\langle i| e^{-itL}| j \rangle|^2 - |\langle i| e^{itL}| j\rangle|^2 = 0 $$ again for all times $t\in \mathbb{R}^+$ and all basis states $i$, $j$, defined as particle occupancy sites on the underlying network graph.

    Here we want to know whether there is any difference between running the process forwards and backwards. This makes sense here since, in contrast to stochastic mechanics, in quantum theory we consider Hermitian operators $L=L^\dagger$, and changing the sign of $L$ does not change this property. So how does this relate to the definition of time-reversal symmetry for a stochastic process?

    Taking the complex conjugate of the amplitude in the second term of the definition yields $|\langle j| e^{-itL}| i\rangle|^2$ (using $L=L^\dagger$), which shows that the definition is equivalent to checking symmetry between the transitions $i \to j$ and $j \to i$ for a forward-run process. This is how the stochastic definition got its name.

    Whenever people talk about time-symmetry, they often employ a fun operator $T$. This operator is a little bit slippery.
    You can represent it, but the representation won't be linear. This operator is in fact anti-unitary, has the property that $T=T^\dagger$, conjugates an operator $O$ as $TOT = \overline O$ and acts as identity on all the real-valued basis states $i$, $j$.
    Because of this last property and the fact that $L = TLT$ we know that

    $$ |\langle i|T e^{-itL} T| j\rangle|^2 = |\langle i| e^{it\, TLT}| j\rangle|^2 = |\langle i| e^{itL}| j\rangle|^2 $$ and so in fact no real-valued generator can break transition-probability time-reversal symmetry in quantum processes. In particular, this is true for a Dirichlet operator $L$: both the stochastic and the quantum process generated by such an operator necessarily respect time-reversal symmetry.
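    As a quick numerical sketch of this conclusion, take a made-up real symmetric Hamiltonian (real together with Hermitian forces symmetric) and compare forward and backward transition probabilities:

```python
import numpy as np
from scipy.linalg import expm

# A real symmetric (hence Hermitian) Hamiltonian with made-up entries.
H = np.array([[ 0.0, 1.0,  0.5],
              [ 1.0, 2.0,  1.0],
              [ 0.5, 1.0, -1.0]])

t = 1.3
forward  = np.abs(expm(-1j * t * H))**2   # |<i| e^{-itH} |j>|^2
backward = np.abs(expm(+1j * t * H))**2   # |<i| e^{+itH} |j>|^2

print(np.allclose(forward, backward))     # True: a real H cannot break this symmetry
```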

    We already mentioned that you can't go changing the sign of $t$ when you want to generate a stochastic process. We also mentioned that both $L$ and $-L$ generate a valid quantum process, so stochastic processes are much more restrictive in this sense. Although we can't change the sign of $t$ unless $L$ is really boring and just all zeros, we can ask another question: whether both $L$ and $L^\top$ generate valid stochastic processes. John arrived at a few equivalent conditions on these sorts of generators.
    We also talked about how they are related to bistochastic operators.

    Restating what we know already, given a Dirichlet operator $\widetilde{G}$, we can view $\widetilde{G}$ as a conductance matrix, assign voltages to the nodes, and from that arrive at the current matrix

    $$ I_{ij} := \widetilde{G}_{ij}(\psi_i - \psi_j) $$ Here $\widetilde{G}$ is symmetric, but $V_{ij}:=(\psi_i - \psi_j)$ is not, and so neither is $I$. The component-wise equation above could also be written with the element-wise Hadamard product $\star$, meaning $I = G\star V$.

    What we noticed was that if we set all the negative entries of $I$ to zero, we obtain the adjacency matrix of a balanced graph, and hence (after adding the appropriate diagonal) the sort of valid stochastic generator whose transpose is also a valid generator.
    In quantum mechanics, however, $I$ itself is allowed.

    Since $I= - I^\top$ is antisymmetric, $e^{t I}$ is an orthogonal matrix, and orthogonal matrices are unitary, so $I$ is a valid generator of a quantum process. This time we don't even need to attach the imaginary $i$.

    One reason I like this is that we typically think of things the other way around. What I mean is that starting from a valid stochastic generator, there are a number of ways to recover a quantum generator. If $M$ is a valid stochastic generator then it's easy to show that

    $$ e^{-it S + t A} $$ is a valid quantum propagator (it is unitary), where $2 S := M + M^\top$ and $2A := M - M^\top$.
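    A quick check of this decomposition, again a sketch using the directed 3-cycle as the stochastic generator $M$: the operator $-iS + A$ is anti-Hermitian, so the resulting propagator is unitary.

```python
import numpy as np
from scipy.linalg import expm

# A valid (non-symmetric) stochastic generator M: the directed 3-cycle again.
M = np.array([[-1.0,  0.0,  1.0],
              [ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0]])

S = (M + M.T) / 2           # symmetric part
A = (M - M.T) / 2           # antisymmetric part

K = -1j * S + A             # the quantum generator
U = expm(0.6 * K)           # e^{-itS + tA} at t = 0.6

print(np.allclose(K.conj().T, -K))             # True: -iS + A is anti-Hermitian
print(np.allclose(U @ U.conj().T, np.eye(3)))  # True: the propagator is unitary
```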

    The bad part about this is that $S$ and $A$ are defined in terms of projections. I always think about going from stochastic generators to quantum generators as doing a projection. I also think about adding an $i$. Now we're going the other way: starting with the current matrix $I$, we take the non-negative part and arrive at a stochastic generator. Here we don't need the imaginary numbers.
    The other point, about which there is a lot more to say, is that orthogonal quantum propagators can break time-reversal symmetry.

    This is not as cut and dried as in the stochastic case, where it is easy to show that a generator with $L\neq L^\top$ violates time-reversal symmetry. In quantum mechanics the effect is more subtle, and there is a lot more to be said about it. In particular, the fact that the non-negative projection of $I$, elevated to a stochastic generator, breaks time-reversal symmetry of the Markov process does not mean that $I$ will break time-reversal symmetry in its quantum process.
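    Here is a sketch of that last point with a made-up antisymmetric current matrix: the propagator $e^{tI}$ is orthogonal (hence unitary), yet its forward and time-reversed transition probabilities differ.

```python
import numpy as np
from scipy.linalg import expm

# An antisymmetric "current matrix" with made-up values, so e^{tI} is orthogonal.
I = np.array([[ 0.0,  1.0, -2.0],
              [-1.0,  0.0,  3.0],
              [ 2.0, -3.0,  0.0]])

t = 0.5
U_fwd = expm(+t * I)    # forward propagator
U_bwd = expm(-t * I)    # time-reversed propagator (equals U_fwd.T here)

print(np.allclose(U_fwd @ U_fwd.T, np.eye(3)))          # True: orthogonal, hence unitary
print(np.allclose(np.abs(U_fwd)**2, np.abs(U_bwd)**2))  # False: transition probabilities
                                                        # differ, so T-symmetry is broken
```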

    For now I'm just glad that we understand the following equation a bit better.

    $$I = G\star V$$

    • $I$ is a valid quantum generator that might break time-reversal symmetry.
    • $G$ is a Dirichlet operator that generates time-symmetric stochastic and quantum processes.
    • Taking the non-negative entries of $I$ and setting the rest to zero gives rise to a balanced graph, which generates a stochastic process both forwards and in reverse.

    It's really just Ohm's law written in a fancy way.

  • 7.
    edited February 2014

    Thanks for all these comments! I definitely want to think about these things. I have a bunch of half-formed ideas, which might work better in a conversation than in print.

    A good thing to keep in mind is that if $L$ is a Dirichlet operator then the constant function on our graph, with $\psi_i = 1$ for all nodes $i$, is the "ground state": it has $L \psi = 0$. Also, for any stochastic state $\phi$, we have

    $$ \lim_{t \to +\infty} \exp(t L) \phi = \psi $$ (normalizing $\psi$ so that its entries sum to one). This $\psi$ is the state of maximum entropy, entropy increases with time, and every state approaches $\psi$ as we evolve it in time.
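    A quick numerical illustration of this, just a sketch with a made-up symmetric generator on a connected graph, evaluating the limit at a large finite time:

```python
import numpy as np
from scipy.linalg import expm

# A Dirichlet (symmetric infinitesimal stochastic) generator on a connected graph.
L = np.array([[-3.0,  1.0,  2.0],
              [ 1.0, -2.0,  1.0],
              [ 2.0,  1.0, -3.0]])

phi = np.array([1.0, 0.0, 0.0])      # an arbitrary stochastic state
phi_late = expm(50.0 * L) @ phi

print(phi_late)   # approximately [1/3, 1/3, 1/3]: the uniform, maximum-entropy state
```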

    If we have an infinitesimal stochastic operator $L$ that's not self-adjoint, the constant function won't have these properties. But some other function will! So, everything is fundamentally similar, just a bit twisted around. The blog article you helped write on quantum walks explains more precisely what I mean by "twisted around".

    One thing I mean by "twisted around" is that $L$ is still self-adjoint, but with respect to a different inner product!!!

  • 8.
    edited February 2014

    Interesting read. I have a few things to throw into the mix:

    1. You can show the equivalence of $L = L^T$ and $\langle i | \exp ( L t) |j \rangle - \langle j | \exp ( L t) |i \rangle = 0$ for all $i,j,t$ by expanding the exponentials as a power series in t and demanding that the coefficient of each power of t is zero (equivalent to saying that since it is zero at all times every derivative at $t=0$ is zero). The first order term gives $L = L^T$ and the $n$-th order term gives $L^n = (L^n)^T$, which is true if $L = L^T$. So this makes $L = L^T$ a necessary and sufficient condition for $\langle i | \exp ( L t) |j \rangle - \langle j | \exp ( L t) |i \rangle = 0$ for all $i,j,t$.

    2. You have started connecting this to a definition of quantum time-reversal symmetry in terms of probabilities. Perhaps we should first connect it to a definition based on amplitudes, which is simpler. This definition is $\langle i | \exp ( i L t) |j \rangle - \langle j | \exp (i Lt) |i \rangle = 0$ for all $i,j,t$ and by exactly the same process as above can be shown to be equivalent to $L$ having real elements. The conditions that $L$ is infinitesimal stochastic and Hermitian naturally imply $L = L^T$ and $L$ real. Thus a Dirichlet operator, which is the kind of operator that can be used to generate both types of evolution, necessarily gives time-symmetry in both contexts (and of course also then for probabilities, since this follows from amplitudes).

  • 9.

    Back to what Tomi wrote about amplitude symmetry.

    The following conditions are equivalent

    1. $[H, K] = 0$ where antiunitary $K$ is complex conjugation in the same basis as $H$ is written in

    2. The amplitude current $j_{mn}(U) = U_{mn} - U_{nm}$ vanishes for all $m, n$

    3. $H = H^\top$

    4. $H$ takes only real values

  • 10.

    If we exchange $\top$ with $\dagger$ in the definition, here's what happens.

    $$ \jmath_{mn}(U) := U_{mn} - (U^\dagger)_{mn} $$ defined for all $m, n$.

    A quantum process is then amplitude symmetric iff $\jmath_{mn}(U) = 0$ for all $m, n$, which implies that $U = U^\dagger$. If $U$ is generated by a continuous time quantum process, then $e^{-itH} = e^{itH}$, which holds for all $t$ only when $H = 0$.

    For a given unitary propagator, having $U = U^\dagger$ (so that $U^2 = 1$) is equivalent to having a projector $P = P^2$; this bijection follows from the invertible map

    $$ \omega^{-1}(U) = \frac{1}{2}(1 + U) =: P $$ So we can understand the definition in terms of unitary gates (propagators) which are self-inverse. We get one of these every time we have a projector, and conversely.
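    A small numerical sketch of this correspondence, using a made-up rank-one projector:

```python
import numpy as np

# A rank-one projector onto a (made-up) unit vector v.
v = np.array([1.0, 2.0, 2.0]) / 3.0
P = np.outer(v, v)
assert np.allclose(P @ P, P)

# The corresponding self-inverse unitary gate: U = 2P - 1 inverts P = (1 + U)/2.
U = 2 * P - np.eye(3)
print(np.allclose(U @ U, np.eye(3)))        # True: U is its own inverse
print(np.allclose(U, U.conj().T))           # True: U = U^dagger

# Going back recovers the projector.
print(np.allclose((np.eye(3) + U) / 2, P))  # True
```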

  • 11.
    Note there's another related thread:

    • [Total number operator conservation in chemical reaction networks](http://forum.azimuthproject.org/discussion/1223/total-number-operator-conservation-in-chemical-reaction-networks/)