
Lecture 25 - Chapter 2: Reaction Networks


We can use the math we've been developing to answer the question "given what I have, is it possible to get what I want?" For example, consider a set \(S\) of different resources:

  • crust
  • lemon
  • (one stick of) butter
  • (one cup of) sugar
  • egg
  • yolk
  • white
  • (lemon) filling
  • meringue
  • (unbaked) lemon pie
  • (unbaked) meringue pie

Then there is a monoid called \(\mathbb{N}[S]\) consisting of combinations of these things, like

$$ 2 [\textrm{crust}] + 3[\textrm{egg}] + 2[\textrm{filling}] .$$ We write the operation in this monoid as \(+\), and it's commutative. For example:

$$ 2 [\textrm{egg}] + [\textrm{yolk}] = [\textrm{yolk}] + 2 [\textrm{egg}] .$$ The coefficients in front of the various things say how many copies of them we have, and addition works with these in the obvious way; for example:

$$ (2 [\textrm{egg}] + [\textrm{yolk}]) + (3 [\textrm{egg}] + 3 [\textrm{crust}]) = 5[\textrm{egg}] + [\textrm{yolk}] + 3 [\textrm{crust}] . $$ Also, this monoid has an element \(0\) which serves as the identity for addition: it corresponds to nothing. For example:

$$ [\textrm{egg}] + 0 = [\textrm{egg}] .$$ This monoid is called \(\mathbb{N}[S]\) because it consists of linear combinations of things in \(S\) with natural number coefficients. So, we are not allowing ourselves to talk about \( \frac{1}{2} [\textrm{stick of butter}] \) or \( -3 [\textrm{egg}] \). Why not? Simple: while those generalizations are important, they're not part of the game we're playing right now.
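If you want to experiment with this, Python's collections.Counter happens to model \(\mathbb{N}[S]\) directly: keys are elements of \(S\), values are natural-number coefficients, and Counter addition is the monoid operation. A minimal sketch (my own, not part of the lecture):

    from collections import Counter

    # An element of N[S]: natural-number coefficients on resources from S.
    a = Counter({"egg": 2, "yolk": 1})    # 2[egg] + [yolk]
    b = Counter({"egg": 3, "crust": 3})   # 3[egg] + 3[crust]

    print(a + b)           # Counter({'egg': 5, 'crust': 3, 'yolk': 1})
    print(a + b == b + a)  # True: the operation is commutative
    print(a + Counter())   # the empty Counter plays the role of 0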

Next we need to decide which combinations of things we can turn into which other combinations of things. As explained in Lecture 19, we can specify this using a Petri net:

[Image: Petri net for the lemon meringue pie recipe]

Another way to present the same information is to write down a reaction network. This is a bunch of reactions going between elements of \(\mathbb{N}[S]\), like this:

$$ [\textrm{egg}] \to [\textrm{yolk}] + [\textrm{white}] $$ $$ [\textrm{lemon}] + [\text{butter}] + [\text{sugar}] + [\text{yolk}] \to [\textrm{filling}] $$ $$ [\textrm{crust}] + [\textrm{filling}] \to [\textrm{lemon pie}] $$ $$ [\textrm{white}] + [\textrm{sugar}] \to [\textrm{meringue}] $$ $$ [\textrm{lemon pie}] + [\textrm{meringue}] \to [\textrm{meringue pie}] $$ We can also think of these reactions as inequalities, which we use to make our monoid \(\mathbb{N}[S]\) into a monoidal preorder! I've been using the convention that says \(x \le y\) if we can make \(x\) from \(y\). The book uses the opposite convention. Both have their advantages, and I keep wanting to flip-flop, like right now - but all that really matters is to stay consistent. With my convention, the reaction network above gives these inequalities:

$$ [\textrm{yolk}] + [\textrm{white}] \le [\textrm{egg}] $$ $$ [\textrm{filling}] \le [\textrm{lemon}] + [\text{butter}] + [\text{sugar}] + [\text{yolk}] $$ $$ [\textrm{lemon pie}] \le [\textrm{crust}] + [\textrm{filling}] $$ $$ [\textrm{meringue}] \le [\textrm{white}] + [\textrm{sugar}] $$ $$ [\textrm{meringue pie}] \le [\textrm{lemon pie}] + [\textrm{meringue}] $$ We can then reason with these inequalities using the laws of a symmetric monoidal preorder - in fact a commutative one! So, we can use these 6 laws:

  • reflexivity: \(x \le x\)
  • transitivity: \(x \le y\) and \(y \le z\) imply \(x \le z\)
  • associativity: \( x + (y + z) = (x + y) + z \)
  • the left and right unit laws: \( 0 + x = x = x + 0 \)
  • commutativity: \(x + y = y + x \)
  • the monoidal preorder law: \(x \le x'\) and \(y \le y'\) imply \(x + y \le x' + y'\).
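To make these laws concrete, here is one possible encoding of the reaction network in Python, continuing the Counter sketch above (the names REACTIONS and step are my own, not from the lecture). Applying a reaction inside a larger pile of resources is exactly reflexivity plus the monoidal preorder law:

    from collections import Counter

    # Each reaction is a pair (inputs, outputs), read left to right
    # as in the reaction network above.
    REACTIONS = [
        (Counter({"egg": 1}), Counter({"yolk": 1, "white": 1})),
        (Counter({"lemon": 1, "butter": 1, "sugar": 1, "yolk": 1}),
         Counter({"filling": 1})),
        (Counter({"crust": 1, "filling": 1}), Counter({"lemon pie": 1})),
        (Counter({"white": 1, "sugar": 1}), Counter({"meringue": 1})),
        (Counter({"lemon pie": 1, "meringue": 1}),
         Counter({"meringue pie": 1})),
    ]

    def step(state, reaction):
        """Apply one reaction to state if its inputs are available.

        Leaving the unused resources untouched is the monoidal preorder
        law: from y <= x we get y + c <= x + c for any context c."""
        inputs, outputs = reaction
        if all(state[k] >= v for k, v in inputs.items()):
            return state - inputs + outputs
        return None

    # One step: an egg splits, with a cup of sugar carried along unchanged.
    print(step(Counter({"egg": 1, "sugar": 1}), REACTIONS[0]))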

So, since we have

$$ [\textrm{yolk}] + [\textrm{white}] \le [\textrm{egg}] $$ and

$$ [\textrm{meringue}] \le [\textrm{white}] + [\textrm{sugar}] $$ we can add a cup of sugar to both sides of the first inequality, using reflexivity and the monoidal preorder law:

$$ [\textrm{yolk}] + [\textrm{white}] + [\textrm{sugar}] \le [\textrm{egg}] + [\textrm{sugar}] $$ and add a yolk to the second inequality, in the same way:

$$ [\textrm{yolk}] + [\textrm{meringue}] \le [\textrm{yolk}] + [\textrm{white}] + [\textrm{sugar}] $$ Then we can string together the resulting inequalities using transitivity:

$$ [\textrm{yolk}] + [\textrm{meringue}] \le [\textrm{egg}] + [\textrm{sugar}] $$ and so on.
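Chains of basic inequalities glued together by transitivity can also be searched for mechanically. Continuing the sketch above (again my own encoding; reachable is a hypothetical helper, not from the lecture), a breadth-first search answers the opening question "given what I have, is it possible to get what I want?":

    from collections import Counter

    def reachable(start, goal, max_steps=10):
        """Can goal be derived from start using REACTIONS and step above?

        Each rewrite is one basic inequality; chaining them is transitivity."""
        frontier, seen = [start], {frozenset(start.items())}
        for _ in range(max_steps):
            next_frontier = []
            for state in frontier:
                if state == goal:
                    return True
                for reaction in REACTIONS:
                    new = step(state, reaction)
                    if new is not None and frozenset(new.items()) not in seen:
                        seen.add(frozenset(new.items()))
                        next_frontier.append(new)
            frontier = next_frontier
        return False

    # [yolk] + [meringue] <= [egg] + [sugar], as derived above:
    print(reachable(Counter({"egg": 1, "sugar": 1}),
                    Counter({"yolk": 1, "meringue": 1})))   # True

The same search confirms the pie inequality stated next.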

Indeed, such reasoning lets you prove it's possible to make a lemon meringue pie from suitable ingredients:

$$ [\textrm{meringue pie}] \le [\textrm{crust}] + [\textrm{lemon}] + 2[\textrm{sugar}] + [\textrm{butter}] + [\textrm{egg}] . $$ We can also prove such facts using "wiring diagrams", as mentioned in Lecture 18:

[Image: wiring diagram proving the lemon meringue pie inequality]

Here each basic inequality is drawn as a little box, and we combine them to prove more complicated inequalities. It's sort of an obvious idea. But how does it work, exactly? How, exactly, do our 6 rules for a commutative monoidal preorder turn into tricks we can play with wiring diagrams? I'll let you think about this:

Puzzle 78. When we reason using transitivity:

\(x \le y\) and \(y \le z\) imply \(x \le z\)

what does this look like in terms of wiring diagrams?

Puzzle 79. When we reason using the monoidal preorder law:

\(x \le x'\) and \(y \le y'\) imply \(x + y \le x' + y'\)

what does this look like in terms of wiring diagrams?

We could go through all 6 laws this way, but those two are the most charismatic in terms of pictures. You should read Section 2.2.2 of the book to learn more about wiring diagrams and the rules for working with them.

To read other lectures go here.

Comments

  • 1.

    I want to mention one thing, to soothe any puzzlement you may be feeling about the phrase "commutative monoidal preorder". In Lecture 22, I started by defining "symmetric monoidal preorders". These are monoidal preorders where for any elements \(x\) and \(y\) we have

    $$ x \otimes y \le y \otimes x \textrm{ and } y \otimes x \le x \otimes y .$$ In a preorder, two elements \(a\) and \(b\) are said to be isomorphic if \(a \le b\) and \(b \le a\), and we write this as \(a \cong b\). So in a symmetric monoidal preorder we have

    $$ x \otimes y \cong y \otimes x .$$ But a poset is exactly the same as a preorder where any pair of isomorphic elements are equal! So, a symmetric monoidal poset is actually commutative:

    $$ x \otimes y = y \otimes x . $$ I listed a bunch of "commutative monoidal posets" in Lecture 23. As we saw, they're very common and important.

    But in today's lecture we saw a method of building a bunch of monoidal preorders that are not posets but are still commutative. We can get one by taking the commutative monoid \(\mathbb{N}[S]\) and asserting any desired set of inequalities... along with all the other inequalities you can derive using the 6 rules I listed.

    That's why today I was talking about "commutative monoidal preorders". A simple example would be to take our set of things to be

    $$ S = \{ \textrm{H}, \textrm{O}, \textrm{H}_2\textrm{O} \} $$ and put in the inequalities

    $$ 2 \textrm{H} + \textrm{O} \le \textrm{H}_2\textrm{O} $$ $$ \textrm{H}_2\textrm{O} \le 2 \textrm{H} + \textrm{O} $$ which describe a "reversible reaction" turning hydrogen and oxygen into water but also vice versa. As we saw in the discussion of Puzzle 70, this gives a commutative monoidal preorder that's not a poset, since we have

    $$ 2 \textrm{H} + \textrm{O} \cong \textrm{H}_2\textrm{O} $$ but not

    $$ 2 \textrm{H} + \textrm{O} = \textrm{H}_2\textrm{O} . $$ On the other hand, today's lemon meringue pie example was actually a poset, since there were no reversible reactions. You can't combine a yolk and a white to make an egg, for example!
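    In the Counter sketch from the lecture above, this distinction is easy to see: one rewrite step in each direction converts either side into the other, so the two sides are isomorphic in the preorder, yet they are different elements of \(\mathbb{N}[S]\). A tiny illustration (my own encoding, not from the lecture):

        from collections import Counter

        gases = Counter({"H": 2, "O": 1})   # 2H + O
        water = Counter({"H2O": 1})         # H2O

        # The forward reaction 2H + O -> H2O turns gases into water...
        print(gases - Counter({"H": 2, "O": 1}) + Counter({"H2O": 1}) == water)  # True
        # ...and the reverse reaction turns water back into gases, so each
        # is <= the other.  But they are not equal as elements of N[S]:
        print(gases == water)   # False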

  • 2.

    Puzzle 78. When we reason using transitivity:

    \(x \le y\) and \(y \le z\) imply \(x \le z\)

    what does this look like in terms of wiring diagrams?

    It looks like horizontal composition. That is, the fact that one process can come after another (edit: but only if the first process's output is the second process's input).

    Puzzle 79. When we reason using the monoidal preorder law:

    \(x \le x'\) and \(y \le y'\) imply \(x + y \le x' + y'\)

    what does this look like in terms of wiring diagrams?

    It looks like vertical composition. That is, two processes can be done in parallel.

  • 3.

    Keith - exactly right!

  • 4.

    I have a question regarding notation. In the equation below, is it fair to say that the sign "\(+\)" has two meanings?

    $$ (2 [\textrm{egg}] + [\textrm{yolk}]) + (3 [\textrm{egg}] + 3 [\textrm{pie crust}]) = 5[\textrm{egg}] + [\textrm{yolk}] + 3 [\textrm{pie crust}] $$

    For example, if we were to be more explicit we could re-write the equation such as:

    $$ (2 [\textrm{egg}] +_1 [\textrm{yolk}]) +_2 (3 [\textrm{egg}] +_1 3 [\textrm{pie crust}]) = 5[\textrm{egg}] +_1 [\textrm{yolk}] +_1 3 [\textrm{pie crust}] $$

    • \(+_1\) is a way of combining elements from \(S\) (?)
    • \(+_2 : \mathbb{N}[S] \times \mathbb{N}[S] \rightarrow \mathbb{N}[S]\) is the monoidal operation.
  • 5.

    Btw, does the MathJax on this forum allow for commuting diagrams, or do I have to make a picture of a commuting diagram and then upload it?

  • 6.

    Dan, kind of, yes -- but there is no defined way to add "egg" and "yolk" together while staying in the set \(S\) of our base reagents. It might be more fair to say that your \(+_1\) is a mere syntactic way to hold two kinds of reagents at once. In other words, we can represent \(2[\text{egg}] + [\text{yolk}]\) more syntactically as \((2, 1)\), where the \([\text{yolk}]\) and \([\text{egg}]\) coordinates are given by projections from \(\mathbb{N}[S]\) to \(\mathbb{N}\).

    This is almost exactly the same as how we write \(4\hat{x} + 5\hat{y}\) in linear algebra: it's a convenient notation for the integer-valued vector \((4, 5)\) while reminding us of our basis. The only concern against treating \(+_1\) as the same thing as \(+_2\) is that \(\text{yolk} +_1 \text{yolk}\) doesn't make sense, in an irritatingly technical sense. But we can do the obvious lifting of single components \(\text{yolk}\) to complete tuples \(1[\text{yolk}] + 0[\text{egg}] + 0[\text{pie crust}] + \cdots\), so it's a point that usually goes without mention. With this convention, both \(+_1\) and \(+_2\) are the same.

    EDIT: Ah, another way to look at it. The "addition" in \(1 + i\) isn't really something that's actionable, is it? There's no simplification you can perform. It's a fully-reduced entity in its own right. Similarly, \(\mathbb{C}\) is known to have a representation in terms of \(\mathbb{R}^2\), via the \(\Re\) and \(\Im\) projections: \(\Re(1 + i) = 1\), and \(\Im(1 + i) = 1\).

  • 7.

    Dan wrote:

    I have a question regarding notation. In the equation below, is it fair to say that the sign "\(+\)" has two meanings?

    $$ (2 [\textrm{egg}] + [\textrm{yolk}]) + (3 [\textrm{egg}] + 3 [\textrm{pie crust}]) = 5[\textrm{egg}] + [\textrm{yolk}] + 3 [\textrm{pie crust}] $$

    You can say yes, or you can say no: I believe both viewpoints can be developed in a consistent way. If you treat them as different you may need to distinguish between

    $$ ([\textrm{egg}]) + ([\textrm{yolk}]) $$ and

    $$ ([\textrm{egg}] + [\textrm{yolk}]) .$$ Here's how I actually think: \(\mathbb{N}[S]\) is the free commutative monoid on the set \(S\). There are many ways to construct this thing, but they're all canonically isomorphic so it doesn't matter which one we use.

    (Sorry, that's how category theorists actually think: we are very flexible while being very precise about the ways in which we're being flexible: the words canonically isomorphic summarize a lot of detail that I'm reluctant to explain right now.)

    Here's one way to construct \(\mathbb{N}[S]\). We take elements of \(S\), create all possible finite sums of these elements, which are purely formal expressions of this sort:

    $$ s_1 + \cdots + s_n $$ where \(s_1, \dots, s_n \in S\), and then we mod out by an equivalence relation which imposes the commutative monoid rules. For example, we decree that

    $$ s_1 + s_2 + s_3 = s_2 + s_1 + s_3 .$$ This gives a commutative monoid \(\mathbb{N}[S]\). Then, we allow ourselves to abbreviate an \(n\)-fold sum of copies of \(s \in S\) as \(n s\), e.g.

    $$ [\textrm{egg}] + [\textrm{egg}] + [\textrm{egg}] = 3[\textrm{egg}] . $$ If you don't like this, here's another equivalent way to think about it if \(S\) has some finite number of elements, say \(n\). In this case, after we choose an ordering for \(S\), we can think of \(\mathbb{N}[S]\) as the set of \(n\)-tuples of natural numbers. So, for example, if

    $$ S = \{ [\textrm{egg}], [\textrm{yolk}] \} $$ we write elements of \(\mathbb{N}[S]\) as pairs of natural numbers. The element \( (2,3) \) in here corresponds to the element

    $$ 2 [\textrm{egg}] + 3[\textrm{yolk}] $$ in my previous, isomorphic, description of \(\mathbb{N}[S]\). Similarly, the equation

    $$ (2,3) + (1,1) = (3,4) $$ corresponds to the equation

    $$ (2 [\textrm{egg}] + 3[\textrm{yolk}] ) + ([\textrm{egg}] + [\textrm{yolk}]) = 3 [\textrm{egg}] + 4[\textrm{yolk}] .$$
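    Both descriptions are easy to move between in code. A sketch of the isomorphism John describes, assuming a chosen ordering BASIS of \(S\) (all names here are my own):

        from collections import Counter

        BASIS = ["egg", "yolk"]   # a chosen ordering of S

        def to_tuple(x):
            """Formal sum (Counter) -> coordinate vector."""
            return tuple(x[s] for s in BASIS)

        def from_tuple(t):
            """Coordinate vector -> formal sum (Counter)."""
            return Counter({s: n for s, n in zip(BASIS, t) if n})

        a = Counter({"egg": 2, "yolk": 3})   # corresponds to (2, 3)
        b = Counter({"egg": 1, "yolk": 1})   # corresponds to (1, 1)

        print(to_tuple(a + b))              # (3, 4)
        print(from_tuple((3, 4)) == a + b)  # True: addition agrees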

  • 8.

    Jonathan wrote:

    Similarly, \(\mathbb{C}\) is known to have a representation in terms of \(\mathbb{R}^2\), via the \(\Re\) and \(\Im\) projections: \(\Re(1 + i) = 1\), and \(\Im(1 + i) = 1\).

    Right, it's just like that. The only difference is that here you are using real linear combinations of \(1\) and \(i\), while in my lecture I was using natural number linear combinations of elements of any set \(S\).

  • 9.

    Dan: here's another way to answer your question, which may be more to your taste. You suggest that

    • \(+_1\) is a way of combining elements from \(S\) (?)
    • \(+_2 : \mathbb{N}[S] \times \mathbb{N}[S] \rightarrow \mathbb{N}[S]\) is the monoidal operation.

    However, you don't say what the first operation maps from and more importantly what it maps to. If you try to figure this out, you will probably be led to a framework where all you care about is \(+_2\). Give it a try! And notice that we can (and should) think of \(S\) as a subset of \(\mathbb{N}[S]\). So, there's ultimately no point in having a separate operation that only lets you combine elements of \(S\).

  • 10.

    My Eckmann–Hilton senses are tingling...

  • 11.

    Puzzle 78

    [Image: wiring diagram for transitivity]

    Puzzle 79

    [Image: wiring diagram for the monoidal preorder law]

    Here are all 6 laws and a picture showing the difference between preorder diagrams and wiring diagrams:

    [Image: wiring-diagram summary of all 6 laws]

    Let me know if there are any mistakes or bad assumptions.

  • 12.

    Very nice Michael.

    Though, your associativity diagram looks like a braiding.

  • 13.

    Nice pictures, Michael!

    Keith is right: associativity should go from \((x \otimes y) \otimes z\) to \(x \otimes (y \otimes z)\) and also back the other way... but a nice feature of wiring diagrams is that we can leave associativity implicit! Since there are no parentheses in the wiring diagrams, we can write \(x\), \(y\) and \(z\) as parallel wires next to each other without \(\otimes\) symbols, and this stands for both \((x \otimes y) \otimes z\) and \(x \otimes (y \otimes z)\).

    Another smaller issue: "communitivity" should be "commutativity".

    There are probably some other ways to polish and perfect these pictures, but I'll let other people have fun figuring out the very best ways to draw them.

  • 14.

    While Michael was drawing his picture, I've been drawing one of my own...

    I edited my lecture to point out that reaction networks are just another way to provide the same information as given by Petri nets! The reaction network I gave:

    $$ [\textrm{egg}] \to [\textrm{yolk}] + [\textrm{white}] $$ $$ [\textrm{lemon}] + [\text{butter}] + [\text{sugar}] + [\text{yolk}] \to [\textrm{filling}] $$ $$ [\textrm{crust}] + [\textrm{filling}] \to [\textrm{lemon pie}] $$ $$ [\textrm{white}] + [\textrm{sugar}] \to [\textrm{meringue}] $$ $$ [\textrm{lemon pie}] + [\textrm{meringue}] \to [\textrm{meringue pie}] $$ does the same job as this Petri net:

    [Image: Petri net for the lemon meringue pie recipe]

    It's just a matter of taste which one we want to use... though there are some theorems that are nice to state in terms of Petri nets and some in terms of reaction networks. You can read a lot more about both in my book:

    • John Baez and Jacob Biamonte, Quantum Techniques for Stochastic Mechanics, Section 25.1: The Reachability Problem, World Scientific Press, 2018, https://arxiv.org/abs/1209.3632

  • 15.

    I was going to see if I could program myself a string diagram drawing program in Racket, but then came to the realization that keeping track of the string ends is a huge pain.

  • 16.

    In Lecture 22, John gave this example of a Petri net to illustrate the reaction $$ 2[\text{H}] + [\text{O}] \to [\text{H}_2\text{O}] . $$

    [Image: Petri net for the reaction 2H + O → H2O]

    It uses two arrows to indicate that you need 2 Hydrogens for the reaction. How would you represent this in a wiring diagram? Two wires labeled with the same resource?

  • 17.

    Puzzles 78 and 79 along with Michael's diagrams remind me of black boxes or a mathematical understanding of squinting. For example, if I know that \(x \leq y\) and \(y \leq z\), but I don't care about the details of how these reactions happen, then transitivity tells me that I can just focus on the relation \(x \leq z\).

  • 18.

    @Sophie Libkind

    If I understand correctly, yes actually.

    The left-hand side of the above diagram is given by $$ \mathrm{O} \otimes 2\mathrm{H} $$ which is just shorthand for

    $$ \mathrm{O} \otimes \mathrm{H} \otimes \mathrm{H}. $$

  • 19.

    Keith, just to clarify by "yes actually" you mean "yes actually you just draw two wires labeled with the same resource"?

  • 20.

    Yes.

    Unless there is an idempotence relation \( X \otimes X = X\), in which case the wires naturally collapse into a single wire, one can place multiple copies of a wire next to each other.

    So far, there is nothing in the rules John has given that indicate you can't do so.

    In fact, if you think about it, you can answer #9 by thinking of addition in this context as combining multiple (non-idempotent) wires that have the same label.

  • 21.

    Sophie wrote:

    It uses two arrows to indicate that you need 2 Hydrogens for the reaction. How would you represent this in a wiring diagram? Two wires labeled with the same resource?

    Yes, exactly. It would be nice for someone to draw an example or two where this kind of thing happens.

    Note that a Petri net specifies a way of making a commutative monoid \(\mathbb{N}[S]\) into a commutative monoidal preorder, while a wiring diagram specifies a proof of a given inequality in that commutative monoidal preorder.

    (There's a lot more to say about this, but that's where we are now.)
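    One way to see the "proof" point concretely: the search sketched after the lecture's list of laws can be made to return the sequence of reactions it used, and that sequence is a (linearized) wiring diagram. A sketch, reusing the hypothetical REACTIONS encoding from earlier:

        from collections import Counter

        def derivation(start, goal, max_steps=10):
            """Return a list of reaction indices proving goal <= start, or None.

            A wiring diagram is a two-dimensional picture of such a proof."""
            frontier = [(start, [])]
            seen = {frozenset(start.items())}
            for _ in range(max_steps):
                next_frontier = []
                for state, trace in frontier:
                    if state == goal:
                        return trace
                    for i, (inputs, outputs) in enumerate(REACTIONS):
                        if all(state[k] >= v for k, v in inputs.items()):
                            new = state - inputs + outputs
                            if frozenset(new.items()) not in seen:
                                seen.add(frozenset(new.items()))
                                next_frontier.append((new, trace + [i]))
                frontier = next_frontier
            return None

        print(derivation(Counter({"crust": 1, "lemon": 1, "sugar": 2,
                                  "butter": 1, "egg": 1}),
                         Counter({"meringue pie": 1})))   # [0, 1, 2, 3, 4]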

  • 22.

    Let me see if I get what you're saying about the difference between a Petri net and a wiring diagram!

    Let's stick with the example where our resources are

    • \(H\)
    • \(O\)
    • \(H_2O\)

    The Petri net

    image
    makes \(\mathbb N[S]\) into a commutative monoidal preorder by specifying the relation $$ (a + 2n)[H] + (b + n)[O] + c [H_2O] \leq a [H] + b[O] + (c + n)[H_2O] $$ for all \(a,b,c,n \in \mathbb N\) (note: this uses the book's convention, where \(x \le y\) means \(y\) can be made from \(x\), the opposite of the lecture's convention above).

    The wiring diagram with 2 hydrogen wires and 1 oxygen wire going into a box and 1 \(H_2O\) wire exiting the box would be a proof of the inequality $$ 2[H] + [O] \leq [H_2O].$$ Two of these wiring diagrams set up in parallel would be a proof of the inequality $$4[H] + 2[O] \leq 2 [H_2O].$$ And the wiring diagram with 2 hydrogen wires and 1 oxygen wire going into a box, 1 \(H_2O\) wire exiting the box, and 1 oxygen wire running uninterrupted in parallel would be a proof of the inequality $$2[H] + 2[O] \leq [O] + [H_2O].$$

  • 23.

    John and Keith

    Thanks for the correction. I just realized that the associativity diagram I drew is just an extension of commutativity, as shown below:

    [Image: associativity diagram built out of wire crossings]

    So it isn't the real associativity diagram, since you can have associativity without commutativity. The simplest way to draw any of these laws is just wires = wires with labels, but that is the same as writing them out symbolically, which is fine, though I think it loses a lot of information about what's really going on.

    Below is a better diagram of associativity that is not dependent on commutativity:

    [Image: associativity diagram not dependent on commutativity]

    John wrote:

    but a nice feature of wiring diagrams it that we can leave associativity implicit!

    It might be because of poetic justice, but I think this may not be true? When you draw, there is an implicit order to the placement of arrows, circles or whatever it is you are drawing. So the following two pictures are actually not equal until you give it that rule!

    [Image: two wiring diagrams differing only in the placement of wires]

    Once you define commutativity, then you can say that order doesn't matter:

    [Image: commutativity as an isomorphism between wire orderings]

    So when you first start off with a wiring diagram, before you define any of the rules, there is an implicit order to the placement of the wires, and then rules are added to allow permutations. I guess it really depends on what rules you want to start off with, but I thought that by doing this you can show more with the diagrams within the context of monoidal preorders. But then again, I only know part of the picture, so please fill me in.

  • 24.

    I love your diagrams, Michael -- I think they make it much more clear how exactly wiring diagrams capture the kinds of things we can do with monoidal preorders!

    It might be because of poetic justice, but I think this may not be true? When you draw, there is an implicit order to the placement of arrows, circle or whatever it is you are drawing. So the following two pictures are actually not equal until you give it that rule!

    I think that, since associativity does not change the order of wires at all, this does not matter. Your example shows reordering the red arrow with respect to the other arrows, but this is something commutativity does, not associativity. As John notes, the very idea of a wiring diagram embeds associativity into its core -- so much so that although you might expect there to be something explicit to say about associativity, there just isn't anything much at all!

    For instance, consider your new associativity diagram:

    [Image: Michael's proposed associativity diagram]

    This doesn't actually capture associativity either, because associativity doesn't say anything about the kinds of relationships in the preorder. In other words, you're involving at least three, probably four(!) "reactions" in this diagram, when the associativity rule \((x \otimes y) \otimes z = x \otimes (y \otimes z)\) involves no reactions at all. Truly, associativity just says there is no distinction to be made by how you cluster your wires pictorially.

  • 25.

    Jonathan

    I'm not sure if everything I did here is legal but if you look at the progression below you can literally see associativity spring up from reflexivity and monoidal preorder laws:

    [Image: step-by-step derivation of associativity from reflexivity and the monoidal preorder law]

    So you start with \(x \otimes y \otimes z\). Then apply reflexivity to \(x\) and \(y\) to add in the "reactions." Then you use the monoidal preorder law to contract the preorder in parallel. Then you apply reflexivity and the monoidal preorder law again to \(y\) and \(z\), leaving you with the associativity identity. You can do this starting with \(y\) and \(z\) instead of \(x\) and \(y\), and the two diagrams will be the same, since we haven't done anything to the three wires. The parentheses are kind of a way to give vertical ordering.

    In this proof it is kind of hard to see how the "reactions" come into play when using regular symbols, but you can still do it, since you can do it with pictures. Start with:

    $$x \leq x$$ $$y \leq y$$ $$z \leq z$$ Next add the first two and then the last: $$(x \otimes y) \otimes z \leq (x \otimes y) \otimes z$$ Then add the last two and then the first: $$x \otimes (y \otimes z) \leq x \otimes (y \otimes z)$$ Then notice that adding all three gives \(x \otimes y \otimes z\), since that is what we started with. So:

    $$(x \otimes y) \otimes z = x \otimes y \otimes z = x \otimes (y \otimes z)$$ The way I drew it in comment 23 is just shorthand for the last step in the progression, since you can always write \(x \otimes y\) as two wires or one wire. But going with the flow of all the pictures, I think this last picture portrays it best.

  • 26.

    Ahh, I see what you mean. It isn't clear from the diagram that the \(\le\) nodes are specific reactions (the reflexivity reactions), not just any reactions. (Though, I'm not sure equality of diagrams is obvious enough to be used here, since that amounts to an equality of proofs, and it isn't clear that these proofs are equal at all.)

    Actually, I would argue that what you've produced is a proof that \((x \otimes y) \otimes z \le x \otimes (y \otimes z)\). (Or \(\ge\), depending on the convention.) You can get the other direction by doing the analogous proof with \(y \otimes z \le y \otimes z\) at the bottom. Of course, you've implicitly used associativity to begin with to construct your proof.

    If we didn't have associativity, we would need to explicitly bind arrows together, and reaction nodes \(\le\) could only relate one input wire to one output wire (where those wires were pre/post-bound separately from the relation). It's only because we don't care at all about how wires are grouped that we can "implicitly" dovetail wires at the reaction sites.

  • 27.

    Jonathan

    If we didn't have associativity, we would need to explicitly bind arrows together, and reaction nodes ≤ could only relate one input wire to one output wire (where those wires were pre/post-bound separately from the relation).

    You are probably right and I am probably missing something, but I don't quite understand what you have said above. More specifically, I am not getting why you wouldn't be able to bind arrows using the monoidal preorder law without associativity since monoidal preorder laws only require two wires.

    Sorry for being slow... I know that this is probably not important but I would like to straighten the logic behind drawing diagrams so that I don't make faulty assumptions in the future.

  • 28.

    It's not that you wouldn't be able to bind wires -- it's that you'd be forced to explicitly show the binding together of wires with \(\otimes\), before it ever reaches a reaction node. Right now we're able to define reaction nodes as taking some number of wires in and putting some number of wires out -- but if you look closely at how \(\le\) works, reactions \(x \le y\) relate precisely two objects, \(x\) and \(y\). It's only because of associativity that we can unambiguously represent a product object by the ordered collection of its constituents, and thus write a reaction node as operating over an ordered collection of wires.

    Without associativity, we would require an extra kind of node, specifically for \(\otimes\), to either bind together two wires into one or split apart one wire into two, so that we could explicitly describe how wires are bundled together before connecting the single combined wire to a reaction node.
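    A toy way to see this (my own encoding, not Jonathan's): without associativity a tensor expression is a binary tree, and the two bracketings are genuinely different objects. The flat list of wires a diagram works with is only well defined because associativity lets us forget the tree structure:

        # Parenthesized tensor products as nested pairs (binary trees):
        left  = (("x", "y"), "z")    # (x ⊗ y) ⊗ z
        right = ("x", ("y", "z"))    # x ⊗ (y ⊗ z)
        print(left == right)         # False: as trees, the bracketings differ

        def flatten(t):
            """Forget the parenthesization -- what associativity licenses."""
            if isinstance(t, tuple):
                return [leaf for branch in t for leaf in flatten(branch)]
            return [t]

        print(flatten(left) == flatten(right))   # True: same ordered wires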

  • 29.

    I've thrown together a (rather poor, compared to yours) diagram illustrating the distinction I'm trying to make. On the right side is the situation as we have it now, with a reaction \(w \le x \otimes y \otimes z\) assuming associativity.

    [Image: reaction nodes drawn with and without associativity]

    If we don't have associativity, then there are two possible reactions we might have meant. Let's suppose we want \(w \le (x \otimes y) \otimes z\) to be our reaction. We need to be able to distinguish between \((x \otimes y) \otimes z\) and \(x \otimes (y \otimes z)\), since the latter isn't a valid input to our reaction. So we have to represent the input differently; in this case, with explicit \(\otimes\) combiner nodes and reducing all reactions to single-input single-output representations.

    In other words, if you replaced \(\le\) with \(\otimes\) in this diagram (and added the missing output edges), it would be completely accurate:

    [Image: Michael's associativity diagram]

    But since we bake associativity into our rules for working with string diagrams (namely, allowing multi-input multi-output relations), we never actually need the \(\otimes\) node. It's an entirely redundant symbol, since it's been incorporated into the representation of our reactions.

  • 30.

    Jonathan

    Ahhh. I got it now, LOL. So you need the associativity rule in order to define a monoid, and therefore even before you get a monoidal preorder you need to have associativity baked into the definition. The nice thing about wiring diagrams is that they do exactly that.

    Anyways, sorry for asking such a basic question and thanks a bunch for explaining it for me.

  • 31.

    Although wiring diagrams already assume associativity, I think there's a very informal way to force the concept to appear. If we want to show

    $$ (x \otimes y) \otimes z = x \otimes (y \otimes z), $$ we could use positional grouping to represent the idea.

    x   y               z
         is equal to
    x               y   z

    Associativity lets us slide the wires around, while commutativity lets us slide a wire past another wire. The first line informally suggests that you have to send \(x\) and \(y\) into a box together, but associativity lets you slide the variables into different bundles and send \(y\) and \(z\) into a box instead.

    To make this formal would involve cluttering up the diagram as you folks were discussing above, but the idea that associativity has to do with spacing in the diagram seems to give a partial intuition.

  • 32.

    Why not take the opposite approach?

    Say that two wires are associated if they are far apart.

    That seems perfectly fine to me.

    In other words, associating something depends on the context.
