
Lecture 61 - Chapter 4: Closed Monoidal Posets

Posets help us talk about resources: the partial ordering \(x \le y\) says when you can use one resource to get another. Monoidal posets let us combine or 'add' two resources \(x\) and \(y\) to get a resource \(x \otimes y\). But closed monoidal posets go further and let us 'subtract' resources! And the main reason for subtracting resources is to answer questions like

If you have \(x\), what must you combine it with to get \(y\)?

When dealing with money the answer to this question is called \(y - x\), but now we will call it \(x \multimap y\). Remember, we say a monoidal poset is closed if for any pair of elements \(x\) and \(y\) there's an element \(x \multimap y\) that obeys the law

$$ x \otimes a \le y \text{ if and only if } a \le x \multimap y .$$ This says roughly "if \(x\) combined with \(a\) is no more than \(y\), then \(a\) is no more than what you need to combine with \(x\) to get \(y\)". Which sounds complicated, but makes sense on reflection.
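
Here is a quick brute-force check of this law on two small examples, just as a Python sketch. The first is the booleans with 'and' as \(\otimes\), so \(\multimap\) is implication; the second, a standard example not spelled out in this lecture, is nonnegative 'costs' with the *reversed* order and addition as \(\otimes\), where \(x \multimap y = \max(y - x, 0)\):

```python
# Brute-force check of:  x ⊗ a ≤ y  iff  a ≤ (x ⊸ y).
# Both examples are only illustrations, not part of the lecture's proofs.
from itertools import product

# Example 1: booleans ordered False ≤ True, with ⊗ = "and" and I = True.
# Then x ⊸ y is ordinary implication, (not x) or y.
bools = [False, True]

def imp(x, y):
    return (not x) or y

assert all(
    ((x and a) <= y) == (a <= imp(x, y))
    for x, a, y in product(bools, repeat=3)
)

# Example 2: "costs" in [0, ∞) with the REVERSED order, ⊗ = + and I = 0.
# Then x ⊸ y = max(y - x, 0): how much more you need on top of x to reach y.
costs = [i / 2 for i in range(11)]        # sample grid 0, 0.5, ..., 5

def leq(a, b):                            # the reversed order: "a ≤ b" means a ≥ b
    return a >= b

def hom(x, y):                            # x ⊸ y
    return max(y - x, 0)

assert all(
    leq(x + a, y) == leq(a, hom(x, y))
    for x, a, y in product(costs, repeat=3)
)

print("the closedness law holds on both samples")
```

In the second example \(x \multimap y\) is exactly the subtraction \(y - x\), cut off at zero, matching the money analogy above.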

One reason for using funny symbols like \(\otimes\) and \(\multimap\) is that they don't have strongly pre-established meanings. Sometimes they mean addition and subtraction, but sometimes they will mean multiplication and division.

For example, we can take any set and make it into a poset by saying \(x \le y\) if and only if \( x = y\): this is called a discrete poset. If our set is a monoid we can make it into a monoidal poset where \( x \otimes y \) is defined to be \(x y\). And if our set is a group we can make it into a closed monoidal poset where \(x \multimap y\) is \(y\) divided by \(x\), or more precisely \(x^{-1} y\), since we have

$$ x a = y \text{ if and only if } a = x^{-1} y .$$ I said last time that if \(\mathcal{V}\) is a closed monoidal poset we can make it into a \(\mathcal{V}\)-enriched category where:

  • the objects are just elements of \(\mathcal{V}\)

  • for any \(x,y \in \mathcal{V}\) we have \(\mathcal{V}(x,y) = x \multimap y\).

This is the real reason for the word 'closed': \(\mathcal{V}\) becomes a category enriched over itself, so it's completely self-contained, like a hermit who lives in a cave and never talks to anyone but himself.
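
Returning to the group example above, here's a tiny brute-force check, using the cyclic group \(\mathbb{Z}/5\) purely for illustration, that a group really is a closed monoidal poset when viewed as a discrete poset:

```python
# Brute-force check that a group, viewed as a discrete poset, is closed,
# using Z/5 written additively (so x ⊸ y = x⁻¹y becomes (y - x) mod 5).
from itertools import product

n = 5
G = range(n)

def otimes(x, y):
    return (x + y) % n

def hom(x, y):                 # x ⊸ y = x⁻¹ y
    return (y - x) % n

# In a discrete poset ≤ is equality, so the law reads:  x ⊗ a = y  iff  a = x ⊸ y.
assert all(
    (otimes(x, a) == y) == (a == hom(x, y))
    for x, a, y in product(G, repeat=3)
)
print("Z/5 is a closed monoidal (discrete) poset")
```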

Let's show that we get a \(\mathcal{V}\)-enriched category this way.

Theorem. If \(\mathcal{V}\) is a closed monoidal poset, then \(\mathcal{V}\) becomes a \(\mathcal{V}\)-enriched category as above.

Proof. If you look back at the definition of enriched category you'll see we need to check two things:

a) For any object \(x\) of \(\mathcal{V}\) we need to show

$$ I\leq\mathcal{V}(x,x) .$$ b) For any objects \(x,y,z\) of \(\mathcal{V}\) we need to show

$$ \mathcal{V}(x,y)\otimes\mathcal{V}(y,z)\leq\mathcal{V}(x,z). $$ I bet these are follow-your-nose arguments that we can do without really thinking. For a) we need to show

$$ I \leq x \multimap x $$ but by the definition of 'closed' this is true if and only if

$$ x \otimes I \le x $$ and this is true, since in fact \(x \otimes I = x\). For b) we need to show

$$ (x \multimap y) \otimes (y \multimap z) \leq x \multimap z $$ but by the definition of closed this is true if and only if

$$ x \otimes (x \multimap y) \otimes (y \multimap z) \le z.$$ Oh-oh, this looks complicated! But don't worry, we just need this lemma:

Lemma. In a closed monoidal poset we have \(a \otimes (a \multimap b) \le b\).

Then we just use this lemma twice:

$$ x \otimes (x \multimap y) \otimes (y \multimap z) \le y \otimes (y \multimap z) \le z .$$ Voilà!

How do we prove the lemma? That's easy: the definition of closed monoidal poset says

$$ a \otimes (a \multimap b) \le b \text{ if and only if } a \multimap b \le a \multimap b $$ but the right-hand statement is true, so the left-hand one is too! \(\qquad \blacksquare \)
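
To see the theorem in action on the smallest interesting example, here's a brute-force check, again just a sketch, that the booleans with 'and' as \(\otimes\) satisfy conditions a) and b) when we set \(\mathcal{V}(x,y) = x \multimap y\):

```python
# Brute-force check of the enrichment laws for the booleans enriched over
# themselves: V(x, y) = x ⊸ y = "x implies y", with ⊗ = "and" and I = True.
from itertools import product

V = [False, True]
I = True

def otimes(x, y):
    return x and y

def hom(x, y):
    return (not x) or y

# a)  I ≤ V(x, x)
assert all(I <= hom(x, x) for x in V)

# b)  V(x, y) ⊗ V(y, z) ≤ V(x, z)
assert all(
    otimes(hom(x, y), hom(y, z)) <= hom(x, z)
    for x, y, z in product(V, repeat=3)
)
print("both enrichment laws hold, as the theorem predicts")
```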

Okay, let me leave you with some puzzles:

Puzzle 193. We know that for any set \(X\) the power set \(P(X)\) becomes a monoidal poset with \(S \le T\) meaning \(S \subseteq T\), with product \(S \otimes T = S \cap T\), and with the identity element \(I = X\). Is \(P(X)\) closed? If so, what is \(S \multimap T\)?

Puzzle 194. From Lecture 11 we know that for any set \(X\) the set of partitions of \(X\), \(\mathcal{E}(X)\), becomes a poset with \(P \le Q\) meaning that \(P\) is finer than \(Q\). It's a monoidal poset with product given by the meet \(P \wedge Q\). Is this monoidal poset closed? How about if we use the join \(P \vee Q\)?

Puzzle 195. Show that in any closed monoidal poset we have

$$ I \multimap x = x $$ for every element \(x\).

In terms of resources this says that \(I\) acts like 'nothing', since it says

If you have \(I\), what do you need to combine it with to get \(x\)? \(\; \; x\)!

Of course \(I \otimes x = x = x \otimes I \) also says that \(I\) acts like 'nothing'.

Puzzle 196. Show that in any closed monoidal poset we have

$$ x \multimap y = \bigvee \lbrace a : \; x \otimes a \le y \rbrace . $$

Comments

  • 1.

    Puzzle 193. We know that for any set \(X\) the power set \(P(X)\) becomes a monoidal poset with \(S \le T\) meaning \(S \subseteq T\), with product \(S \otimes T = S \cap T\), and with the identity element \(I = X\). Is \(P(X)\) closed? If so, what is \(S \multimap T\)?

    In the case of \(\otimes = \cap\), we have \(A \multimap B = (X \backslash A) \cup B\). Here \(\backslash\) denotes set difference.

    Proof. In Lecture 6 we saw that \((A \otimes -) \dashv (A \multimap -)\) if and only if

    $$ A \multimap B = \bigcup \{Y \subseteq X : \; A \cap Y \subseteq B \} $$ Since \(A \cap ((X \backslash A) \cup B) = A \cap B \subseteq B\), it must be that \((X \backslash A) \cup B \subseteq \bigcup \{Y \subseteq X : \; A \cap Y \subseteq B \}\).

    All that is left to show is \((X \backslash A) \cup B \supseteq \bigcup \{Y \subseteq X : \; A \cap Y \subseteq B \}\). It suffices to show the contrapositive: if \(x \not\in (X \backslash A) \cup B\) then \(x \not\in \bigcup \{Y \subseteq X : \; A \cap Y \subseteq B \}\). Assume \(x \not\in (X \backslash A) \cup B\). It must be that \(x \in A\) and \(x \not\in B.\) This means that if \(x \in Y\) then \(A \cap Y \not\subseteq B,\) hence \(x \not\in\bigcup \{Y \subseteq X : \; A \cap Y \subseteq B \}\).\(\qquad \blacksquare \)
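
    For good measure, here is a brute-force verification of this on a small four-element set (a Python sketch, independent of the proof above):

```python
# Brute-force check on X = {1,2,3,4} that A ⊸ B = (X \ A) ∪ B satisfies
#     A ∩ C ⊆ B   iff   C ⊆ A ⊸ B
# for all subsets A, B, C of X.
from itertools import chain, combinations

X = {1, 2, 3, 4}

def subsets(s):
    s = list(s)
    return [set(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def hom(A, B):
    return (X - A) | B

assert all(
    ((A & C) <= B) == (C <= hom(A, B))      # ≤ on Python sets is "is a subset of"
    for A in subsets(X) for B in subsets(X) for C in subsets(X)
)
print("P(X) with intersection is closed, with A ⊸ B = (X \\ A) ∪ B")
```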

  • 2.

    Puzzle 195. Show that in any closed monoidal poset we have \( I \multimap x = x \) for every element \(x\).

    By the anti-symmetry property of posets we need to show:

    1. \( x \le I \multimap x \)
    2. \( I \multimap x \le x\).

    For the first statement, by reflexivity we have that

    $$ x \le x $$ and since \(I\) is the monoidal unit, \(I \otimes x = x\), so this can be rewritten as

    $$ I \otimes x \le x $$ and by the definition of closed we get

    $$ x \le I \multimap x .$$ For the second statement, by reflexivity we have that

    $$ I \multimap x \le I \multimap x $$ and we apply the definition of closed

    $$ I \otimes (I \multimap x) \le x $$ and use the fact that \(I\) is the monoidal unit to get

    $$ I \multimap x \le x .$$
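
    As a toy check, in the boolean example from the lecture this just says that "True implies \(x\)" is \(x\):

```python
# Check  I ⊸ x = x  in the boolean example (⊗ = "and", I = True, ⊸ = implication).
I = True

def hom(x, y):
    return (not x) or y

assert all(hom(I, x) == x for x in [False, True])
```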

  • 3.

    but by the definition of closed this is true if and only if

    $$ x \otimes (x \multimap y) \otimes (y \multimap z) \le z.$$

    It's interesting to note that by the definition of being closed, this is the same as \((x \multimap y) \otimes (y \multimap z) \le (x \multimap z)\), which is exactly the logical rule of hypothetical syllogism. (Alternatively, it's the same shape as composing two arrows.)

  • 4.

    It's interesting to note that by the definition of being closed, this is the same as \((x \multimap y) \otimes (y \multimap z) \le (x \multimap z)\), which is exactly the logical rule of hypothetical syllogism. (Alternatively, it's the same shape as composing two arrows.)

    Yup, there's an analogue of the Curry-Howard-Lambek correspondence for linear logic and symmetric closed monoidal categories.

    John Baez and Mike Stay go into this in Physics, topology, logic and computation: a Rosetta Stone (2011). Another nice discussion is in Abramsky and Tzevelekos' Introduction to Categories and Categorical Logic (2011).

  • 5.

    Jonathan wrote:

    but by the definition of closed this is true if and only if

    $$ x \otimes (x \multimap y) \otimes (y \multimap z) \le z.$$

    It's interesting to note that by the definition of being closed, this is the same as \((x \multimap y) \otimes (y \multimap z) \le (x \multimap z)\)....

    That's not so surprising, since that's what I was in the midst of proving. I was trying to prove

    $$ (x \multimap y) \otimes (y \multimap z) \le (x \multimap z) $$ and the only way to make progress was to note that this is equivalent to

    $$ x \otimes (x \multimap y) \otimes (y \multimap z) \le z.$$

    ... which is exactly the logical rule of hypothetical syllogism. (Alternatively, it's the same shape as composing two arrows.)

    Yes, that's more interesting!

    $$ (x \multimap y) \otimes (y \multimap z) \le (x \multimap z) $$ says that we can compose morphisms in a closed monoidal category viewed as enriched over itself... and it's also a rule of logic: "if x implies y and y implies z then x implies z". That's because most systems of logic are closed monoidal categories!

    Category theory unifies everything!

    I hadn't known this rule was called hypothetical syllogism.

    The Lemma in this lecture is a version of modus ponens.

  • 6.

    Puzzle MD1. Expanding on what Dan Piponi discusses in his blog post Profunctors in Haskell (2011), we have that \((- \multimap -) : \mathcal{V} \nrightarrow \mathcal{V} \) is a \(\mathcal{V}\)-profunctor. If I have another \(\mathcal{V}\)-profunctor \(F : A \nrightarrow \mathcal{V}\), what is \((- \multimap -) \circ F\)?

  • 7.

    Answer to Puzzle 196: Let's call our monoidal poset \(\mathcal{P}\). For every \(a\in\mathcal{P}\) we have \(a\leq x\multimap y\Leftrightarrow x\otimes a\leq y\). If \(a,b\in\mathcal{P}\) and \(x\otimes a\leq y\) and \(x\otimes b\leq y\), then \(a\leq x\multimap y\) and \(b\leq x\multimap y\), so \(x\multimap y\) is an upper bound in \(\mathcal{P}\) for \(a\) and \(b\). By the same argument, \(x\multimap y\) must be an upper bound for \(\{a\in\mathcal{P}:x\otimes a\leq y\}\), i.e., \[\bigvee \lbrace a : \; x \otimes a \le y \rbrace\leq x\multimap y.\] Since \(x\multimap y\leq x\multimap y\), we have \(x\otimes(x\multimap y)\leq y\), so \(x\multimap y\) is an element of the set of which it is an upper bound. Therefore it is the least upper bound, i.e., \[x\multimap y=\bigvee \lbrace a : \; x \otimes a \le y \rbrace.\]

    Answer to Puzzle 194: \(S\multimap T=\bigvee \lbrace A : \; S \otimes A \le T \rbrace=\bigvee \lbrace A : \; S \cup A \subseteq T\rbrace\), so this is the set with the most elements such that its union with \(S\) is \(T\). This set is \(S^c\cap T\) (where \(S^c\) denotes the set complement operation), since the addition of any more elements of \(X\) would make its union with \(S\) a proper superset of \(T\).

  • 8.

    Is \(\leq\) properly defined in Puzzle 193? I think we should have \(I\leq S\) for all \(S\in P(X)\), but it is not true that \(X\subseteq S\) (unless \(X=S\)). If, instead, we use the reverse inclusion, I think \(P(X)\) is not a closed monoidal poset.

  • 9.

    John wrote:

    That's not so surprising, since that's what I was in the midst of proving.

    Whoops. My main point was the relation to hypothetical syllogism, but I clearly had a dumb moment on the way there.

    David wrote:

    Answer to Puzzle 194: \(S\multimap T=\bigvee \lbrace A : \; S \otimes A \le T \rbrace=\bigvee \lbrace A : \; S \cup A \subseteq T\rbrace\), so this is the set with the most elements such that its union with \(S\) is \(T\). This set is \(S^c\cap T\) (where \(S^c\) denotes the set complement operation), since the addition of any more elements of \(X\) would make its union with \(S\) a proper superset of \(T\).

    Couldn't it be \(S \cup (S^c \cap T)\) as well, since union is idempotent? If \(S \cup A \subseteq T\) is true, then \(S \cup (S \cup A) \subseteq T\) is also, and vice versa.

    EDIT: And this just simplifies to \(S \cup T\).

  • 10.

    David wrote:

    Is \(\leq\) properly defined in Puzzle 193? I think we should have \(I\leq S\) for all \(S\in P(X)\), but it is not true that \(X\subseteq S\) (unless \(X=S\)).

    Aargh, you're right: \( (P(X), \subseteq, \cup, \emptyset)\) is not a monoidal poset.

    We could use reverse inclusion instead, but that's too obviously just the same puzzle as the one before, re-expressed using complements. I'm gonna replace this puzzle with one about an old friend, the poset of partitions.

  • 11.

    Jonathan wrote in #9

    David wrote:

    Answer to Puzzle 194: \(S\multimap T=\bigvee \lbrace A : \; S \otimes A \le T \rbrace=\bigvee \lbrace A : \; S \cup A \subseteq T\rbrace\), so this is the set with the most elements such that its union with \(S\) is \(T\). This set is \(S^c\cap T\) (where \(S^c\) denotes the set complement operation), since the addition of any more elements of \(X\) would make its union with \(S\) a proper superset of \(T\).

    Couldn't it be \(S \cup (S^c \cap T)\) as well, since union is idempotent? If \(S \cup A \subseteq T\) is true, then \(S \cup (S \cup A) \subseteq T\) is also, and vice versa.

    EDIT: And this just simplifies to \(S \cup T\).

    Yes, you're right, but it is not true that \(S\cup T\subseteq T\) (unless \(S\subseteq T\)). In fact, even if we try with \(S\multimap T=\emptyset\), we would have to have \(S\subseteq T\) in order for \(S\cup\emptyset\subseteq T\) to hold. So, I was wrong, this monoidal poset is not closed since \(S\multimap T\) does not exist, in general.

  • 12.

    Puzzle 194. From Lecture 11 we know that for any set \(X\) the set of partitions of \(X\), \(\mathcal{E}(X)\), becomes a poset with \(P \le Q\) meaning that \(P\) is finer than \(Q\). It's a monoidal poset with product given by the meet \(P \wedge Q\). Is this monoidal poset closed? How about if we use the join \(P \vee Q\)?

    If \(P\multimap Q\) needs to be the coarsest partition that distinguishes everything distinguished in Q that is not distinguished in P, I think that there is no such thing in general, which means this monoidal poset is not closed. For instance, if \(P=((1,2),(3,4))\) and \(Q=((1),(2),(3),(4))\), then \(P\wedge((1,4),(2,3))=P\wedge((1,3),(2,4)) = Q\), but \(((1,4),(2,3))\vee((1,3),(2,4))=I\).

  • 13.

    Puzzle 194. From Lecture 11 we know that for any set \(X\) the set of partitions of \(X\), \(\mathcal{E}(X)\), becomes a poset with \(P \le Q\) meaning that \(P\) is finer than \(Q\). It's a monoidal poset with product given by the meet \(P \wedge Q\). Is this monoidal poset closed? How about if we use the join \(P \vee Q\)?

    Just as \( (P(X), \subseteq, \cup, \emptyset)\) is not a monoidal poset, we have \( (\mathcal{E}(X), \le, \vee, D)\) is not a monoidal poset for the same reasons.

    On the other hand, I agree with Yoav: \(P \multimap Q\) does not exist in general for partitions.

    Recall from the adjoint functor theorem for posets that \(A \multimap -\) exists and \((A \wedge -) \dashv (A \multimap -)\) if and only if \((A\wedge -) \) preserves all joins.

    This means, in particular, that if \((A \multimap -)\) exists then \(A \wedge (B \vee C) = (A \wedge B) \vee (A \wedge C)\).

    But if we let

    $$ \begin{align} A & = \lbrace \lbrace 1,4 \rbrace, \lbrace 2,3 \rbrace \rbrace \\ B & = \lbrace \lbrace 1,2 \rbrace, \lbrace 3,4 \rbrace \rbrace \\ C & = \lbrace \lbrace 1\rbrace, \lbrace 2,3 \rbrace, \lbrace 4 \rbrace \rbrace \\ \end{align} $$ then we have \(A \wedge (B \vee C) = A\) but \((A \wedge B) \vee (A \wedge C) = C\) .

    Hence \((A \multimap -)\) does not exist in general for partitions.
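
    For anyone who wants to check this particular counterexample by machine, here is a small Python sketch: the meet is computed as the common refinement and the join by repeatedly merging overlapping blocks.

```python
# Verify the counterexample: for the partitions A, B, C below we get
# A ∧ (B ∨ C) = A  but  (A ∧ B) ∨ (A ∧ C) = C, so ∧ does not preserve joins.
from itertools import product

def meet(P, Q):
    """Coarsest common refinement: all nonempty intersections of blocks."""
    return frozenset(b & c for b, c in product(P, Q) if b & c)

def join(P, Q):
    """Finest common coarsening: merge overlapping blocks until stable."""
    blocks = list(P | Q)
    merged = True
    while merged:
        merged = False
        for i in range(len(blocks)):
            for j in range(i + 1, len(blocks)):
                if blocks[i] & blocks[j]:
                    blocks[i] = blocks[i] | blocks[j]
                    del blocks[j]
                    merged = True
                    break
            if merged:
                break
    return frozenset(blocks)

def partition(*blocks):
    return frozenset(frozenset(b) for b in blocks)

A = partition({1, 4}, {2, 3})
B = partition({1, 2}, {3, 4})
C = partition({1}, {2, 3}, {4})

assert meet(A, join(B, C)) == A
assert join(meet(A, B), meet(A, C)) == C
print("∧ does not distribute over ∨, so (A ∧ -) has no right adjoint here")
```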

  • 14.

    There is indeed an implication operator on partitions.

    David Ellerman defines it as follows:

    \[ [[x] \multimap [y]] : = \mathrm{int}([x]^c \cup [y]) \]

  • 15.

    re Puzzle 196.

    Show that in any closed monoidal poset we have

    $$ x \multimap y = \bigvee \lbrace a : \; x \otimes a \le y \rbrace $$

    This just drops out of the rule for computing right adjoints, but there's no harm proving it directly.

    Set \(A = \lbrace a : \; x \otimes a \le y \rbrace\).

    Then \(a\in A \implies x \otimes a \le y \implies a \le x \multimap y\), so \(x \multimap y\) is an upper bound of \(A\).

    But our lemma tells us \(x \otimes (x \multimap y) \le y\), i.e. \((x \multimap y)\in A\).

    Hence \(x \multimap y\) is the least upper bound of \(A\).
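
    As a sanity check in the boolean example, where the join over a set of booleans is "or", this reads:

```python
# Check  x ⊸ y = ⋁{a : x ⊗ a ≤ y}  in the boolean example (⊗ = "and").
from itertools import product

bools = [False, True]

def hom(x, y):
    return (not x) or y

assert all(
    hom(x, y) == any(a for a in bools if (x and a) <= y)
    for x, y in product(bools, repeat=2)
)
```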

  • 16.

    There is indeed an implication operator on partitions.

    David Ellerman defines it as such:

    \[ [[x] \Rightarrow [y]] : = \mathrm{int}([x]^c \cup [y]) \]

    (I changed your post to reflect Ellerman's original notation)

    While Ellerman does define a conditional that way, \((x \wedge - )\) does not appear to be the left adjoint of \((x \Rightarrow - )\).

    There are other conditionals in logic that are not right adjoints of \(x \wedge - \). One example is David Lewis' counterfactual conditional. Other examples include Gillies' indicative conditionals and the probability conditional (see Adams 1998, pg. 154).

    We don't need all of the adjoint functor theorem for posets to see that there's no right adjoint for \(x \wedge - \) for partitions.

    Let's just review the relevant part of the adjoint functor theorem for posets to see why it's not going to work.

    Let \(f: X \to X\) and \(g : X \to X\) be monotone endofunctors on a lattice \((X, \le, \wedge, \vee)\), such that \(g \dashv f\).

    We have:

    $$ \begin{align} g(x \vee y) \le z & \iff x \vee y \le f(z) \\ & \iff x \le f(z) \text{ and } y \le f(z) \\ & \iff g(x) \le z \text{ and } g(y) \le z \\ & \iff g(x) \vee g(y) \le z \end{align} $$ Thus \(g(x \vee y) \le z \iff g(x) \vee g(y) \le z\).

    But then, since \(g(x \vee y) \le g(x \vee y) \) we have \(g(x) \vee g(y) \le g(x \vee y)\).

    Dually, since \(g(x) \vee g(y) \le g(x) \vee g(y)\) we have \(g(x \vee y) \le g(x) \vee g(y)\).

    Hence \(g(x \vee y) = g(x) \vee g(y)\).

    Now if you look at #13 above, I give an example where \(A \wedge (B \vee C) \neq (A \wedge B) \vee (A \wedge C)\).

    So Ellerman's conditional must break one of the usual rules (perhaps modus ponens or the hypothetical syllogism that Jonathan pointed out).

  • 17.

    I would say it's the meet that is weird. It's significantly stronger than a tensor "should be". Though that depends on how one looks at partitions.

  • 18.

    Doty, in the paper I linked above, Ellerman gives partition logic as a Boolean logic. A Boolean logic is one where the left and right adjoint laws hold:

    \[ x \setminus y \leq x \land \neg y, \\ \neg x \lor y \leq x \Rightarrow y. \]

    I think what confuses most people is that Ellerman uses the dual poset.

    But if you look at page 19, we have the following:

    \[ [x] \cap [y] \subseteq [z] \Leftrightarrow x \leq y \multimap z, \] and, \[ y \multimap z = \bigvee \lbrace x \mid [x] \cap [y] \subseteq [z]\rbrace. \]

  • 19.

    So a boolean logic is different than a boolean algebra?

  • 20.

    The dual lattice doesn't seem to be closed either. If we know "how to relate 1 to 2" and we want to know "how to relate 1, 2, and 3", we can either learn "how to relate 1 to 3" or "how to relate 2 to 3". But if you told me that all you know is either "how to relate 1 to 3" or "how to relate 2 to 3", then the relations I'm sure you know how to do are "nothing". And, of course, learning "nothing" doesn't help me know "how to relate 1, 2, and 3".

    The reason the Ellerman implication doesn't serve as a closure relation here is that while we do have

    \[ [x] \cap [y] \subseteq [z] \Leftrightarrow x \leq (y \Rightarrow z),\]

    we don't have

    \[ [x] \cap [y] \subseteq [z] \Leftrightarrow x \wedge y \le z.\]

  • 21.

    If you have a distinction that is made by \(A\wedge B\), you know more than that the distinction is made by \(A\) and that it is made by \(B\): there must be a common partition into two parts that both \(A\) and \(B\) refine.

    Conversely, if you use the atoms (partitions into exactly two parts) as your analog of elements, then an atom refined by \(A \vee B\) might not be refined by either \(A\) or \(B\).

    You can see this in the lattice of partitions of the 3-element set. Both {1|2,3} and {1,2|3} distinguish between 1 and 3, but their meet is the blob {1,2,3}, which makes no distinctions.

    Similarly, the join of {1|2,3} and {1,2|3}, which is {1|2|3}, also refines the atom {1,3|2}.

    This all has to do with transitively closing the equivalence relation that a partition corresponds to. (By the closure adjunction of normal logic, a function that gives you the membership function of the elements in the same box has type A -> (A -> Bool), while an equivalence relation has type (A, A) -> Bool; in practice Haskell almost always uses A -> B -> C instead of (A, B) -> C because the syntax ends up being way less noisy.)
