
Lecture 26 - Chapter 2: Monoidal Monotones


One of the main lessons of category theory is that whenever you think about some kind of mathematical gadget, you should also think about maps between gadgets of this kind. For example, when you think about sets you should also think about functions. When you think about vector spaces you should also think about linear maps. And so on.

We've been talking about various kinds of monoidal preorders. So, let's think about maps between monoidal preorders.

As I explained in Lecture 22, a monoidal preorder is a crossbreed or hybrid of a preorder and a monoid. So let's think about maps between preorders, and maps between monoids, and try to hybridize those.

We've already seen maps between preorders: they're called monotone functions:

Definition. A monotone function from a preorder \((X,\le_X)\) to \((Y,\le_Y)\) is a function \(f : X \to Y\) such that

$$ x \le_X x' \textrm{ implies } f(x) \le_Y f(x') $$ for all elements \(x,x' \in X\).

So, these functions preserve what a preorder has, namely the relation \(\le\). A monoid, on the other hand, has an associative operation \(\otimes\) and a unit element \(I\). So, a map between monoids should preserve those! That's how this game works.

Just to scare people, mathematicians call these maps "homomorphisms":

Definition. A homomorphism from a monoid \( (X,\otimes_X,I_X) \) to a monoid \( (Y,\otimes_Y,I_Y) \) is a function \(f : X \to Y\) such that:

$$ f(x \otimes_X x') = f(x) \otimes_Y f(x') $$ for all elements \(x,x' \in X\), and

$$ f(I_X) = I_Y .$$ You've probably seen a lot of homomorphisms between monoids. Some of them you barely noticed. For example, the set of integers \(\mathbb{Z}\) is a monoid with addition as \(\otimes\) and the number \(0\) as \(I\). So is the set \(\mathbb{R}\) of real numbers! There's a function that turns each integer into a real number:

$$ i: \mathbb{Z} \to \mathbb{R} . $$ It's such a bland function you may never have thought about it: it sends each integer to itself, but regarded as a real number. And this function is a homomorphism!

What does that mean? Look at the definition. It means you can either add two integers and then regard the result as a real number... or first regard each of them as a real number and then add them... and you get the same answer either way. It also says that the integer \(0\), regarded as a real number, is the real number we call \(0\).

Boring facts! But utterly crucial facts. Computer scientists need to worry about these things, because for them integers and real numbers (or floating-point numbers) are different data types, and \(i\) is doing "type conversion".
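In Haskell, for example, fromIntegral performs exactly this kind of conversion, and you can state the homomorphism laws as executable checks. Here is a minimal sketch (the function names are mine, and note the floating-point caveat in the comments):

    -- fromIntegral plays the role of i here.
    i :: Integer -> Double
    i = fromIntegral

    -- Convert-then-add equals add-then-convert, and 0 goes to 0.
    -- (Caveat: Double represents integers exactly only up to 2^53, so this
    -- holds on the nose only in that range; real numbers have no such limit.)
    lawTensor :: Integer -> Integer -> Bool
    lawTensor x x' = i (x + x') == i x + i x'

    lawUnit :: Bool
    lawUnit = i 0 == 0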

You've also seen a lot of other, more interesting homomorphisms between monoids.

For example, the whole point of the logarithm function is that it's a homomorphism. It carries multiplication to addition:

$$ \log(x \cdot x') = \log(x) + \log(x') $$ and it carries the identity for multiplication to the identity for addition:

$$ \log(1) = 0. $$ People invented tables of logarithms, and later slide rules, precisely for this reason! They wanted to convert multiplication problems into easier addition problems.
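Here is the slide-rule trick in miniature, with base-10 logarithms:

$$ \log(100 \cdot 1000) = \log(100) + \log(1000) = 2 + 3 = 5 , $$

so \(100 \cdot 1000 = 10^5\): the multiplication has become an addition.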

You may also have seen linear maps between vector spaces. A vector space gives a monoid with addition as \(\otimes\) and the zero vector as \(I\); any linear map between vector spaces then gives a homomorphism.

Puzzle 80. Tell me a few more homomorphisms between monoids that you routinely use, or at least know.

I hope I've convinced you: monotone functions between preorders are important, and so are homomorphisms between monoids. Thus, if we hybridize these concepts, we'll get a concept that's likely to be important.

It turns out there are a few different ways to hybridize them! The most obvious is simply to combine all the conditions. Since there are other ways, this one is called "strict":

Definition. A strict monoidal monotone from a monoidal preorder \( (X,\le_X,\otimes_X,I_X) \) to a monoidal preorder \( (Y,\le_Y,\otimes_Y,I_Y) \) is a function \(f : X \to Y\) such that:

$$ x \le_X x' \textrm{ implies } f(x) \le_Y f(x') $$ and

$$ f(x) \otimes_Y f(x') = f(x \otimes_X x') $$ for all elements \(x,x' \in X\), and also

$$ I_Y = f(I_X) . $$ For example, the homomorphism

$$ i : \mathbb{Z} \to \mathbb{R} $$ is a strict monoidal monotone: if one integer is \(\le\) another, then that's still true when we regard them as real numbers. So is the logarithm function.

What other definition could we possibly use, and why would we care? It turns out sometimes we want to replace some of the equations in the above definition by inequalities!

Definition. A lax monoidal monotone from a monoidal preorder \((X,\le_X,\otimes_X,I_X)\) to a monoidal preorder \((Y,\le_Y,\otimes_Y,I_Y)\) is a function \(f : X \to Y\) such that:

$$ x \le_X x' \textrm{ implies } f(x) \le_Y f(x') $$ and

$$ f(x) \otimes_Y f(x') \le_Y f(x \otimes_X x') $$ for all elements \(x,x' \in X\), and also

$$ I_Y \le_Y f(I_X). $$ Fong and Spivak call this simply a monoidal monotone, since it's their favorite kind. But I will warn you that others call it "lax".

We could also turn around those last two inequalities:

Definition. An oplax monoidal monotone from a monoidal preorder \((X,\le_X,\otimes_X,I_X)\) to a monoidal preorder \((Y,\le_Y,\otimes_Y,I_Y)\) is a function \(f : X \to Y\) such that:

$$ x \le_X x' \textrm{ implies } f(x) \le_Y f(x') $$ and

$$ f(x) \otimes_Y f(x') \ge_Y f(x \otimes_X x') $$ for all elements \(x,x' \in X\), and also

$$ I_Y \ge_Y f(I_X). $$ You are probably drowning in definitions now, so let me give some examples to show that they're justified. The monotone function

$$ i : \mathbb{Z} \to \mathbb{R} $$ has a right adjoint

$$ \lfloor \cdot \rfloor : \mathbb{R} \to \mathbb{Z} $$ which is the best approximation from below to the nonexistent inverse of \(i\): that is, \( \lfloor x \rfloor \) is the greatest integer that's \(\le x\). It also has a left adjoint

$$ \lceil \cdot \rceil : \mathbb{R} \to \mathbb{Z} $$ which is the best approximation from above to the nonexistent inverse of \(i\): that is, \( \lceil x \rceil \) is the least integer that's \(\ge x\).

Puzzle 81. Show that one of the functions \( \lfloor \cdot \rfloor : \mathbb{R} \to \mathbb{Z} \), \( \lceil \cdot \rceil : \mathbb{R} \to \mathbb{Z} \) is a lax monoidal monotone and the other is an oplax monoidal monotone, where we make the integers and reals into monoids using addition.
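If you'd like to experiment before proving anything, here is a small Haskell sketch (the names are mine) that tests instances of the lax and oplax tensor conditions for a candidate map \(f : \mathbb{R} \to \mathbb{Z}\), with addition as the monoidal operation on both sides. Of course this gives evidence, not a proof:

    -- Check the lax and oplax tensor conditions at a pair of sample points.
    laxTensor, oplaxTensor :: (Double -> Integer) -> Double -> Double -> Bool
    laxTensor   f x y = f x + f y <= f (x + y)
    oplaxTensor f x y = f x + f y >= f (x + y)

    -- Try e.g. laxTensor floor 0.5 0.5 and oplaxTensor ceiling 0.5 0.5 in GHCi
    -- to see which condition each of floor and ceiling satisfies.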

So, you should be sensing some relation between left and right adjoints, and lax and oplax monoidal monotones. We'll talk about this more! And we'll see why all this stuff is important for resource theories.

Finally, for the bravest among you:

Puzzle 82. Find a function between monoidal preorders that is both lax and oplax monoidal monotone but not strict monoidal monotone.

In case you haven't had enough jargon for today: a function between monoidal preorders that's both lax and oplax monoidal monotone is called a strong monoidal monotone.

To read other lectures go here.

Comments

  • 1.

    John, for each statement like this, I think you have one too many primes:

    $$ x \le_X x' \textrm{ implies } f(x') \le_Y f(x') $$

  • 2.

    I want to test some \( \LaTeX \) out.

    Definition. A monotone function from a preorder \((X,\le_X)\) to \((Y,\le_Y)\) is a function \(f : X \to Y\) such that the following diagram commutes,

    $$ \large\begin{matrix} x & \overset{\le_X}{\longrightarrow} & x' \\ f\downarrow & & \downarrow f \\ f(x) & \overset{\le_Y}{\longrightarrow} & f(x') \end{matrix} $$ for all elements \(x,x' \in X\).

  • 3.

    Jonathan - thanks, fixed!

    Keith - you asked a while ago if we could draw fancy diagrams here, and the answer is basically no. But yes, one can draw commutative squares! (Anyone who wants to learn how, click on the little gear to the upper right of his comment.)

    By the way, the square you wrote down is not really a commutative square in any category, because \(f\) is not a morphism from \(x\) to \(f(x)\); it's a morphism from \(X\) to \(Y\). One should not write \(x \stackrel{f}{\to} f(x)\); one writes \(f: x \mapsto f(x) \) to indicate that the function \(f\) sends \(x\) to \(f(x)\).

    The "mapsto" arrow, \(\mapsto\), has a very different meaning than the "to" arrow, \(\to\). For more see:

    Comment Source:Jonathan - thanks, fixed! Keith - you asked a while ago if we could draw fancy diagrams here, and the answer is basically no. But yes, one can draw commutative squares! (Anyone who wants to learn how, click on the little gear to the upper right of his comment.) By the way, the square you wrote down is not really a commutative square in any category, because \\(f\\) is not a morphism from \\(x\\) to \\(f(x)\\); it's a morphism from \\(X\\) to \\(Y\\). One should not write \\(x \stackrel{f}{\to} f(x)\\); one writes \\(f: x \mapsto f(x) \\) to indicate that the function \\(f\\) sends \\(x\\) to \\(f(x)\\). The "mapsto" arrow, \\(\mapsto\\), has a very different meaning than the "to" arrow, \\(\to\\). For more see: * Math Stackexchange, [Difference of mapsto and arrow.](https://math.stackexchange.com/questions/473247/difference-of-mapsto-and-right-arrow)
  • 4.

    Puzzle 80 Let \(X\) be the monoidal preorder described by Anindya in Comment 9 of Lecture 22. The elements of \(X\) are finite length words, the relation \(\leq_X\) is defined by word length, and \(\otimes_X\) is word concatenation. The map that takes words to their length (e.g. CAT\(\mapsto 3\)) is a homomorphism between \(X\) and \(\mathbb N\). It is a homomorphism because the sum of the lengths of two words is the same as the length of their concatenation!
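    (In Haskell this homomorphism is just length; a quick rendition, with String standing in for the words:)

    -- length sends concatenation to addition and the empty word to 0.
    wordLength :: String -> Int
    wordLength = length

    homLaw :: String -> String -> Bool
    homLaw s t = wordLength (s ++ t) == wordLength s + wordLength t
                 && wordLength "" == 0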

  • 5.

    To remember the condition for lax vs. oplax monoid monotone functions, I'm testing out this mnemonic:

    Lax - doing things separately is cheaper than doing them together

    Oplax - doing things together is cheaper than doing them separately

    Lax functions are like relaxed people (i.e. people with time during the week) who find it easier to clean 10 minutes every day instead of cleaning for 1 hour once a week.

  • 6.

    Puzzle 82 Again let \(X\) be the monoidal preorder whose elements are words of finite length. \(s \leq_X s'\) iff the length of word \(s\) is less than or equal to the length of word \(s'\). \(\otimes_X\) is string concatenation and \(1_X\) is the empty string.

    Define a map \(f: X \to X\) by word reversal. For example, \[f(\textrm{CAT}) = \textrm{TAC}.\] First note that reversing a word doesn't change its length so \[s \leq_X s' \implies f(s) \leq_X f(s').\] Also reversing the empty string is still the empty string, so \(f(1_X) = 1_X\). The reflexive property then gives us \[ f(1_X) \leq 1_X \textrm{ and } 1_X \leq f(1_X) .\] Lastly, if I reverse two words and concatenate them, the resulting string has the same length as the string I get when I first concatenate them and then reverse them. So \[f(s) \otimes f(s') \leq f(s \otimes s') \textrm{ and } f(s \otimes s') \leq f(s) \otimes f(s').\]

    Therefore \(f\) is both lax and oplax! However, \[f(\textrm{CAT}) \otimes f(\textrm{DOG}) = \textrm{TAC} \otimes \textrm{GOD} = \textrm{TACGOD}\] while \[f(\textrm{CAT} \otimes \textrm{DOG}) = f(\textrm{CATDOG}) = \textrm{GODTAC}.\] Since \[f(\textrm{CAT}) \otimes f(\textrm{DOG}) \neq f(\textrm{CAT} \otimes \textrm{DOG})\] \(f\) is not strict monoidal monotone.
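    (For the record, here's a quick Haskell rendition of this example - the names are mine, and length stands in for the comparison \(\leq_X\):)

    f :: String -> String
    f = reverse

    -- Lax and oplax both hold, since reversal preserves length...
    laxLaw, oplaxLaw :: String -> String -> Bool
    laxLaw   s t = length (f s ++ f t) <= length (f (s ++ t))
    oplaxLaw s t = length (f s ++ f t) >= length (f (s ++ t))

    -- ...but strictness fails: strictLaw "CAT" "DOG" is False, since "TACGOD" /= "GODTAC".
    strictLaw :: String -> String -> Bool
    strictLaw s t = f s ++ f t == f (s ++ t)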

  • 7.

    Oh, reversal! I was trying so many different things, and I was even trying endomorphisms on that exact monoidal preorder, but I never came up with reversal! That's a fantastic solution, Sophie.

    (I think that's supposed to be Puzzle 82, though, not Puzzle 81!)

  • 8.

    Thanks Jonathan! I edited the puzzle number :)

  • 9.

    that's v neat @Sophie – one slight typo: the line \(f(s) \otimes f(s') = f(s \otimes s')\) should be \(f(s) \otimes f(s') \cong f(s \otimes s')\)

  • 10.

    I was wondering if we needed both rules – "\(f\) preserves \(\otimes\)" and "\(f\) preserves \(I\)" – in the definition of a monoid homomorphism.

    Or is "\(f\) preserves \(\otimes\)" enough on its own?

    If we can pick an \(x\) such that \(f(x) = I_Y\), then we have \[f(I_X) = f(I_X) \otimes I_Y = f(I_X) \otimes f(x) = f(I_X \otimes x) = f(x) = I_Y\]

    But what if \(I_Y\) is not in the image of \(f\)?

    I suspect we might be able to get a counterexample in this case – but I'm struggling to come up with one.

  • 11.

    @Anindya: If we have a monoid homomorphism \(f : X \to Y\), then we have \(f(1_X) \otimes f(x) = f(1_X \otimes x) = f(x) = f(x \otimes_X 1_X) = f(x) \otimes_Y f(1_X)\). So if nothing else, \(f(1_X)\) is a unit for the image \(f(X) \subseteq Y\). And as you say, if \(1_Y\) is mapped to, then \(1_X\) must be one of the things that map to it. (Nice proof!)

    Unfortunately, it's entirely possible for something that isn't the unit of the larger monoid to nonetheless behave like a unit for a subset of the monoid. Consider the monoid \(\langle \mathcal{P}(S), \cup, \emptyset \rangle \) of subsets of a set \(S\), with union as product and the empty set as unit. Let \(x \in S\), and consider the upset \(\operatorname{\uparrow}\{x\}\) consisting of the collection of all sets containing \(x\). Then \(\{x\}\) acts as a unit for this collection, because every subset in this collection already contains \(x\).

    One reason to require that a homomorphism preserve the unit is because this lets us classify submonoids (and subalgebras in general). A submonoid is a subset of a monoid where the unit and multiplication are the same as the containing monoid. With monoid homomorphisms as defined, every submonoid is just the image of a homomorphism into the containing monoid. (This can be as simple as an injection, like \(i : \mathbb{N} \to \mathbb{Z}\) defined by \(i(x) = x\).)

  • 12.

    @Jonathan – Ah okay! and you can simplify this to give the counterexample I'm looking for:

    Take the trivial monoid \(\mathbf{1}\) and the 2-element join semilattice \((\mathbf{2}, \vee, \bot)\)

    Let \(f : \mathbf{1} \rightarrow \mathbf{2}\) be the map that picks out \(\top\)

    Then \(f(\bullet) \otimes f(\bullet) = f(\bullet) \vee f(\bullet) = \top \vee \top = \top = f(\bullet \otimes \bullet)\)

    So \(f\) preserves \(\otimes\). But \(f(\bullet)\neq\bot\), so \(f\) does not preserve identity.
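    (Here's the same counterexample as a Haskell sketch, if it helps - () is the trivial monoid, and Bool under (||) with unit False plays the role of \(\mathbf{2}\); the names are mine:)

    f :: () -> Bool
    f () = True

    -- f preserves the product: both sides below are True...
    preservesTensor :: Bool
    preservesTensor = f (() <> ()) == (f () || f ())

    -- ...but not the unit: f () is True, while the unit of (Bool, (||)) is False.
    preservesUnit :: Bool
    preservesUnit = f () == False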

  • 13.

    Thanks @Anindya, I fixed the error!

  • 14.

    I am having a hard time understanding strong monoidal monotone functions. This seems to be a paradox for me (like this water is both hot and cold). Does anyone have an example of one that can clarify how this can happen?

  • 15.

    Michael, can you clarify where you find Sophie's construction of a strong monoidal monotone unclear? (Also, every strict monoidal monotone is also strong; but I suspect it's the strong-but-not-strict ones that are confusing.)

  • 16.

    Jonathan

    I missed Sophie's answer to Puzzle 82. That's pretty clear now. These creatures are some sneaky ones eh? LOL

    Also, every strict monoidal monotone is also strong; but I suspect it's the strong-but-not-strict ones that are confusing.

    I was also confused about this but now is clear. Thanks a lot!

  • 17.

    John I was having a hard time proving puzzle 83 and I think it's because the unit conditions are the other way around for the lax and oplax monotones, respectively:

    • For the lax monotone we should require: \( I_Y \le f(I_X) . \)

    • For the oplax monotone we should require: \( f(I_X) \le_Y I_Y . \)

    And a nitpick: one of these equations was using \(1\) rather than \(I\) to denote the unit; I've noticed this notation is also mixed at the end of the next lecture.

  • 18.

    Dan, I reposted your comment in Lecture 27 so others working on the puzzle will see it there as well. Hope that's okay with you!

  • 19.

    Dan, I reposted your comment in Lecture 27 so others working on the puzzle will see it there as well. Hope that's okay with you!

    Sure Sophie! And thank you for the interesting points you raised there!

  • 20.

    Dan wrote:

    John I was having a hard time proving puzzle 83 and I think it's because the unit conditions are the other way around for the lax and oplax monotones, respectively:

    • For the lax monotone we should require: \( I_Y \le f(I_X) . \)

    • For the oplax monotone we should require: \( f(I_X) \le_Y I_Y . \)

    And a nitpick: one of these equations was using \(1\) rather than \(I\) to denote the unit; I've noticed this notation is also mixed at the end of the next lecture.

    You're right on all of these! I'm fixing these mistakes now. It's important to get those inequalities pointing the right way. Thanks!

  • 21.

    Anindya wrote:

    I was wondering if we needed both rules – "f preserves ⊗" and "f preserves I" – in the definition of a monoid homomorphism.

    That's a good question; for group homomorphisms we don't need the second condition. But for monoid homomorphisms we do. Jonathan and you have given some nice examples from logic; here's a typical example from analysis:

    Let \(X\) be the set of functions \(f : [0,1] \to \mathbb{R}\). Make this into a monoid with pointwise multiplication of functions as its multiplication and the constant function \(1\) as the unit. Let

    $$ F : X \to X $$ be the map that multiplies any function \(f\) by the characteristic function of the interval \( [0,1/2] \). Then

    $$ F(fg) = F(f) F(g) $$ for all \(f,g \in X\) but

    $$ F(1) \ne 1 .$$ We can generalize this as follows. Suppose \(X\) is any monoid and \(p \in X\) is a central idempotent: an element that commutes with everything in \(X\) and has \(p^2 = p \). (In the previous example, \(p\) is the characteristic function of the interval \( [0,1/2] \).) Let

    $$ F : X \to X $$ be the map that multiplies any element of \(X\) by \(p\). Then

    $$ F(fg) = p f g = p^2 f g = p f p g = F(f) F(g) $$ but

    $$ F(1) = p \ne 1 $$ unless \(p = 1\).
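    Here's a quick Haskell transcription of the characteristic-function example, in case anyone wants to play with it (the names are mine):

    -- p is the characteristic function of [0, 1/2]: a central idempotent in
    -- the monoid of functions under pointwise multiplication.
    p :: Double -> Double
    p x = if x <= 0.5 then 1 else 0

    -- F multiplies any function by p.
    bigF :: (Double -> Double) -> (Double -> Double)
    bigF f x = p x * f x

    -- F (f * g) = F f * F g pointwise, since p^2 = p, but F 1 /= 1:
    -- bigF (const 1) 0.75 == 0.0, while const 1 0.75 == 1.0.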

    Puzzle. Suppose \(X\) is any monoid and \(F : X \to X\) is any map with \(F(fg) = F(f) F(g) \) for all \(f,g \in X\). Is \(F(1)\) a central idempotent?

  • 22.

    Re this puzzle:

    Suppose \(X\) is any monoid and \(F : X \to X\) is any map with \(F(fg) = F(f) F(g) \) for all \(f,g \in X\). Is \(F(1)\) a central idempotent?

    It's taken me a bit of fiddling to come up with a counterexample to that one, but I think I've got one...

    First let's note that \(F(1)F(1) = F(1.1) = F(1)\), so \(F(1)\) is certainly idempotent. And \(F(1)F(g) = F(1.g) = F(g.1) = F(g)F(1)\), so \(F(1)\) commutes with anything in the image of \(F\). But \(F(1)\) need not be central (ie commute with anything in \(X\)), as the following example shows.

    Let \(2 = \{0, 1\}\) be a two-element set, and let \(X\) be the set of maps \(2\rightarrow 2\).

    So \(X\) has four elements:

    — the identity map
    — the map switching round \(0\) and \(1\)
    — the constant map \(c_0\) sending everything to \(0\)
    — the constant map \(c_1\) sending everything to \(1\)

    \(X\) is a monoid under composition, but it is not commutative: \(c_0\circ c_1 \neq c_1 \circ c_0\)

    Now let \(F : X \rightarrow X\) be the constant map sending everything in \(X\) to \(c_0\).

    Then we certainly have \(F(x\circ y) = F(x)\circ F(y)\), since both sides are always \(c_0\).

    But as noted above, \(c_0\) is not central because it does not commute with \(c_1\).
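    (Since this monoid is tiny, we can even check it exhaustively; a brute-force Haskell sketch, encoding mine:)

    -- The four endomaps of a two-element set, as functions Bool -> Bool.
    endos :: [Bool -> Bool]
    endos = [id, not, const False, const True]

    -- Extensional equality of endomaps.
    eq :: (Bool -> Bool) -> (Bool -> Bool) -> Bool
    eq f g = f True == g True && f False == g False

    -- F sends every endomap to c0 = const False.
    bigF :: (Bool -> Bool) -> (Bool -> Bool)
    bigF _ = const False

    -- F preserves composition: both sides are always c0.
    preservesComp :: Bool
    preservesComp = and [eq (bigF (f . g)) (bigF f . bigF g) | f <- endos, g <- endos]

    -- But c0 is not central: it fails to commute with c1 = const True.
    c0NotCentral :: Bool
    c0NotCentral = not (eq (const False . const True) (const True . const False))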

  • 23.

    (actually just noticed we can delete the "switch" map from X above to get an even smaller counterexample – a monoid with just three elements. since any two-element monoid is commutative, this is the smallest counterexample. more generally, consider the "words on an alphabet" example where we identify any two words that begin with the same letter.)

  • 24.

    Good, Anindya! Right, saying that a map from a monoid to itself \(F: X \to X\) preserves multiplication quickly implies that \(F(1)\) is an idempotent that commutes with everything in the range of \(F\), but no more... so on the principle that "you don't get something for nothing", there should be examples where \(F(1)\) doesn't commute with everything in \(X\). Then the challenge is to find a counterexample... a challenge you met.

  • 25.

    Michael wrote:

    I am having a hard time understanding strong monoidal monotone functions. This seems to be a paradox for me (like this water is both hot and cold). Does anyone have an example of one that can clarify how this can happen?

    I'm guessing what seemed "paradoxical" was having

    $$ f(x) \otimes f(x') \le f(x \otimes x') \textrm{ and } I \le f(I) $$ and also

    $$ f(x) \otimes f(x') \ge f(x \otimes x') \textrm{ and } I \ge f(I) $$ yet still not having

    $$ f(x) \otimes f(x') = f(x \otimes x') \textrm{ and } I = f(I). $$ Presumably this was because you're used to posets, where \(x \le y \) and \(y \le x\) imply \(x = y\). But in a preorder this needn't be true.

    So, you need to understand preorders that aren't posets. Here is the easiest example: take a set \(X\) and decree that everything in this set is less than or equal to everything else. Then the laws of a preorder hold: check them in your mind, and if you don't instantly remember what these laws are, go to jail and stay there until you do! But the law that makes a preorder a poset:

    $$ \textrm{ if } x \le y \textrm{ and } y \le x \textrm{ then } x = y $$ obviously does not hold.

    So, this kind of preorder, very far from being a poset, is good to keep in mind.

    Let's use this kind to get a monotone map that's strong monoidal but not strict monoidal.

    Answer to Puzzle 82. Let \(X\) and \(Y\) be monoids and let \(f : X \to Y\) be a function that's not a homomorphism: for example,

    $$ f(x \otimes x') \ne f(x) \otimes f(x') $$ for some \(x,x' \in X\). Examples of this are a dime a dozen: just take your favorite two monoids with lots of elements and take some random idiotic function between them: it probably won't make \(f(x \otimes x') = f(x) \otimes f(x')\) for all \(x,x' \in X\).

    Now, make \(X\) into a preorder in the silly way I just described: decree that everything is less than or equal to everything. Do the same for \(Y\).

    Then it's easy to see that \(X\) and \(Y\) are monoidal preorders. For example \(X\) obeys

    $$ x_1 \le x_1' \textrm{ and } x_2 \le x_2' \textrm{ imply } x_1 \otimes x_2 \le x_1' \otimes x_2' $$ because everything in \(X\) is less than or equal to everything else!

    Similarly, it's easy to see that \(f: X \to Y\) is lax monoidal. We have

    $$ f(x) \otimes f(x') \le f(x \otimes x') \textrm{ and } I \le f(I), $$ because everything in \(Y\) is less than or equal to everything else!

    We also know that \(f\) is oplax monoidal:

    $$ f(x) \otimes f(x') \ge f(x \otimes x') \textrm{ and } I \ge f(I) $$ because everything in \(Y\) is greater than or equal to everything else!

    So \(f\) is strong monoidal for very silly reasons. But it's not strict monoidal, because we've set things up to ensure

    $$ f(x \otimes x') \ne f(x) \otimes f(x') . $$ Get it?

    The moral is that sometimes in a preorder saying that one thing is less than or equal to another is saying absolutely nothing, because everybody is less than or equal to everybody else. So in a typical preorder, you should never expect to reason from inequalities to equations. For that, you want a poset.

  • 26.

    John

    take a set X and decree that everything in this set is less than or equal to everything else.

    I think I will always remember this example, John. Thanks. Your example made me realize why I sometimes have trouble with the most basic things. I keep thinking in terms of specific examples while trying to keep tabs on the abstraction, which causes problems if the example in my head isn't abstract enough to capture the greater picture.

    So in a typical preorder, you should never expect to reason from inequalities to equations. For that, you want a poset.

    This opened up a lot for me. Thanks.

  • 27.

    I'm glad that helped:

    I keep thinking in terms of specific examples...

    If you do that, which can be dangerous, you need enough examples to keep from getting led astray by the peculiarities of special cases.

    In particular, you need to pay special attention to "silly" examples, where the structures are chosen in a "trivial" way. The preorder where everything is \(\le\) everything else; the preorder where each thing is only \(\le\) itself - all other examples stand somewhere between these two.

    Another important thing to do is this: whenever someone defines some sort of gadget, like a "preorder", and then defines a nice special case with extra features, like a "poset", you need to look for examples of posets that aren't preorders. Without this, you can't see what the special extra features are buying you! Mathematicians automatically do this whenever they learn new concepts.

    It's just like if you were a biologist and you'd just learned about "animals" and "vertebrates": you'd have to learn about some animals that aren't vertebrates.

  • 28.

    It's just like if you were a biologist and you'd just learned about "animals" and "vertebrates": you'd have to learn about some animals that aren't vertebrates.

    I def need more mathematical creatures in my zoo LOL. I will keep this in mind and practice finding examples of new concepts with special features.

  • 29.

    It's just like if you were a biologist and you'd just learned about "animals" and "vertebrates": you'd have to learn about some animals that aren't vertebrates.

    John, this example is a bit confusing: animals are a more generic structure than vertebrates (vertebrates are animals with a specific structure in them - they are animals with vertebrae), so

    preorders \(\rightarrow\) animals

    posets \(\rightarrow\) vertebrates

    So following the logic, to learn more about vertebrates, one should look for vertebrates which aren't animals, with attached interpretation - to learn more about vertebrates, look for some artificial structures which have spine-like support in them.

  • 30.

    Well, every poset is a preorder, so I believe John meant "you need to look for examples of preorders that aren't posets". If we correct this side of the analogy, everything works well.

  • 31.

    Puzzle 80. Tell me a few more homomorphisms between monoids that you routinely use, or at least know.

    Here are a couple programming examples.

    In Haskell, the base library provides a type class for monoids, defined in Data.Monoid.

    Haskell has its own notation for Monoids. In Haskell, \(I\) becomes mempty and \(\otimes\) becomes (<>).

    Two of the monoids defined in Data.Monoid are Monoid (First a) and Monoid [a]. First a is also defined in Data.Monoid.

    First is a newtype wrapper around Maybe a. Its identity element is First Nothing. Its monoidal operation (<>) has the following definition

    (First Nothing) <> x = x
    (First (Just a)) <> _ = First (Just a)
    

    The other monoid instance is for list, the free monoid over a. Its identity element is [] and its monoidal operation is list append (i.e., ++).

    There are maps in both directions between these types.

    1. The first homomorphism is First . listToMaybe :: [a] -> First a. Here listToMaybe is defined in Data.Maybe.listToMaybe. This function tries to take the first element of a list. If there is such an element a then it gets wrapped in First (Just a). If it does not exist (because the list is empty), then it returns First Nothing.

      The monoid homomorphism laws in Haskell read: $$(\mathsf{First}\; .\; \mathsf{listToMaybe})\; (a\; \mathsf{<>}\; b) \equiv ((\mathsf{First}\; .\; \mathsf{listToMaybe})\; a)\; \mathsf{<>}\; ((\mathsf{First}\; .\; \mathsf{listToMaybe})\; b)$$ $$(\mathsf{First}\; .\; \mathsf{listToMaybe})\; (\mathsf{mempty}\; ::\; \mathsf{[}a\mathsf{]}) \equiv (\mathsf{mempty}\; :: \; \mathsf{First}\; a)$$ Hence this function is a monoid homomorphism. (A quick GHCi check of this appears after this list.)

    2. The second map goes in the opposite direction: maybeToList . getFirst :: First a -> [a]. This is the right inverse of First . listToMaybe, so: $$(\mathsf{First}\; .\; \mathsf{listToMaybe})\; .\; (\mathsf{maybeToList}\; .\; \mathsf{getFirst}) \equiv \mathsf{id}$$ The function maybeToList is defined in Data.Maybe.maybeToList. It preserves mempty, but unlike its left inverse it is not a monoid homomorphism: it sends First (Just a) <> First (Just b) to [a], while the images concatenate to [a, b].
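      As promised, a quick GHCi check of the first of these (after importing First from Data.Monoid and listToMaybe from Data.Maybe; the sample values are mine):

      >>> (First . listToMaybe) ([1,2] <> [3])
      First {getFirst = Just 1}
      >>> (First . listToMaybe) [1,2] <> (First . listToMaybe) [3]
      First {getFirst = Just 1}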

  • 32.

    @Matthew – what's the motivation for "wrapping" Maybe with First? is Maybe not a monoid "already", so to speak?

  • 33.

    Maybe a all on its own doesn’t have a monoid instance. There are a few possible semantics it can have.

    If a is a monoid, then there is an instance for Maybe a. It’s worth trying to guess how this is defined in base if you haven't seen it.

    Apart from First, there is another newtype called Last that gives different monoid semantics for Maybe. You can always look it up with Hoogle, but the name is a good hint at how it works if you want to guess...

  • 34.

    ah I get it – there's no canonical way of defining a product of Maybes. you have to choose one (First or Last).

  • 35.

    ah I get it – there's no canonical way of defining a product of Maybes. you have choose one (First or Last).

    It's a little trickier than that, sadly.

    For instance, while Maybe a is not necessarily a monoid, Maybe [a] is.

    Feel free to check in the interpreter, but we have:

    $$ (\textsf{Just}\; a) <> (\textsf{Just}\; b) \cong \textsf{Just}\; (a ++ b) $$ So we have that Just :: [a] -> Maybe [a] is a semigroup homomorphism...

  • 36.

    Also, it's important to note that since list is the free* monoid (and therefore initial), foldr (<>) mempty is a mapping [a] -> a that is automatically a monoid homomorphism, regardless of what the monoid on a is. (A tiny check of this appears after the notes below.)

    • *Technically, list append is also infinitely associative - or so I have read. (When I try to use what I think this means, the left-associative form diverges and the right-associative form doesn't, which makes them non-equal. So... either my memory or my understanding is wrong.)
    • The difference between foldr and foldl' only matters if the binary operation is non-associative or you care about how the value is computed.
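    For instance, a tiny check with Sum from Data.Monoid as the chosen monoid on a:

    import Data.Monoid (Sum (..))

    total :: [Sum Int] -> Sum Int
    total = foldr (<>) mempty

    -- The homomorphism law: total (xs ++ ys) == total xs <> total ys, e.g.
    -- total ([Sum 1, Sum 2] ++ [Sum 3]) == Sum 6
    --   == total [Sum 1, Sum 2] <> total [Sum 3]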
  • 37.

    Hey I have a terminology question. There are two ways to turn a preorder into an equivalence relation. 1) the symmetric closure:

    \[a \sim b := a \le b \text{ or } b \le a \] and 2) the symmetric ??:

    \[a \sim b := a \le b \text{ and } b \le a \]

    I think I've read "core" used for this in the context of turning groups commutative. Is that a good choice for the general idea? The closure is often the free construction, and I think the "core" would be the other adjoint to the forgetful functor some times.

  • 38.

    I wrote:

    Another important thing to do is this: whenever someone defines some sort of gadget, like a "preorder", and then defines a nice special case with extra features, like a "poset", you need to look for examples of posets that aren't preorders.

    As Jonathan pointed out, that was wrong: there aren't any posets that aren't preorders. I meant to say this:

    Another important thing to do is this: whenever someone defines some sort of gadget, like a "preorder", and then defines a nice special case with extra features, like a "poset", you need to look for examples of preorders that aren't posets.

  • 39.

    Christopher wrote:

    There are two ways to turn a preorder into a equivalence relation. 1) the symmetric closure:

    \[a \sim b := a \le b \text{ or } b \le a \]

    Alas, that doesn't always work.

    Puzzle. Find a preorder such that \(\sim\), defined as above, is not an equivalence relation.

    and 2) the symmetric ??:

    \[a \sim b := a \le b \text{ and } b \le a \]

    This does give an equivalence relation, and it's very important. We usually write \(a \cong b\) in this case, and say \(a\) and \(b\) are isomorphic.

    I think I've read "core" used for this in the context of turning groups commutative.

    I haven't heard of this, and I don't see how exactly this is related to making groups commutative (aka "abelian"). There's a forgetful functor from abelian groups to groups, and it has a left adjoint called ["abelianization"](http://mathworld.wolfram.com/Abelianization.html). There's also another way to take a group and get an abelian group, called taking the ["center"](https://en.wikipedia.org/wiki/Center_(group_theory)). However, this is not a functor!

    Anyway, I guess you're talking about the forgetful functor from the category of "sets with equivalence relation" to the category of preorders, and possible left or right adjoints to this. Method 2) of turning preorders into equivalence relations feels like a right adjoint, though I haven't checked.

    Here's another way to turn a preorder into an equivalence relation. Start with a preorder. Take its symmetric closure as you did, defining \(a \sim b\) if \(a \le b\) or \(b \le a\). Then take the [transitive closure](https://en.wikipedia.org/wiki/Transitive_closure) of the relation \(\sim\), getting a relation that's both symmetric and transitive. It's still reflexive too, so it's an equivalence relation.

    I believe this method gives the left adjoint to the forgetful functor from the category of "sets with equivalence relation" to the category of preorders... though again, I haven't carefully checked this so don't sue me if I'm wrong!
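    For anyone who wants to experiment, here's a minimal Haskell sketch of these constructions on finite relations, representing a relation as a list of related pairs. All the names here (`Rel`, `symClosureOr`, `symCoreAnd`, `transClosure`) are made up for illustration, not from any library:

    import Data.List (nub)

    -- A binary relation on a, as a finite list of related pairs.
    type Rel a = [(a, a)]

    -- The "or" method: relate x ~ y when x <= y or y <= x (permissive).
    symClosureOr :: Eq a => Rel a -> Rel a
    symClosureOr r = nub (r ++ [ (y, x) | (x, y) <- r ])

    -- The "and" method: relate x ~ y when x <= y and y <= x (cautious).
    symCoreAnd :: Eq a => Rel a -> Rel a
    symCoreAnd r = [ (x, y) | (x, y) <- r, (y, x) `elem` r ]

    -- Transitive closure: keep composing pairs until nothing new appears.
    transClosure :: Eq a => Rel a -> Rel a
    transClosure r
      | r' == r   = r
      | otherwise = transClosure r'
      where
        r' = nub (r ++ [ (x, z) | (x, y) <- r, (y', z) <- r, y == y' ])

    Then `transClosure (symClosureOr r)` implements the symmetric-then-transitive construction described above, while `symCoreAnd r` is method 2).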

  • 40.
    edited May 2018

    Christopher: what you actually described were the two main ways to take an arbitrary binary relation and turn it into a symmetric binary relation. These are the left and right adjoints to the forgetful functor from "sets with symmetric relation" to "sets with arbitrary binary relation". As usual, the left adjoint is the "permissive, generous" method, which uses "or", while the right adjoint is the "cautious, ungenerous" method, which uses "and".

    You see, ["or" is itself a left adjoint, while "and" is a right adjoint](https://forum.azimuthproject.org/discussion/2037/lecture-17-chapter-1-the-grand-synthesis/p1).

  • 41.

    @Matthew -- re:

    > while `Maybe a` is not necessarily a monoid, `Maybe [a]` *is*

    This seems to be a special case of a more general result, namely that if `Moo` is a monoid, then we can define a canonical monoid structure on `Maybe Moo`.

    The unit is `Nothing`, and we define `(Just x) <> (Just y)` to be `Just (x <> y)`.

    `Just` here is a good example of a semigroup homomorphism between monoids that isn't a monoid homomorphism (because it doesn't preserve the unit).
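    Here's a quick way to check both halves of that claim, using the list monoid for the underlying type (a sketch; the expected outputs are in the comments):

    main :: IO ()
    main = do
      let x = [1, 2] :: [Int]
          y = [3]    :: [Int]
      -- Just preserves the semigroup operation...
      print (Just (x <> y) == Just x <> Just y)                 -- True
      -- ...but not the monoid unit: Just mempty is not mempty.
      print (Just (mempty :: [Int]) == (mempty :: Maybe [Int])) -- False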

  • 42.
    edited May 2018

    > @Matthew -- re:
    >
    > > while `Maybe a` is not necessarily a monoid, `Maybe [a]` *is*
    >
    > This seems to be a special case of a more general result, namely that if `Moo` is a monoid, then we can define a canonical monoid structure on `Maybe Moo`.
    >
    > The unit is `Nothing`, and we define `(Just x) <> (Just y)` to be `Just (x <> y)`.

    You are absolutely right!

    Haskell defines the instance `Semigroup a => Semigroup (Maybe a)` in the [Prelude](https://hackage.haskell.org/package/base-4.11.1.0/docs/src/GHC.Base.html#line-406), following your specification:

    instance Semigroup a => Semigroup (Maybe a) where
        Nothing <> b       = b
        a       <> Nothing = a
        Just a  <> Just b  = Just (a <> b)
    

    And the *unit* for this monoid is defined in [Data.Monoid](https://hackage.haskell.org/package/base-4.11.1.0/docs/src/GHC.Base.html#line-422), again exactly as you suggest:

    instance Semigroup a => Monoid (Maybe a) where
        mempty = Nothing
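
    As a quick sanity check of these instances, here's what they give in GHCi with `String` as the underlying monoid:

    ghci> Just "foo" <> Nothing
    Just "foo"
    ghci> Nothing <> Just "bar"
    Just "bar"
    ghci> Just "foo" <> Just "bar"
    Just "foobar"
    ghci> mempty :: Maybe String
    Nothing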
    