
Lecture 67 - Chapter 4: Feedback in Collaborative Design


Last time we learned most of the tricks needed to assemble a co-design diagram from the smaller boxes inside:

image

Remember, we're really building a feasibility relation out of other feasibility relations. Each smaller box is a feasibility relation. If we don't inquire into exactly what feasibility relations these particular boxes stand for, just how they're assembled, there's just one question left to discuss. What happens when a wire bends back?

Intuitively, this describes 'feedback'. For example, I may take some bread, toast it, sell it, make money, and use that to buy more bread. Here I'm stringing together a number of processes, each one producing the resources needed for the next... until finally the last one produces some resources needed for the first!

It's not obvious how to describe feedback using just composition, which takes two feasibility relations \(\Phi \colon X \nrightarrow Y\) and \(\Psi \colon Y \nrightarrow Z\) and puts one after the other:

image

and tensoring, which takes two feasibility relations \(\Phi \colon X \nrightarrow Y\) and \(\Psi \colon X' \nrightarrow Y'\) and puts them side by side:

image
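If it helps to see these operations concretely, here's a minimal Python sketch of composition and tensoring (my own encoding, not anything from the book; preorders are finite, and I ignore the monotonicity requirement for brevity):

```python
from itertools import product

# A feasibility relation Phi : X -|-> Y between finite preorders is
# represented here as a Boolean-valued function phi(x, y).

def compose(Y, phi, psi):
    """Composite Psi after Phi : X -|-> Z, where
    (Psi . Phi)(x, z) = OR over y in Y of (phi(x, y) AND psi(y, z))."""
    return lambda x, z: any(phi(x, y) and psi(y, z) for y in Y)

def tensor(phi, psi):
    """Tensor Phi x Psi : X x X' -|-> Y x Y', acting componentwise."""
    return lambda xx, yy: phi(xx[0], yy[0]) and psi(xx[1], yy[1])

# Tiny example: X = Y = Z = {0, 1} with the usual order, and
# phi = psi = the relation 'x <= y'.
X = Y = Z = [0, 1]
leq = lambda a, b: a <= b
composite = compose(Y, leq, leq)
print([(x, z, composite(x, z)) for x, z in product(X, Z)])
print(tensor(leq, leq)((0, 1), (1, 1)))   # True: 0 <= 1 and 1 <= 1
```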

Using these we can build co-design diagrams like this:

image

But we cannot get wires that bend back! For that we need two extra feasibility relations. The first is called the cup:

image

It looks like a wire coming in labelled \(X\), bending around and going back out to the left. In co-design diagrams, labels of wires stand for preorders. The wire coming in stands for a preorder \(X\) as usual. But what about the wire going back out? This stands for \(X^{\text{op}}\): the same set with the opposite concept of \(\le\).

This is a rule we haven't discussed yet. A wire going from left to right, labelled by some preorder, stands for that preorder. But a wire going from right to left, labelled by some preorder, stands for the opposite of that preorder.

But how do we tell if wires are going from left to right or the other way? One way is to draw arrows on them, as I've done just now. Fong and Spivak often use a different notation: a wire going from left to right is labelled \(\le\), while one going the other way is labelled \(\ge\). This will turn out to make a lot of sense.

Anyway, it looks like our cup stands for a feasibility relation from \(X^{\text{op}} \times X \) to... nothing! But what's 'nothing'?

That's another rule I haven't told you. 'Nothing' - the invisible wire - is our symbol for the preorder \(\textbf{1}\). This is the set \( \{0\} \) made into a preorder in the only way possible, with \(0 \le 0\).
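To make these two conventions concrete, here's a tiny Python sketch (the class `Preorder` and the name `unit` are just mine): a finite preorder is a carrier plus a `leq` function, `op` reverses the order, and `unit` is the one-element preorder \(\textbf{1}\).

```python
# A finite preorder: a carrier plus a reflexive, transitive leq function.
class Preorder:
    def __init__(self, elements, leq):
        self.elements = list(elements)
        self.leq = leq                       # leq(a, b) -> bool

    def op(self):
        """The opposite preorder X^op: same elements, order reversed."""
        return Preorder(self.elements, lambda a, b: self.leq(b, a))

# The unit preorder 1: a single element with 0 <= 0.
unit = Preorder([0], lambda a, b: True)

# Example: the chain 0 <= 1 <= 2 and its opposite.
X = Preorder([0, 1, 2], lambda a, b: a <= b)
print(X.leq(0, 2), X.op().leq(0, 2))         # True False
```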

So, our cup stands for some particular feasibility relation

$$ \cup_X \colon X^{\text{op}} \times X \nrightarrow \textbf{1} . $$ Which one? We can guess if we remember that such a feasibility relation is really a monotone function

$$ \cup_X \colon (X^{\text{op}} \times X)^\text{op} \times \textbf{1} \to \mathbf{Bool} .$$ We can simplify this a bit:

Puzzle 213. Show that for any preorder \(A\), the preorder \( A \times \textbf{1}\) is isomorphic to \(A\): in other words, there is a monotone function from \( A \times \textbf{1}\) to \(A\) with a monotone inverse. For short we write \(A \times \textbf{1} \cong A\). (This is one way in which \(\textbf{1}\) acts like 'nothing'.)

Puzzle 214. Show that for any preorders \(A\) and \(B\) we have \( (A \times B)^{\text{op}} \cong A^{\text{op}} \times B^{\text{op}}\).

Puzzle 215. Show that for any preorder \(A\) we have \( (A^{\text{op}})^{\text{op}} \cong A\).

So, we get

$$ (X^{\text{op}} \times X)^\text{op} \times \textbf{1} \cong (X^{\text{op}} \times X)^\text{op} \cong (X^{\text{op}})^\text{op} \times X^\text{op} \cong X \times X^\text{op} $$ Replacing the fancy expression at left by the simpler but isomorphic expression at right, we can reinterpret the cup more simply as a monotone function

$$ \cup_X \colon X \times X^{\text{op}} \to \textbf{Bool} . $$ What could this be? It should remind you of our old friend the hom-functor

$$ \text{hom} \colon X^{\text{op}} \times X \to \textbf{Bool} .$$ And that's what it is - just twisted around a bit! In other words, we define

$$ \cup_X (x,x') = \text{hom}(x',x) . $$ If you forget your old friend the hom-functor, this just means that

$$ \cup_X (x,x') = \begin{cases} \texttt{true} & \mbox{if } x' \le x \\ \texttt{false} & \mbox{otherwise.} \end{cases} $$ Whew! It's simple in the end: it just says that when we send a resource round a bend, what comes out must be less than or equal to what came in. The little inequality symbols on this picture are designed to make that easier to remember:

image
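In code, the cup is literally the hom with its arguments swapped. Here's a minimal sketch in Python on a finite chain (the function names are mine):

```python
# The cup for a preorder X, read as a monotone map X x X^op -> Bool:
# cup_X(x, xp) is true exactly when xp <= x in X.
def cup(leq):
    return lambda x, xp: leq(xp, x)

# Example on the chain 0 <= 1 <= 2:
leq = lambda a, b: a <= b
cup_X = cup(leq)
print(cup_X(2, 1), cup_X(1, 2))   # True False: what comes out must be <= what went in
```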

One more thing: to describe feedback we also need a feasibility relation called the cap, which looks like this:

image

This is some feasibility relation from 'nothing' to \( X \times X^{\text{op}}\), or more precisely

$$ \cap_X \colon \textbf{1} \nrightarrow X \times X^{\text{op}} .$$ Puzzle 216. Rewrite this feasibility relation as a monotone function and simplify it just as we did for the cup. Then, guess what it is!

The cap winds up being just as simple as the cup. We can discuss this and work it out... and next time I'll show you how to use the cap and cup to describe feedback.

By the way, this stuff we're doing was first invented in particle physics. Feynman invented diagrams that describe the interaction between particles... and he realized that an antiparticle could be understood as a particle going 'backwards in time', so he drew little arrows to indicate whether the edges in his diagrams were going forwards or backwards in time. The cup describes the annihilation of a particle-antiparticle pair, and the cap describes the creation of such a pair.

Only much later did people realize that the preorder \(X^{\text{op}}\) is mathematically like the 'antiparticle' of \(X\).

To read other lectures go here.

Comments

  • 1.

    Puzzle 213. Show that for any preorder \(A\), the preorder \( A \times \textbf{1}\) is isomorphic to \(A\): in other words, there is a monotone function from \( A \times \textbf{1}\) to \(A\) with a monotone inverse. For short we write \(A \times \textbf{1} \cong A\). (This is one way in which \(\textbf{1}\) acts like 'nothing'.)

    If we note that \(A \cong A^{\mathbf{1}}\) for all categories (and therefore also all preorders), then under currying,

    \[ A \times \mathbf{1} \cong A^{\mathbf{1}} \]

    and since,

    \[ A^{\mathbf{1}} \cong A \]

    then by transitivity of \(\cong\),

    \[ A \times \mathbf{1} \cong A \]

    \(\blacksquare\).

  • 2.

    Keith: currying \(A \cong A^\mathbf{1}\) gives \(A \times \mathbf{1} \cong A\), not \(A \times \mathbf{1} \cong A^\mathbf{1}\). Remember,

    $$ A \times B \cong C \text{ iff } A \cong C^B $$ So, if we have all these tools at our disposal, \(A \cong A^\mathbf{1}\) instantly implies what we want.

    But more importantly: your proof has just reduced showing \(A \times \mathbf{1} \cong A\) to showing \(A \cong A^{\mathbf{1}}\) and showing that currying works for preorders. These facts take more work to prove than proving \(A \times \mathbf{1} \cong A \) directly!

    I notice that you often seek 'tricks' rather than just straightforwardly working from the definitions. This is sometimes a virtue, but also sometimes a vice. When one is getting started in a new subject, there are always lots of things one needs to prove by just straightforwardly manipulating the definitions. It's important to develop the confidence and 'muscle' to push one's way through these jobs.

    In our situation here, I've defined products of preorders, but not exponentials of preorders, and I haven't proved that

    $$ A \times B \cong C \text{ iff } A \cong C^B . $$ So, a proof that relies on these not-yet-established facts doesn't count as a proof. Part of the game of math is figuring out how to get from what we officially 'have' to what we're trying to prove. We don't have the tools you're trying to use. It's as if I said "okay, let's build our strength and endurance: climb that hill!" and you said "I'll just get into my helicopter..." It's missing the point of the exercise.

    For example: we've discussed exponentials of categories, which I called 'functor categories', but we haven't proved \( A \times B \cong C \text{ iff } A \cong C^B \) for these; nor have we proved that the exponential of categories that happen to be preorders is again a preorder, etc. All this stuff is true, but it's about fifty times faster to prove \( A \times \mathbf{1} \cong A \) directly, than to prove all this stuff.

    In fact, I could have proved it in much less time than it took to write this comment! I'm only writing it because I want to help you develop 'muscle' as well as cleverness.

  • 3.

    By the way, I learned something about 'muscle' from the physicist Hans Bethe. The Manhattan Project was full of brilliant people, but they called Bethe 'the battleship' because - while perhaps not a genius like some others - he was determined and unstoppable:

    It is not possible for us to mirror the extraordinary mental faculties of minds like Bethe and Einstein. But we can very much try to emulate their personal qualities which are more accessible if we persevere. In case of Bethe, one of his most important traits was an uncanny ability to sense his own strengths and limitations, to work on problems for which he "possessed an unfair advantage". Bethe knew he was not a genius like Dirac or Heisenberg. Rather, his particular strength was in applying a dazzling array of mathematical techniques and physical insight to concrete problems for which results could be compared with hard numbers from experiment. He could write down the problem and then go straight for the solution; this earned him the nickname "the battleship".

    Another important thing to learn from Bethe was that just like Fermi, he was willing to do whatever it took to get the solution. If it meant tedious calculations filling reams of paper, he would do it. If it meant borrowing mathematical tricks from another field he would do it.

    (From a review of Nuclear Forces: The Making of the Physicist Hans Bethe, by Silvan Schweber.)

    It's good to try to cultivate cleverness and 'brilliance', but it's also good to build 'muscle', because a combination of these skills is much more powerful than just one.

  • 4.

    Puzzle 213: To show \(A \times 1 \simeq A \), we need to show that there's a monotone function \(f\) with an inverse \(f^{-1}\) between these preorders.

    Define \(f : (a, 1) \mapsto a\). This is a monotone function: whenever \((a,1) \le (b,1)\) we have \(a \le b\), because by the definition of a product preorder, \((a,1) \le (b,1)\) iff \(a \le b\) and \(1 \le 1\).

    The inverse \(f^{-1} : a \mapsto (a,1)\) is also monotone since whenever \(a \le b\) we have \((a,1) \le (b,1)\) as \(1 \le 1\) holds for all pairs in \(A \times 1\). Composing both functions (\(f f^{-1} = 1_A\) and \(f^{-1}f = 1_{A \times 1} \)) proves \(A \times 1 \simeq A \).

    Note \(f\) is the projection map that appears in the categorical definition of a product.
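    Here's a quick computational check of this argument, as a Python sketch on the chain \(\{0,1,2\}\) (the helper names are mine):

    ```python
    from itertools import product

    # Check A x 1 ~= A for the chain A = {0, 1, 2}, with 1 = {'*'}.
    A = [0, 1, 2]
    leq_A = lambda a, b: a <= b
    AxI = [(a, '*') for a in A]
    leq_AxI = lambda p, q: leq_A(p[0], q[0])      # '*' <= '*' always holds

    f = lambda p: p[0]           # projection A x 1 -> A
    g = lambda a: (a, '*')       # its inverse A -> A x 1

    # f and g are mutually inverse...
    assert all(g(f(p)) == p for p in AxI) and all(f(g(a)) == a for a in A)
    # ...and both are monotone.
    assert all(leq_A(f(p), f(q)) for p, q in product(AxI, AxI) if leq_AxI(p, q))
    assert all(leq_AxI(g(a), g(b)) for a, b in product(A, A) if leq_A(a, b))
    print("A x 1 is isomorphic to A in this example")
    ```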

    Puzzle 214: Since the opposite preorder \(X^\mathrm{op}\) has the same objects as \(X\), it holds that there's a bijection from \( (A \times B)^\mathrm{op} \) to \(A^\mathrm{op} \times B^\mathrm{op}\). We just have to prove that both product preorders have the same order relation.

    For \((a,b) \le (a',b')\) in \(A \times B\), we have \((a,b) \ge (a',b')\) in \((A \times B)^\mathrm{op} \). However, we have \(a \ge a'\) in \(A^\mathrm{op}\) if \(a \le a'\) in \(A\) (the same is true for \(B\)), and thus \((a,b) \ge (a',b')\) in \(A^\mathrm{op} \times B^\mathrm{op}\) if \((a,b) \le (a',b')\) in \(A \times B\). So any inequality in \((A \times B)^\mathrm{op} \) is also in \(A^\mathrm{op} \times B^\mathrm{op}\) and vice versa, therefore \((A \times B)^\mathrm{op} \simeq A^\mathrm{op} \times B^\mathrm{op}\).

    Alternatively, here's an attempt to do a more category-theoretic version of the previous paragraph: Consider the diagram \( A^\mathrm{op} \overset{\pi}{\leftarrow} (A \times B)^\mathrm{op} \overset{\varphi}{\rightarrow} B^\mathrm{op} \) and the diagram \( A^\mathrm{op} \overset{\pi'}{\leftarrow} A^\mathrm{op} \times B^\mathrm{op}\overset{\varphi'}{\rightarrow} B^\mathrm{op} \). It's not hard to see that the projections are equal \(\pi =\pi', \varphi =\varphi'\). By the universal property of products, there must be a unique monotone function \(h: (A \times B)^\mathrm{op} \to A^\mathrm{op} \times B^\mathrm{op} \) such that \(\pi = \pi'h \) and \(\varphi = \varphi' h\), but since both projections are equal, \(h\) must be the identity map. Therefore \((A \times B)^\mathrm{op} \simeq A^\mathrm{op} \times B^\mathrm{op}\).

    Puzzle 215: If \(a \le a'\) in \(A\), then \(a \ge a'\) in \(A^\mathrm{op}\). If \(a \ge a'\) in \(A^\mathrm{op}\), then \(a \le a'\) in \(\left(A^\mathrm{op}\right)^\mathrm{op}\). By transitivity, whenever \(a \le a'\) in \(A\), \(a \le a'\) in \(\left(A^\mathrm{op}\right)^\mathrm{op}\) so \(A \simeq \left(A^\mathrm{op}\right)^\mathrm{op}\).

    Puzzle 216: Rewritten as a monotone function, we have \(\cap_X : 1^\mathrm{op} \times X^\mathrm{op} \times X \to \mathbf{Bool}\). By the above theorems (puzzle answers), this function is isomorphic to a function \(X^\mathrm{op} \times X \to \mathbf{Bool}\). But this is just the hom-functor! So we have

    $$ \cap_X (x,x') = \hom(x,x')$$ In other words,

    $$ \cap_X(x,x') =\begin{cases}\mathrm{True} & \mathrm{if} \ x \le x' \\ \mathrm{False} & \mathrm{otherwise} \end{cases}$$
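    As a small sanity check in Python (a sketch on a finite chain; the encoding is mine), one can also verify that hom really is monotone as a map \(X^\mathrm{op} \times X \to \mathbf{Bool}\), which is exactly what the cap has to be under the convention used here:

    ```python
    from itertools import product

    # Monotonicity of hom : X^op x X -> Bool on the chain X = {0, 1, 2}:
    # if x2 <= x1 (that is, x1 <= x2 in X^op) and y1 <= y2, then
    # hom(x1, y1) = true must force hom(x2, y2) = true.
    X = [0, 1, 2]
    hom = lambda x, y: x <= y

    ok = all(
        (not hom(x1, y1)) or hom(x2, y2)
        for x1, x2, y1, y2 in product(X, repeat=4)
        if x2 <= x1 and y1 <= y2
    )
    print("hom is monotone as a map X^op x X -> Bool:", ok)
    ```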

  • 5.

    I learned something about 'muscle' from the physicist Hans Bethe

    TIL that "Bethe" is spelled "Bethe", and not "Beta". I can't remember seeing the name written down before, mostly heard it in audiobooks about Feynman's life.

  • 6.

    Having a crack at these:

    Puzzle 213. Show that for any preorder \(A\), the preorder \( A \times \textbf{1}\) is isomorphic to \(A\): in other words, there is a monotone function from \( A \times \textbf{1}\) to \(A\) with a monotone inverse. For short we write \(A \times \textbf{1} \cong A\). (This is one way in which \(\textbf{1}\) acts like 'nothing'.)

    Puzzle 214. Show that for any preorders \(A\) and \(B\) we have \( (A \times B)^{\text{op}} \cong A^{\text{op}} \times B^{\text{op}}\).

    Puzzle 215. Show that for any preorder \(A\) we have \( (A^{\text{op}})^{\text{op}} \cong A\).

    Let's write \(\bullet\) for the sole element in \(\textbf{1}\). Define \(f : A \times \textbf{1} \to A\) and \(g : A \to A \times \textbf{1}\) by \(f(a, \bullet) = a, g(a) = (a, \bullet)\).

    \(f\) and \(g\) are clearly both monotone and inverse to each other. So \(A \times \textbf{1} \cong A\).

    Now define \(h : (A \times B)^\text{op} \to A^\text{op} \times B^\text{op}\) and \(k : A^\text{op} \times B^\text{op} \to (A \times B)^\text{op} \) by \(h(a, b) = (a, b), k(a, b) = (a, b)\).

    \(h\) and \(k\) are clearly inverse to each other, but are they monotone? Yes, because

    \(\qquad(a, b) \leq_{(A \times B)^\text{op}} (a', b')\)

    \(\qquad\iff (a', b') \leq_{A \times B} (a, b)\)

    \(\qquad\iff a' \leq_A a \text{ and } b' \leq_B b\)

    \(\qquad\iff a \leq_{A^\text{op}} a' \text{ and } b \leq_{B^\text{op}} b'\)

    \(\qquad\iff (a, b) \leq_{A^\text{op} \times B^\text{op}} (a', b')\)

    So \((A \times B)^\text{op} \cong A^\text{op} \times B^\text{op}\).

    Finally, define \(m : (A^\text{op})^\text{op} \to A\) and \(n : A \to (A^\text{op})^\text{op}\) by \(m(a) = a, n(a) = a\).

    \(m\) and \(n\) are clearly inverse to each other, but are they monotone? Yes, because

    \(\qquad a \leq_{(A^\text{op})^\text{op}} a'\)

    \(\qquad\iff a' \leq_{A^\text{op}} a\)

    \(\qquad\iff a \leq_A a'\)

    So \((A^\text{op})^\text{op} \cong A\).
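    And a quick computational sanity check of Puzzles 214 and 215, as a Python sketch on small finite preorders (the encoding is mine):

    ```python
    from itertools import product

    # Preorders as (elements, leq); op flips the order.
    A = ([0, 1], lambda a, b: a <= b)
    B = (['x', 'y'], lambda a, b: a == b)        # discrete preorder

    def op(P):
        elems, leq = P
        return (elems, lambda a, b: leq(b, a))

    def prod_preorder(P, Q):
        (eP, leqP), (eQ, leqQ) = P, Q
        return (list(product(eP, eQ)),
                lambda p, q: leqP(p[0], q[0]) and leqQ(p[1], q[1]))

    def same_order(P, Q):
        # the identity on elements is an isomorphism iff the two orders agree
        eP, leqP = P
        _, leqQ = Q
        return all(leqP(p, q) == leqQ(p, q) for p, q in product(eP, eP))

    # Puzzle 214: (A x B)^op has the same order as A^op x B^op.
    print(same_order(op(prod_preorder(A, B)), prod_preorder(op(A), op(B))))
    # Puzzle 215: (A^op)^op has the same order as A.
    print(same_order(op(op(A)), A))
    ```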

  • 7.

    re the cap question:

    This is some feasibility relation from 'nothing' to \(X^{\text{op}} \times X\), or more precisely

    $$ \cap_X \colon \textbf{1} \nrightarrow X^{\text{op}} \times X .$$
    Puzzle 216. Rewrite this feasibility relation as a monotone function and simplify it just as we did for the cup. Then, guess what it is!

    As a monotone function this should be \(\cap_X \colon \textbf{1}^\text{op} \times (X^\text{op} \times X) \to \textbf{Bool}\)

    Now \(\textbf{1}^\text{op} \cong \textbf{1}\) so the domain is isomorphic to plain old \(X^\text{op} \times X\).

    Which suggests "cap" is just "hom" – or more precisely \(\cap_X(x,x') = \text{hom}(x, x')\).

    I'm thinking we could perhaps use these tricks to "reshuffle" the inputs and outputs of any profunctor, ie any \(\Phi : X \nrightarrow Y\) can be written as a profunctor \(X \times Y^\text{op} \nrightarrow \textbf{1}\), or \(\textbf{1} \nrightarrow X^\text{op} \times Y\), or \(Y^\text{op} \nrightarrow X^\text{op}\), and we get these by judiciously composing with cup and cap...
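    For what it's worth, here's how that reshuffling might look in Python (my own encoding; the three variants just move the same Boolean data between source and target):

    ```python
    # A profunctor Phi : X -|-> Y encoded as a Boolean function phi(x, y);
    # here X = Y = the integers with their usual order.
    phi = lambda x, y: x <= y

    # 'Reshuffled' versions: the same data with a different source and target.
    phi_to_unit   = lambda xy, star: phi(xy[0], xy[1])   # X x Y^op -|-> 1
    phi_from_unit = lambda star, xy: phi(xy[0], xy[1])   # 1 -|-> X^op x Y
    phi_flipped   = lambda y, x: phi(x, y)               # Y^op -|-> X^op

    print(phi(1, 2), phi_to_unit((1, 2), '*'),
          phi_from_unit('*', (1, 2)), phi_flipped(2, 1))  # True four times
    ```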

  • 8.

    I think you could also argue that "cap" is a monotone function by duality with "cup".

  • 9.

    Hey Anindya,

    I'm thinking we could perhaps use these tricks to "reshuffle" the inputs and outputs of any profunctor, ie any \(\Phi : X \nrightarrow Y\) can be written as a profunctor \(X \times Y^\text{op} \nrightarrow \textbf{1}\), or \(\textbf{1} \nrightarrow X^\text{op} \times Y\), or \(Y^\text{op} \nrightarrow X^\text{op}\), and we get these by judiciously composing with cup and cap...

    Do you think we can recover all of the "yanking conditions" of a symmetric compact closed monoidal category?

    $$ A\xrightarrow{\cong} A\otimes I\xrightarrow{A\otimes\eta} A\otimes (A^\ast\otimes A) \xrightarrow{\cong} (A\otimes A^\ast) \otimes A \xrightarrow{\varepsilon\otimes A} I\otimes A\xrightarrow{\cong} A \\ A^\ast\xrightarrow{\cong} I\otimes A^\ast\xrightarrow{\eta\otimes A^\ast}(A^\ast\otimes A)\otimes A^\ast\xrightarrow{\cong} A^\ast\otimes (A\otimes A^\ast)\xrightarrow{A^\ast \otimes\varepsilon} A^\ast\otimes I\xrightarrow{\cong} A^\ast $$ Here "\(\ast\)" would be "\(\mathrm{op}\)", "\(\otimes\)" would be "\(\times\)", and "\(\to\)" would be "\(\nrightarrow\)"...

  • 10.

    I'm afraid I've only got the vaguest understanding of how coherence conditions work in monoidal categories, Matthew. But it seems to me that both those equations amount to a "bendy string" as a string diagram (an Ƨ in the first case and an S in the second). And usually in string diagrams you can "pull things taut", which is what I suppose is going on here.

  • 11.

    Well, we can try to see if the yanking axiom holds.

    The equations we want to test are,

    \[ (Id_X \times \cap_X) \circ (\cup_X \times Id_X) = Id_X = (\cap_X \times Id_X) \circ (Id_X \times \cup_X).
    \]

  • 12.

    Actually, why not work this out.

    Consider this row of profunctors:

    $$ A \cong A \times \textbf{1} \nrightarrow A \times (A^\text{op} \times A) \cong (A \times A^\text{op}) \times A \nrightarrow \textbf{1} \times A \cong A $$ Given any objects \(x, y\) in \(A\) the value of the composite profunctor at \((x, y)\) is

    $$ \bigvee\big(\Phi(x, (z, z', z'')) \otimes \Psi((z, z', z''), y)\big) $$ where we're joining over triples \((z, z', z'')\) in \(A \times A^\text{op} \times A\) and

    $$ \Phi(x, (z, z', z'')) = \text{id}_A(x, z) \otimes \cap_A(z', z'') = \text{hom}_A(x, z) \otimes \text{hom}_A(z', z'') $$ $$ \Psi((z, z', z''), y) = \cup _{A^\mathrm{op}}(z, z') \otimes \text{id}_A(z'', y) = \text{hom} _{A^\mathrm{op}}(z', z) \otimes \text{hom}_A(z'', y) $$ (Note that we're using \(\cup _{A^\text{op}} : A \times A^\text{op} \nrightarrow \textbf{1}\) not \(\cup_A : A^\text{op} \times A \nrightarrow \textbf{1}\) here!)

    So our join is

    $$ \bigvee\big(\text{hom}_A(x, z) \otimes \text{hom}_A(z', z'') \otimes \text{hom} _{A^\text{op}}(z', z) \otimes \text{hom}_A(z'', y)\big) $$ Reorder the middle two terms and switch round \(\text{hom} _{A^\text{op}}\) to get

    $$ \bigvee\big(\text{hom}_A(x, z) \otimes \text{hom}_A(z, z') \otimes \text{hom}_A(z', z'') \otimes \text{hom}_A(z'', y)\big) $$ Now for any triple \((z, z', z'')\) this is \(\leq \text{hom}(x, y)\), so the join must be \(\leq \text{hom}(x, y)\).

    But for the triple \((x, x, x)\) we get the "summand" \(\text{hom}(x, x) \otimes \text{hom}(x, x) \otimes \text{hom}(x, x) \otimes \text{hom}(x, y) \geq I \otimes I \otimes I \otimes \text{hom}(x, y) = \text{hom}(x, y)\)

    So the join must also be \(\geq \text{hom}(x, y)\).

    Hence the composite profunctor equals the identity profunctor. QED.
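    Here's a brute-force Python check of this zig-zag computation on the chain \(\{0,1,2\}\) (the encoding of profunctors as Boolean functions is mine):

    ```python
    from itertools import product

    # Check the zig-zag computation above on the chain A = {0, 1, 2}:
    # Phi = id_A tensor cap_A and Psi = cup_{A^op} tensor id_A, and their
    # composite should equal the identity profunctor hom_A.
    A = [0, 1, 2]
    hom = lambda a, b: a <= b

    def phi(x, t):                      # Phi : A -|-> A x A^op x A
        z, zp, zpp = t
        return hom(x, z) and hom(zp, zpp)

    def psi(t, y):                      # Psi : A x A^op x A -|-> A
        z, zp, zpp = t
        return hom(z, zp) and hom(zpp, y)   # hom_{A^op}(zp, z) = hom_A(z, zp)

    composite = lambda x, y: any(phi(x, t) and psi(t, y) for t in product(A, A, A))

    assert all(composite(x, y) == hom(x, y) for x, y in product(A, A))
    print("zig-zag law verified on this example")
    ```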

  • 13.

    Hey Anindya,

    I'm afraid I've only got the vaguest understanding of how coherence conditions work in monoidal categories, Matthew.

    Hmm... I think I am in the same boat.

    (EDIT: n/m, you've got more muscle than me...)

    Maybe we can start with something easier. I bet we could prove a consequence of the category being symmetric compact closed.

    On page 42 of John Baez and Mike Stay's Rosetta Stone (2009) they define:

    $$ Y \multimap Z := Y^\ast \otimes Z $$ They then write on the next page, approximately:

    $$ \mathrm{Hom}(Y \otimes X, Z) \cong \mathrm{Hom}(X, Y^\ast \otimes Z) $$ (They're using Linear Logic rather than conventional category theory, but hopefully John can come by and correct me if I am making a grave mistake here.)

    Based on this, perhaps we can recover \((Y \times -) \dashv (Y^{\mathrm{op}} \times -)\).

    I just tried to show a relationship like this over in Lecture 66 for posets and preorders. I don't think it should be too hard to show this for profunctors (famous last words)...

  • 14.

    Anindya wrote:

    (Also for some reason \cup_{A^\text{op}} isn't working so I'm writing \cup_A^\text{op} instead – sorry.)

    Try \cup_{A^\mathrm{op}} instead: \(\cup_{A^\mathrm{op}}\)

  • 15.

    I literally tried copying and pasting that code in and it still didn't work! Sticking in a space seems to do the trick tho: \cup _{A^\text{op}} = \(\cup _{A^\text{op}}\)

  • 16.

    There's a general rule here that's worth noting.

    Proposition: Given \(\Phi: X \nrightarrow Y, \Psi: Y \nrightarrow Z, \Omega: X \nrightarrow Z\), suppose \(\forall x \in \text{Ob}(X)\) and \(\forall z \in \text{Ob}(Z)\) we have

    $$ \forall y \in \text{Ob}(Y) \quad \Phi(x, y) \otimes \Psi(y, z) \leq \Omega(x, z) $$ $$ \exists y \in \text{Ob}(Y) \quad \Phi(x, y) \otimes \Psi(y, z) \geq \Omega(x, z) $$ Then \(\Psi\circ\Phi = \Omega\).

    The first condition tells us that the join \(\bigvee\big(\Phi(x, y) \otimes \Psi(y, z)\big) \leq \Omega(x, z)\).

    The second condition tells us that the join \(\bigvee\big(\Phi(x, y) \otimes \Psi(y, z)\big) \geq \Omega(x, z)\).

    (because a join is a least upper bound: the first fact uses 'least', and the second uses 'upper bound'!) QED
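    Over \(\mathbf{Bool}\), where \(\otimes\) is 'and' and the join is 'or', the proposition can be checked mechanically. Here's a Python sketch (all names are mine):

    ```python
    from itertools import product

    def check_proposition(X, Y, Z, phi, psi, omega):
        """Over Bool: verify the two conditions, then confirm that the
        composite profunctor really equals omega."""
        upper = all(not (phi(x, y) and psi(y, z)) or omega(x, z)
                    for x, y, z in product(X, Y, Z))      # forall y: term <= omega
        lower = all(any(phi(x, y) and psi(y, z) for y in Y) or not omega(x, z)
                    for x, z in product(X, Z))            # exists y: term >= omega
        comp = lambda x, z: any(phi(x, y) and psi(y, z) for y in Y)
        return upper and lower and all(comp(x, z) == omega(x, z)
                                       for x, z in product(X, Z))

    # Example: X = Y = Z the chain {0, 1, 2} and phi = psi = omega = hom.
    C = [0, 1, 2]
    hom = lambda a, b: a <= b
    print(check_proposition(C, C, C, hom, hom, hom))      # True
    ```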

  • 17.

    Scott wrote:

    TIL that "Bethe" is spelled "Bethe", and not "Beta". I can't remember seeing the name written down before, mostly heard it in audiobooks about Feynman's life.

    Yes, it's pronounced "Bethe"... and the potential for a pun here led to the famous Alpher-Bethe-Gamow paper in which Gamow and his student Alpher showed that the Big Bang would create hydrogen and helium in the amounts we actually see! Bethe was roped in as a coauthor just for a joke.

    Gamow later wrote:

    The results of these calculations were first announced in a letter to The Physical Review, April 1, 1948. This was signed Alpher, Bethe, and Gamow, and is often referred to as the 'alphabetical article.' It seemed unfair to the Greek alphabet to have the article signed by Alpher and Gamow only, and so the name of Dr. Hans A. Bethe (in absentia) was inserted in preparing the manuscript for print. Dr. Bethe, who received a copy of the manuscript, did not object, and, as a matter of fact, was quite helpful in subsequent discussions. There was, however, a rumor that later, when the alpha, beta, gamma theory went temporarily on the rocks, Dr. Bethe seriously considered changing his name to Zacharias.

    The close fit of the calculated curve and the observed abundances is shown in Fig. 15, which represents the results of later calculations carried out on the electronic computer of the National Bureau of Standards by Ralph Alpher and R. C. Herman (who stubbornly refuses to change his name to Delter.)

  • 18.

    Great answers to all the puzzles, Scott!

    Puzzle 216: Rewritten as a monotone function, we have \(\cap_X : 1^\mathrm{op} \times X^\mathrm{op} \times X \to \mathbf{Bool}\). By the above theorems (puzzle answers), this function is isomorphic to a function \(X^\mathrm{op} \times X \to \mathbf{Bool}\). But this is just the hom-functor! So we have

    $$ \cap_X (x,x') = \hom(x,x')$$ Right!

    It's worth letting newbies know that the last two sentences are not the conclusion of some logical argument, but rather a natural guess. The thought process is

    If you're looking for a monotone function \(\cap_X: X^\mathrm{op} \times X \to \mathbf{Bool}\), the obvious thing to try is the hom-functor \(\text{hom}: X^\mathrm{op} \times X \to \mathbf{Bool}\).

    And indeed, this turns out to work very well, so that's what we'll use!

    In other words,

    $$ \cap_X(x,x') =\begin{cases}\mathrm{true} & \mathrm{if} \ x \le x' \\ \mathrm{false} & \mathrm{otherwise} \end{cases}$$
  • 19.

    Actually, now that I think about it, upper- and down-sets satisfy a zig-zag (and zag-zig?) relation when composed with the constant feasibility relation \(!_x \colon \textbf{1} \nrightarrow X\).

    Not sure how to best prove this, maybe someone else could show this.

  • 20.

    Matthew wrote:

    Do you think we can recover all of the "yanking conditions" of a symmetric compact closed monoidal category?

    $$ A\xrightarrow{\cong} A\otimes I\xrightarrow{A\otimes\eta} A\otimes (A^\ast\otimes A) \xrightarrow{\cong} (A\otimes A^\ast) \otimes A \xrightarrow{\varepsilon\otimes A} I\otimes A\xrightarrow{\cong} A \\ A^\ast\xrightarrow{\cong} I\otimes A^\ast\xrightarrow{\eta\otimes A^\ast}(A^\ast\otimes A)\otimes A^\ast\xrightarrow{\cong} A^\ast\otimes (A\otimes A^\ast)\xrightarrow{A^\ast \otimes\varepsilon} A^\ast\otimes I\xrightarrow{\cong} A^\ast $$ Here "\(\ast\)" would be "\(\mathrm{op}\)", "\(\otimes\)" would be "\(\times\)", and "\(\to\)" would be "\(\nrightarrow\)"...

    Yes! And \( \eta\) is the cap, and \(\varepsilon\) is the cup.

    This will be the subject of my next lecture, and you folks have done most of the work for me!

    I call the yanking identities the zig-zag equations. If we draw a cap like a cap, a cup like a cup, and an identity like a pipe, they say this:

    image
  • 21.

    Actually, now that I think about it, isn't

    $$ \cap_X(x,x') =\begin{cases}\mathrm{true} & \mathrm{if} \ x \le x' \\ \mathrm{false} & \mathrm{otherwise} \end{cases}$$ a bit redundant? \([x \leq x']\) is already the relation that gives \(\mathrm{true}\) when \(x\) is less than or equal to \(x'\), and \(\mathrm{false}\) otherwise. Running a conditional on \([x \leq x']\) to give either \(\mathrm{true}\) or \(\mathrm{false}\) is therefore redundant.

    Or to be blunter,

    \[ \cap_X(x,x') := [x \leq x']. \]

  • 22.

    Actually, let us define an operation, called the bracket, that takes a relation \(R\) and two elements of a set, and gives \(\mathrm{true}\) when \(R\) holds for the elements \(x\) and \(x'\), and \(\mathrm{false}\) otherwise:

    \[ [x R x'] =\begin{cases}\mathrm{true} & \mathrm{if} \ x R x' \\ \mathrm{false} & \mathrm{otherwise} \end{cases} \]

    then,

    \[ \cap_X(x,x') := [x \leq x']. \]

    An interesting fact about brackets is that when we identify \(\mathbf{Bool}\) with \((\varnothing \subseteq \mathbf{1})\), with \(\mathbf{false} \cong \varnothing\) and \(\mathbf{true} \cong \mathbf{1}\), taking the Cartesian product with them allows us to have an object appear only when our relation holds (up to isomorphism),

    \[ [x R x'] \times A \cong \begin{cases} A & \text{when } [x R x'] = \mathrm{true} \\ \varnothing & \text{when } [x R x'] = \mathrm{false} \end{cases} \]

    since the Cartesian product of any \(A\) with \(\varnothing\) is isomorphic to \(\varnothing\), and the Cartesian product of any \(A\) with \(\mathbf{1}\) is isomorphic to \(A\).
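    Here's how the bracket and that product trick might look in Python (an informal sketch; `bracket` and `as_subset` are my names), with true as a one-element set and false as the empty set:

    ```python
    from itertools import product

    def bracket(R, x, xp):
        """[x R x']: true when the relation R holds of (x, x'), false otherwise."""
        return R(x, xp)

    # Identify Bool with the subsets of a one-point set: false ~ {}, true ~ {()}.
    def as_subset(b):
        return [()] if b else []

    # Taking a product with [x R x'] keeps A when the relation holds
    # and collapses everything to the empty set when it doesn't.
    A = ['a1', 'a2']
    leq = lambda x, y: x <= y
    print(list(product(as_subset(bracket(leq, 1, 2)), A)))   # one pair per element of A
    print(list(product(as_subset(bracket(leq, 2, 1)), A)))   # []
    ```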

    In fact, thinking about it,

    \[ (\cup_{x,x} \times id_{x'}) \circ (id_x \times \cap_{x,x'}) \\ = ([x \geq x]\times id_{x'}) \circ (id_x \times [x \leq x'] ) \\ = ([x = x]\times id_{x'}) \circ (id_x \times [x \leq x'] ) \]

    is equivalent to an upper-set, if we are allowed to vary \(x'\). Which is sort of what I wanted to show above.

  • 23.

    I should warn people that I've fixed my convention concerning the cap: now it's

    $$ \cap_X \colon \textbf{1} \nrightarrow X \times X^{\text{op}} $$

    image

    So, the above discussions of the cap now look 'backwards', because they were based on the convention where \(\cap_X \colon \textbf{1} \nrightarrow X^{\text{op}} \times X \). This is nobody's fault but mine! Luckily, it's easy to adjust these arbitrary conventions.

  • 24.

    Keith wrote

    Actually, now that I think about it, isn't

    $$ \cap_X(x,x') =\begin{cases}\mathrm{true} & \mathrm{if} \ x \le x' \\ \mathrm{false} & \mathrm{otherwise} \end{cases}$$ a bit redundant? \([x \leq x']\) is already the relation that gives \(\mathrm{true}\) when \(x\) is less than or equal to \(x'\), and \(\mathrm{false}\) otherwise.

    Yeah, it seems redundant to me. Although, one could argue that really cap is this function on \(1^\mathrm{op} \times \left(X^\mathrm{op} \times X\right)\): $$ \cap_X(1, (x,x')) =\begin{cases}\mathrm{true} & \mathrm{if} \ x \le x' \\ \mathrm{false} & \mathrm{otherwise} \end{cases}$$ where \(1\) is the object of the singleton preorder. But cap is isomorphic to the hom-functor, which is just a preorder relation on \(X\): \(\hom(x,x') = (x\le x')\).

    John wrote

    It's worth letting newbies know that the last two sentences are not the conclusion of some logical argument, but rather a natural guess. The thought process is

    I should probably highlight guesses like that more clearly. Thanks for pointing it out.

  • 25.

    John wrote:

    I should warn people that I've fixed my convention concerning the cap: now it's

    $$ \cap_X \colon \textbf{1} \nrightarrow X \times X^{\text{op}} $$

    So for this cap, we have \( \cap_X (x,x') = \hom(x',x) = \cup_X (x,x')\) ??

    This doesn't seem arbitrary to me for some reason. The new cap bends upwards, so when you combine it with a cup you get a snake, but the old cap bends downwards, which produces a loop when combined with a cup?

    This is pure newb rambling but the snake seems like an identity on a wire whereas the loop is an identity on a box...

  • 26.

    Michael Hong wrote:

    So for this cap, we have \( \cap_X (x,x') = \hom(x',x) = \cup_X (x,x')\) ?

    I think you have \(\cup\) flipped. I think this should be:

    $$ \cap_X(x,x') \cong \hom(x',x) \cong \cup_X(x',x) $$ They aren't equal, but they are congruent because we can ignore the \(\mathbf{1}\) in \(\cap_X \colon \textbf{1} \times (X \times X^{\text{op}}) \to \textbf{Bool}\) and \(\cup_X \colon (X^{\text{op}} \times X)^\text{op} \times \textbf{1} \to \mathbf{Bool}\) thanks to Puzzle 213.

  • 27.

    Matthew

As usual, my wires are probably jumbled up, but I do not see why \(\cap_X(x,x') \cong \hom(x',x) \cong \cup_X(x',x)\). John hasn't changed the definition of the cup:

$$ \cup_X \colon X^{\text{op}} \times X \nrightarrow \textbf{1} $$
$$ \cup_X \colon (X^{\text{op}} \times X)^\text{op} \times \textbf{1} \to \mathbf{Bool} $$
$$ \cup_X \colon X \times X^{\text{op}} \to \mathbf{Bool} $$
$$ \cup_X (x,x') = \text{hom}(x',x) $$

So if we only change the definition of the cap:

$$ \cap_X \colon \textbf{1} \nrightarrow X \times X^{\text{op}} $$
$$ \cap_X \colon \textbf{1}^{\text{op}} \times (X \times X^{\text{op}}) \to \mathbf{Bool} $$
$$ \cap_X \colon X \times X^{\text{op}} \to \mathbf{Bool} $$
$$ \cap_X (x,x') = \text{hom}(x',x) $$

Their ordering is exactly the same?
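Here is a quick brute-force check of this point on a concrete preorder: with these conventions the cup and the cap are given by the same formula \(\hom(x',x)\). This is only an illustrative sketch; the chain \(\{0,1,2,3\}\) and the function names are my own choices, not anything from the book.

```python
X = [0, 1, 2, 3]  # a small chain with the usual order

def hom(a, b):
    """hom(a, b) in Bool: true iff a <= b in the preorder X."""
    return a <= b

def cup(x, xp):
    """cup_X : X x X^op -> Bool, cup_X(x, x') = hom(x', x)."""
    return hom(xp, x)

def cap(x, xp):
    """cap_X : X x X^op -> Bool (ignoring the 1), cap_X(x, x') = hom(x', x)."""
    return hom(xp, x)

# With these conventions the cup and the cap agree on every pair.
assert all(cup(x, xp) == cap(x, xp) for x in X for xp in X)
print("cup and cap agree on all of X x X^op")
```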

  • 28.

    Hey Michael,

    Sorry for the delay.

    I think you are right! I was confused by the notation. Thanks for the clarification.

  • 29.
    edited August 2018

    Michael Hong wrote:

    So for this cap, we have \( \cap_X (x,x') = \hom(x',x) = \cup_X (x,x')\) ?

    Right.

I find the arbitrary conventions in this course endlessly confusing: first because there are \(2^n\) options for some fairly large value of \(n\) (like 10), second because there aren't completely standard choices in every case, and third because, using Fong and Spivak's textbook, I don't feel completely free to make the choices I think are best!

    For example: I don't like string diagrams going left to right; I like them going top to bottom. So when I say the 'cap'

    $$ \cap_X \colon \textbf{1} \nrightarrow X \times X^{\text{op}} $$ is drawn like this:

    image

    I'm already annoyed that the cap isn't being drawn as an actual cap \(\cap\). And I'm also annoyed by the fact that this convention doesn't turn into my usual convention just by rotating the picture 90 degrees... I could explain that, but never mind.

    But I can live with all this.

    On a separate note, I really want the cap to go like this:

    $$ \cap_X \colon \textbf{1} \nrightarrow X \times X^{\text{op}} \qquad (GOOD) $$ and not this:

$$ \cap_X \colon \textbf{1} \nrightarrow X^{\text{op}} \times X \qquad (EVIL) $$ because in the category of finite-dimensional real vector spaces we have a similar thing for any object \(V\):

$$ \cap_V \colon \mathbb{R} \to V \otimes V^\ast $$ where \(V \otimes V^\ast\) is the space of linear transformations of \(V\) (or, crudely speaking, 'square matrices') and \(\cap_V\) sends the number 1 to the identity matrix. It's purely a convention that we think of linear transformations of \(V\) as elements of \(V \otimes V^\ast\) rather than \(V^\ast \otimes V\), but this convention works well if we multiply a vector by a matrix on the _left_, as we usually do.
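As an aside, here is a toy numerical sketch of that linear-algebra analogue: a function sending a scalar to that scalar times the identity matrix, viewed as an element of \(V \otimes V^\ast\), which then acts trivially when multiplying vectors on the left. The dimension and the names are illustrative choices of mine.

```python
def cap_vect(t, dim=3):
    """Send the scalar t to t * (identity matrix), i.e. an element of V (x) V*."""
    return [[t if i == j else 0.0 for j in range(dim)] for i in range(dim)]

def matvec(M, v):
    """Multiply a vector by a matrix on the left, the usual convention."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

v = [2.0, -1.0, 5.0]
identity = cap_vect(1.0)         # the image of the number 1 under the cap
assert matvec(identity, v) == v  # the identity matrix acts trivially
print(identity)
```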

(Here I'm hinting at, but not really explaining, a deep connection between profunctors and linear algebra. We've already seen it before: our formula for composing profunctors looks just like the formula for matrix multiplication! So I want my conventions in these various subjects to match up nicely... and this choice of the cap makes it work.)

    Given that we have

    $$ \cap_X \colon \textbf{1} \nrightarrow X \times X^{\text{op}} $$ it's easy to see that we must have \(\cap_X(x,x') = \texttt{true}\) iff \(x' \le x\), because this relation must remain true if we make \(x \in X\) bigger, and also if we make \(x' \in X^{\text{op}} \) bigger, which is the same as making \(x' \in X\) smaller.
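If it helps, this monotonicity argument can be spot-checked by brute force: on a small chain, \((x,x') \mapsto \hom(x',x)\) is monotone as a map out of \(X \times X^{\text{op}}\), while the other guess \((x,x') \mapsto \hom(x,x')\) is not. This is just an illustrative sketch; the chain and the helper names are mine.

```python
X = [0, 1, 2, 3]
leq = lambda a, b: a <= b             # order on X
leq_op = lambda a, b: b <= a          # order on X^op
bool_leq = lambda p, q: (not p) or q  # false <= true in Bool

def monotone(f):
    # f : X x X^op -> Bool; check (x,x') <= (y,y') implies f(x,x') <= f(y,y')
    return all(bool_leq(f(x, xp), f(y, yp))
               for x in X for xp in X for y in X for yp in X
               if leq(x, y) and leq_op(xp, yp))

print(monotone(lambda x, xp: leq(xp, x)))  # True  : cap's formula is monotone
print(monotone(lambda x, xp: leq(x, xp)))  # False : the other guess fails
```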

    I could talk for hours about these interlocking conventions, since I've spent my life worrying about them.... but this is enough!

  • 30.
    edited August 2018

So the equations for these cups and caps look awfully like one of the ways we saw of defining Galois connections (and then adjoint functors), namely with \(\mathrm{id} \to R \circ L\) and \(L \circ R \to \mathrm{id}\). To make this precise, is the category of endofunctors (on certain categories) monoidal (and compact - if that's the right word for having the cup and cap), with the above natural transformations as the cup and cap, id as the unit, and something as the tensor? Oh wait, the tensor is monadic join, am I talking about monads by accident?

    Edit: no, that's not quite right - I'm confusing a monoid in a category with a monoidal category. Hmmm
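For what it's worth, the Galois-connection laws mentioned above can be spot-checked on a concrete example. Here is a small Python sketch (my own illustrative example, not from the course text) using the adjunction between the inclusion of \(\mathbb{Z}\) into \(\mathbb{R}\) and the floor function, checking the unit law \(x \le R(L(x))\), the counit law \(L(R(y)) \le y\), and the adjunction condition on a few sample points.

```python
import math

L = lambda n: float(n)       # left adjoint: include Z into R
R = lambda y: math.floor(y)  # right adjoint: floor back down to Z

ints = range(-3, 4)
reals = [-2.5, -1.0, 0.3, 1.9, 3.0]

assert all(n <= R(L(n)) for n in ints)   # unit:   id -> R.L
assert all(L(R(y)) <= y for y in reals)  # counit: L.R -> id
assert all((L(n) <= y) == (n <= R(y)) for n in ints for y in reals)  # adjunction
print("Galois connection laws hold on the sample")
```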

  • 31.
    edited August 2018

    No, Reuben Cohn-Gordon, it's not an accident at all.

See for instance the paper Category Theory Using String Diagrams by Daniel Marsden (https://arxiv.org/abs/1401.7220), where indeed units and counits are shown as cups and caps.

  • 32.

@Keith - great paper, esp. p. 19. I'd add that the cup can be seen as creation and the cap as annihilation, which I don't think is in there.

  • 33.
    edited August 2018

Reuben - very good observation! There's a general concept of adjunction (https://ncatlab.org/nlab/show/adjunction#idea) that includes adjoint functors between categories and caps and cups within monoidal categories.

    image

    An adjunction can live in any 2-category. A 2-category is a thing with

    • objects,
    • morphisms, and
    • 2-morphisms

    Adjoint functors live in the 2-category of

    • categories,
    • functors, and
    • natural transformations

    But a monoidal category is the same as a 2-category with one object, so adjunctions can live in there too!

    Click to see the definition of 'adjunction', then scroll down to see the definition in terms of string diagrams, and you'll see the snake equations (also known as zig-zag equations).

    You really need to learn about 2-categories to penetrate the deeper layers of category theory: the thing of all sets is best thought of as a category. While there's a category of (small) categories, the thing of all categories is best thought of as a 2-category.

Luckily, 2-categories lend themselves to nice 2-dimensional pictures. I think the most fun gentle introduction is these videos:

    • The Catsters, string diagrams: https://www.youtube.com/watch?v=USYRDDZ9yEc&list=PL50ABC4792BD0A086

  • 34.
    edited August 2018

    Speaking of which, those Catster videos made me fall in love with string diagrammatics.

  • 35.

I'm very happy to hear that, Keith :-)

  • 36.
    edited August 2018

    The Catsters are still one of my favorite groups even though Cheng became more famous after she went solo.
