

In the puzzles last time we saw something nice: a feasibility relation between preorders is a generalization of something we learned about a long time ago: a monotone function between preorders.

But the really cool part is this: given preorders \(X\) and \(Y\), we can get a feasibility relation \(\Phi : X \nrightarrow Y\) either from a monotone function \(f : X \to Y\) or from a monotone function \(g: Y \to X\). So, feasibility relations put monotone functions going *forwards* from \(X\) to \(Y\) and those going *backwards* from \(Y\) to \(X\) into a common framework!

Even better, we saw that one of our favorite themes, namely *adjoints*, is deeply connected to this idea. Let me state this as a theorem:

**Theorem.** Let \(f : X \to Y\) and \(g : Y \to X\) be monotone functions between the preorders \(X\) and \(Y\). Define the feasibility relations \(\Phi, \Psi : X \nrightarrow Y\) by

$$ \Phi(x,y) \text{ if and only if } f(x) \le y $$

and

$$ \Psi(x,y) \text{ if and only if } x \le g(y). $$

Then \(\Phi = \Psi\) if and only if \(f\) is the left adjoint of \(g\).

**Proof.** We have \(\Phi = \Psi\) iff

$$ \Phi(x,y) \text{ if and only if } \Psi(x,y) $$

for all \(x \in X, y \in Y\), but by our definitions this is true iff

$$ f(x) \le y \text{ if and only if } x \le g(y) $$

which is true iff \(f\) is the left adjoint of \(g\). \( \qquad \blacksquare \)

Ah, if only all proofs were so easy!
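For readers who like to experiment, the theorem can be checked on a toy case in a few lines of Python. The adjoint pair below, \(f(x) = 2x\) and \(g(y) = \lfloor y/2 \rfloor\) on the integers \(0, \dots, 9\), is my own choice of example (a standard Galois connection), not one from the lecture:

```python
# Toy check: Phi(x,y) iff f(x) <= y should agree with Psi(x,y) iff x <= g(y)
# exactly when f is left adjoint to g. Here f(x) = 2x, g(y) = y // 2.
R = range(10)

def f(x):
    return 2 * x      # left adjoint

def g(y):
    return y // 2     # right adjoint

# Encode each feasibility relation as the set of pairs where it is true.
Phi = {(x, y) for x in R for y in R if f(x) <= y}
Psi = {(x, y) for x in R for y in R if x <= g(y)}
print(Phi == Psi)  # True, since 2x <= y iff x <= y // 2 for integers
```

Swapping in a non-adjoint pair, say \(g(y) = y\), makes the two sets differ, which is a nice way to see the "if and only if" in action.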

Now, to make feasibility relations into a truly satisfactory generalization of monotone functions, we should figure out how to *compose* them. Luckily this is easy, because we already know how to compose relations from Lecture 40. So, we should try to prove this:

**Theorem.** Suppose that \(\Phi : X \nrightarrow Y, \Psi : Y \nrightarrow Z\) are feasibility relations between preorders. Then there is a **composite** feasibility relation

$$ \Psi \Phi : X \nrightarrow Z $$

defined as follows:

$$ (\Psi \Phi)(x,z) = \text{true} $$

if and only if for some \(y \in Y\),

$$ \Phi(x,y) = \text{true} \text{ and } \Psi(y,z) = \text{true}. $$
**Puzzle 176.** Prove this! Show that \(\Psi \Phi\) really is a feasibility relation.
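As a concrete sanity check (not a proof!), here is a small Python sketch of the composition rule, encoding a feasibility relation \(\Phi : X \nrightarrow Y\) as the set of pairs \((x,y)\) with \(\Phi(x,y) = \text{true}\). The encoding and the tiny example data are illustrative choices of mine:

```python
# Compose feasibility relations given as sets of "true" pairs.
def compose(phi, psi):
    """(Psi Phi)(x, z) is true iff Phi(x, y) and Psi(y, z) for some y."""
    return {(x, z) for (x, y1) in phi for (y2, z) in psi if y1 == y2}

# Tiny example: one flight from city 0 to city 'a', one from 'a' to city 10.
phi = {(0, 'a')}
psi = {('a', 10)}
print(compose(phi, psi))  # {(0, 10)}
```

If the middle cities don't match up, the composite is empty, matching the "for some \(y \in Y\)" clause above.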

I hope you see how reasonable this form of composition is. Think of it in terms of our pictures from last time:

Here we have three preorders \(X,Y,Z\), which we can think of as cities with one-way roads. We can also take one-way airplane flights from \(X\) to \(Y\) and from \(Y\) to \(Z\): the flights in blue are a way of drawing a feasibility relation \(\Phi: X \nrightarrow Y\), and the flights in red are a way of drawing \(\Psi: Y \nrightarrow Z\).

**Puzzle 177.** Is \( (\Psi\Phi)(N,y) = \text{true}\)?

**Puzzle 178.** Is \( (\Psi\Phi)(W,y) = \text{true}\)?

**Puzzle 179.** Is \(( \Psi\Phi)(E,y) = \text{true}\)?

**Puzzle 180.** Prove that there is a category \(\textbf{Feas}\) whose objects are preorders and whose morphisms are feasibility relations, composed as above.

**Puzzle 181.** Suppose that \(\Phi : X \nrightarrow Y, \Psi : Y \nrightarrow Z\) are feasibility relations between preorders. Prove that

$$ (\Psi\Phi)(x,z) = \bigvee_{y \in Y} \Phi(x,y) \wedge \Psi(y,z) $$

where \(\wedge\) is the meet in the poset \(\textbf{Bool}\), and \(\bigvee\) is the join. How is this formula related to matrix multiplication?
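To see the matrix-multiplication pattern concretely, here is a sketch with toy boolean matrices of my own choosing, writing \(\Phi\) as a \(2 \times 3\) array `Phi[x][y]` and \(\Psi\) as a \(3 \times 2\) array `Psi[y][z]`:

```python
# Toy boolean matrices for Phi : X -|-> Y and Psi : Y -|-> Z,
# with |X| = |Z| = 2 and |Y| = 3.
Phi = [[True,  False, False],
       [False, False, True ]]
Psi = [[False, True ],
       [False, False],
       [True,  True ]]

# (Psi Phi)(x, z) = join over y of (Phi(x, y) meet Psi(y, z)) -- the usual
# matrix-product pattern with (or, and) replacing (+, *).
comp = [[any(Phi[x][y] and Psi[y][z] for y in range(3))
         for z in range(2)]
        for x in range(2)]
print(comp)  # [[False, True], [True, True]]
```

This is exactly matrix multiplication over the rig \((\mathbf{Bool}, \vee, \wedge)\) instead of \((\mathbb{R}, +, \times)\).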

You may remember that a feasibility relation is a \(\mathcal{V}\)-enriched profunctor in the special case where \(\mathcal{V} = \textbf{Bool}\). This formula will be the key to defining composition for more general \(\mathcal{V}\)-enriched profunctors. But we need to talk about that more.

## Comments

I think I get what's going on. \(\mathbf{Feas}\) is something like a category of pre-ordered sets and *relations* between them. In fact, if we restrict \(X\) and \(Y\) to just sets (with no pre-ordering on them), then \(\mathbf{Feas} \mid_{\mathbf{Set}} \cong \mathbf{Rel}\).

Edit: Here \(\mathcal{C} \mid_{\mathcal{D}}\) just means that category \(\mathcal{C}\) is restricted to \(\mathcal{D}\).

Thinking about it a bit, I think I could answer **Puzzle 180** maximally efficiently (lazily) by stating that \(\mathbf{Feas}\) is a category if, and only if, the following diagram in \(\mathbf{Cat}\) commutes,

\[ \begin{matrix} \mathbf{Set} & \overset{i}\rightarrow & \mathbf{Rel}\\ \iota\downarrow & & \downarrow \iota' \\ \mathbf{Poset} & \underset{i'}\rightarrow & \mathbf{Feas} \end{matrix} \]

So, if I understand, the composition in \(\mathbf{Feas}\) should be like that of both \(\mathbf{Rel}\), where we compose on the existence of common objects, and that of \(\mathbf{Poset}\), where composition preserves the monotone structure.


I think you want those functors the other way around.


No, I want \(i : \mathbf{Set} \to \mathbf{Rel}\), since \(\mathbf{Set}\) is included in \(\mathbf{Rel}\), and likewise with \(\iota : \mathbf{Set} \to \mathbf{Poset}\).

We therefore have an inclusion \( \langle i,\iota \rangle : \mathbf{Set} \to \mathbf{Rel} \times \mathbf{Poset}\), which is this diagram,

\[ \begin{matrix} \mathbf{Set} & \overset{i}\rightarrow & \mathbf{Rel}\\ \iota\downarrow & & \\ \mathbf{Poset} & & \end{matrix} \]


Actually, thinking about it, we could give a dual presentation by restricting \(\mathbf{Feas}\).

Restricting the posets in \(\mathbf{Feas}\) to only sets gives the category \(\mathbf{Rel}\) (as I noted above), restricting the feasibility relations in \(\mathbf{Feas}\) to only monotone functions gives the category \(\mathbf{Poset}\), and restricting in both ways gives the category \(\mathbf{Set}\).

If there were a category \(\mathbf{Porel}\) of posets as objects and relations as morphisms, then \(\mathbf{Feas} \cong \mathbf{Porel}\).


Keith, you are saying that your diagram is a diagram in the category of categories, but \(\mathbf{Feas}\) is an object in the diagram, so you are assuming that \(\mathbf{Feas}\) is a category, which is what you are supposed to be proving. That seems unhelpful, to say the least.


Okay, fair enough.

Are there particularly nice ways to make a category from old categories we have?


Assume that \((\Psi \Phi)(x,z)\). We need to show:

1. If \(x' \leq x\) then \((\Psi \Phi)(x',z)\), and
2. If \(z \leq z'\) then \((\Psi \Phi)(x,z')\).

Let's start with (1) and assume \(x' \leq x\). From the hypothesis that \((\Psi \Phi)(x,z)\), there exists a \(y\) such that \(\Phi(x,y) = \text{true}\) and \(\Psi(y,z) = \text{true}\). Since \(\Phi\) is a feasibility relation and \(x' \leq x\), we know \(\Phi(x',y) = \text{true}\). Hence

$$ \Phi(x',y) = \text{true} \text{ and } \Psi(y,z) = \text{true} $$

which means

$$ (\Psi \Phi)(x',z) = \text{true}. $$

The proof of (2) is similar.

> **Puzzle 180.** Prove that there is a category \(\textbf{Feas}\) whose objects are preorders and whose morphisms are feasibility relations, composed as above.

**Puzzle 176** gives a composition rule. We have two things left to show:

1. \((\Phi (\Psi \Upsilon))(a,d) = \text{true}\) if and only if \(((\Phi \Psi) \Upsilon)(a,d) = \text{true}\), and
2. every poset has an identity feasibility relation \(\mathbf{1}_X : X \nrightarrow X\) such that \((\mathbf{1}\Phi)(x,y) = \text{true}\) if and only if \((\Phi\mathbf{1})(x,y) = \text{true}\) if and only if \(\Phi(x,y) = \text{true}\).

(1) follows from the following tautology in first-order logic:

$$ \exists b. \phi(a,b) \wedge (\exists c. \psi(b,c) \wedge \upsilon(c,d)) \Longleftrightarrow \exists c. (\exists b. \phi(a,b) \wedge \psi(b,c)) \wedge \upsilon(c,d) $$

To see (2), all we need is an identity feasibility relation. This appears to be given by:

$$ \mathbf{1}_X(x,y) = \text{true} \Longleftrightarrow x \leq_X y $$
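This proposed identity can be sanity-checked mechanically. Below is a toy example of my own: the two-element preorder \(X = \{0, 1\}\) with \(0 \leq 1\), relations encoded as sets of true pairs, and composition as in Puzzle 176:

```python
# Compose feasibility relations given as sets of "true" pairs
# (first argument applied first, as in (Psi Phi)).
def compose(phi, psi):
    return {(x, z) for (x, y1) in phi for (y2, z) in psi if y1 == y2}

# Proposed identity on X = {0, 1} with 0 <= 1: 1_X(x,y) iff x <= y.
identity = {(x, y) for x in (0, 1) for y in (0, 1) if x <= y}

# A feasibility relation X -|-> X on this preorder.
phi = {(0, 1)}

print(compose(identity, phi) == phi)  # True
print(compose(phi, identity) == phi)  # True
```

Note that the plain equality relation \(\{(0,0), (1,1)\}\) would also work here as a set, but it fails to be a feasibility relation on nontrivial preorders, which is why the \(x \leq_X y\) version is the right guess.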


Actually, if we just let the objects in Cat be structures that are the right "shape" but don't necessarily follow the laws (call this Cat'), then we can prove the laws by finding a functor, faithful and one-to-one on objects, to an object in Cat' that is in fact a category.


re **Puzzle 181**:

If \((\Psi\Phi)(x,z) = \text{true}\), then by definition we must have \(\Phi(x,y) \wedge \Psi(y,z) = \text{true}\) for at least one \(y\in Y\), so the join of all \(\Phi(x,y) \wedge \Psi(y,z)\) must be \(\text{true}\).

If \((\Psi\Phi)(x,z) = \text{false}\), then we must have \(\Phi(x,y) \wedge \Psi(y,z) = \text{false}\) for all \(y\in Y\), so the join of all \(\Phi(x,y) \wedge \Psi(y,z)\) will be \(\text{false}\).

We can think of this join-of-meets as a sum-of-products, and specifically as the product of the "row vector" \((\Psi(y_0,z)\ \Psi(y_1,z)\ \Psi(y_2,z)...)\) with the "column vector" \((\Phi(x,y_0)\ \Phi(x,y_1)\ \Phi(x,y_2)...)^\textbf{T}\). This suggests the following construction:

Given any feasibility relation \(\Phi : X \nrightarrow Y\) we construct a matrix \(\text{M}(\Phi)\) with \(|X|\) columns, \(|Y|\) rows, and the boolean \(\Phi(x, y)\) at position \((y, x)\).

Then what we've just proved amounts to \(\text{M}(\Psi\Phi) = \text{M}(\Psi)\text{M}(\Phi)\).


gotta say I'm puzzled about how the above sits with the identity feasibility relation, since \(\text{M}(\textbf{1}_X)\) typically won't be the identity matrix...


Some of you remarked that the composition of profunctors/Hets is given by Left Kan extensions. Are Left Kan extensions related to Matrix algebras by any chance?


I have been wondering how this might generalize to Lawvere metric spaces.

A \([0,\infty]\)-feasibility relation \(\Lambda : X \nrightarrow Y\) would still be between two posets. Remembering that order is reversed for Lawvere spaces, we would have the following two rules:

$$ \begin{align} x' \leq x & \implies \Lambda(x',y) \leq \Lambda(x,y)\\ y \leq y' & \implies \Lambda(x,y') \leq \Lambda(x,y) \end{align} $$

My intuition here is that boolean feasibility relations are a special case. Specifically \(\text{true} = 0\) and \(\text{false} = \infty\) because of the flipped order.

Now composition would be defined as

$$ (\Lambda\Omega)(x,z) = \inf_{y \in Y} \Lambda(x,y) + \Omega(y,z) $$

We could alternatively write \((\Lambda\Omega)(x,z) = \bigwedge_{y \in Y} \big( \Lambda(x,y) + \Omega(y,z) \big)\). Again \(\wedge\) and \(\vee\) are swapped because order is reversed.

The identity I wrote for boolean feasibility should generalize as an identity for \([0,\infty]\)-feasibility relations:

$$ \mathbf{1}_X(x,y) = \begin{cases} 0 & x\leq y \\ \infty & \text{otherwise} \end{cases} $$
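The proposed min-plus composition is easy to try out numerically. Here is a sketch with toy sets and costs of my own choosing, storing a relation as a dict from \((x,z)\) pairs to costs in \([0,\infty]\):

```python
# Toy data: X = {0}, Y = {'a', 'b'}, Z = {0}.
X, Y, Z = [0], ['a', 'b'], [0]
lam   = {(0, 'a'): 1.0, (0, 'b'): 5.0}   # Lambda : X -|-> Y (costs)
omega = {('a', 0): 2.0, ('b', 0): 0.0}   # Omega  : Y -|-> Z (costs)

# (Lambda Omega)(x, z) = inf over y of Lambda(x, y) + Omega(y, z):
# min-plus "matrix multiplication" over the rig ([0, inf], min, +).
comp = {(x, z): min(lam[(x, y)] + omega[(y, z)] for y in Y)
        for x in X for z in Z}
print(comp)  # {(0, 0): 3.0}
```

The cheapest route goes through `'a'` (cost \(1 + 2 = 3\)) rather than `'b'` (cost \(5 + 0 = 5\)), which matches the shortest-path intuition for Lawvere metric spaces.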


But you're not using usual matrix multiplication, i.e. with the usual arithmetic operations. Is this matrix the identity with respect to the alternative multiplication using meet and join?


Simon wrote:

> But you're not using usual matrix multiplication, i.e. with the usual arithmetic operations. Is this matrix the identity with respect to the alternative multiplication using meet and join?

Whenever we have a semi-ring or *rig* \((R, +, \cdot)\), we can define matrix multiplication and addition. In the special case John is thinking of here:

$$ (\Psi\Phi)(x,z) = \bigvee_{y \in Y} \Phi(x,y) \wedge \Psi(y,z) $$

I suspect it follows that matrix multiplication is associative, even for infinite matrices, because \(\mathbf{Bool}\) is a complete semiring. However, I am not sure.

While I'm busy gearing up to be wrong, I also suspect every monoidal poset gives rise to a complete semiring. This might explain the \([0-\infty]\)-feasibility construction I was wondering about earlier.


Matthew: my comment was directed at Anindya, but I foolishly didn't address it to him as my comment was directly below his. You and Keith managed to sneak in between whilst I was writing my comment!


thinking about this I reckon the issue is not so much that the operations aren't exactly sum/product but that the matrices are a bit weird. any feasibility relation corresponds to a matrix but not vice versa. and I suspect the \(\text{M}(\textbf{1}_X)\)s act as multiplicative identities in the sub-wotsit of special matrices that correspond to feasibility relations.


Matthew wrote:

> I have been wondering how this might generalize to Lawvere metric spaces.

Great! That's coming soon!

This chapter is secretly about profunctors between \(\mathcal{V}\)-enriched categories. When \(\mathcal{V}\) is \(\mathbf{Bool}\), we get feasibility relations between preorders... and those are an easy way to get some intuition for the general case, so I want to talk about those first. The next simplest example is \(\mathcal{V} = \mathbf{Cost}\), and then we get *some sort* of maps going between Lawvere metric spaces. It's fun to guess what those might be.

But before doing the case \(\mathcal{V} = \mathbf{Cost}\), I should probably explain the general theory. At least in the way Fong and Spivak do it, they assume \(\mathcal{V}\) is more than a symmetric monoidal preorder: they take it to be a commutative unital quantale. So I need to explain those. Maybe I should have done it sooner, because they showed up in an earlier chapter! But now is a good time.


Keith wrote:

> \[ \begin{matrix} \mathbf{Set} & \overset{i}\rightarrow & \mathbf{Rel}\\ \iota\downarrow & & \downarrow \iota' \\ \mathbf{Poset} & \underset{i'}\rightarrow & \mathbf{Feas} \end{matrix} \]
>
> So, if I understand, the composition in \(\mathbf{Feas}\) should be like that of both \(\mathbf{Rel}\), where we compose on an existence of common objects, and that of \(\mathbf{Poset}\), where composition preserves the monotone structure.

Your intuition is good here!

\(\mathbf{Feas}\) is a kind of 'blend' of \(\mathbf{Poset}\) and \(\mathbf{Rel}\), which in turn both contain \(\mathbf{Set}\). Once we get \(\mathbf{Feas}\) up and running, we'll see there's a commutative diagram of functors just like you drew, where \(\mathbf{Set}\) is included in \(\mathbf{Poset}\) as the discrete posets, and \(\mathbf{Set}\) is included in \(\mathbf{Rel}\) with functions getting mapped to the total deterministic relations.

Using this intuition to actually *construct* the category \(\mathbf{Feas}\) is not so easy. I instantly think of a 'pushout'. Very often in math you have a diagram like

\[ \begin{matrix} \mathcal{A} & \overset{i}\rightarrow & \mathcal{B}\\ \iota\downarrow & & \downarrow ? \\ \mathcal{C} & \underset{?}\rightarrow & ? \end{matrix} \]

and you want to find the "best" way to fill in the diagram in a commutative way. This is called a pushout. You can think of it as a systematic approach to analogies. When we have a pushout we are filling in the analogy

> ? is to B as C is to A

but also

> ? is to C as B is to A.

So, you could try to create \(\mathbf{Feas}\) using a pushout in \(\mathbf{Cat}\). But since \(\mathbf{Cat}\) is a 2-category, it'd be wiser to use a "2-pushout", which makes the square commute only up to natural isomorphism.

I'm not sure this would actually give \(\mathbf{Feas}\). I am sure, however, that this is about fifty times harder than simply checking that composition as I defined it for \(\mathbf{Feas}\) is associative, and that there exist identity morphisms. I could have done that in less time than it took to write this comment!

> Are there particularly nice ways to make a category from old categories we have?

Tons! Category theory is full of ways of creating new objects from old, like pushouts... and many of them apply to the category \(\mathbf{Cat}\). Category theory pulls itself up by its bootstraps. We are, unfortunately, just getting started: we are very weak, so it pays to just build tools by hand. Later we can use these tools to build other tools, and eventually we'll get tools that can more efficiently build the tools we're building by hand now!


I'm sad that nobody has done Puzzle 180:

> **Puzzle 180.** Prove that there is a category \(\textbf{Feas}\) whose objects are preorders and whose morphisms are feasibility relations, composed as above.

It's really important and it's also not hard. Perhaps the word "prove" is scary to some people? This proof is mostly just a calculation. The main thing to check is that composition is *associative*. So, assume we've got feasibility relations

$$ F: W \nrightarrow X, \; G : X \nrightarrow Y, \; H: Y \nrightarrow Z. $$

We want to compute

$$ (HG) F $$

and

$$ H(G F) $$

and show they are equal. We know the formula for composition. So, *dive in!*

Checking that our would-be category \(\mathbf{Feas}\) has identity morphisms requires a bit more creativity, because we need to *guess* what the identity morphisms *are* before checking that they work. But as usual in category theory, this guessing is easy because there's so little to work with - any answer you can write down that has a chance of making sense is likely to be right. We have a preorder \(X\) and we want to cook up a feasibility relation from \(X\) to itself.

*What can we possibly do???*

Hey John,

Don't be sad! I tried to do it in #7!

As I mentioned, associativity of composition reflects a tautology in first-order logic:

$$ \exists b. \phi(a,b) \wedge (\exists c. \psi(b,c) \wedge \upsilon(c,d)) \Longleftrightarrow \exists c. (\exists b. \phi(a,b) \wedge \psi(b,c)) \wedge \upsilon(c,d) $$ And the identity feasibility relation is:

$$ \mathbf{1}_X(x,y) := x \leq_X y $$


Okay, I'm not sad anymore! Sorry, Matthew - I just overlooked it!

Does anyone have a question about Matthew's answer? All those symbols might make certain people's eyes glaze over; I'll be glad to explain this stuff in words.


Is the identity our old friend \(\mathrm{hom}\)? Or rather, in this case, \(\mathrm{hom}_{\mathbf{Bool}}\)?

Edit: I ask because it has the same typing constraints,

\[ \mathrm{hom} : X^{op} \times X \to \mathbf{Set} \\ \mathrm{hom}_{\mathbf{Bool}} : X^{op} \times X \to \mathbf{Bool}. \]


> All those symbols might make certain people's eyes glaze over; I'll be glad to explain this stuff in words.

I will also try to explain that evil symbol salad if anyone wants me to.


Personally, I've read lots of logic texts, so the above *'symbol salad'* seems completely normal to me. In English, your *'symbol salad'* just says, "there exists some common \(b\) such that the feasibility relation \(\phi\) between \(a\) and \(b\) holds, and there exists some common \(c\) such that the feasibility relation \(\psi\) between \(b\) and \(c\) holds and the feasibility relation \(\upsilon\) between \(c\) and \(d\) holds, if, and only if, there exists some common \(c\) such that there exists some common \(b\) such that the feasibility relation \(\phi\) between \(a\) and \(b\) holds, the feasibility relation \(\psi\) between \(b\) and \(c\) holds, and the feasibility relation \(\upsilon\) between \(c\) and \(d\) holds."

Ugh, that's awful in words.


Thank Diophantus of Alexandria for coming up with the idea of using symbols in place of fully written-out sentences.


I will.


There are ways to be clearer than either

\[ \exists b. \phi(a,b) \wedge (\exists c. \psi(b,c) \wedge \upsilon(c,d)) \Longleftrightarrow \exists c. (\exists b. \phi(a,b) \wedge \psi(b,c)) \wedge \upsilon(c,d) \]

or

> There exists some common \(b\) such that the feasibility relation \(\phi\) between \(a\) and \(b\) holds, and there exists some common \(c\) such that the feasibility relation \(\psi\) between \(b\) and \(c\) holds and the feasibility relation \(\upsilon\) between \(c\) and \(d\) holds, if, and only if, there exists some common \(c\) such that there exists some common \(b\) such that the feasibility relation \(\phi\) between \(a\) and \(b\) holds, the feasibility relation \(\psi\) between \(b\) and \(c\) holds, and the feasibility relation \(\upsilon\) between \(c\) and \(d\) holds.

Of course, Keith was deliberately trying to make the words as tiring as possible!

Look what happens when we replace things like "the feasibility relation \(\phi\) between \(a\) and \(b\) holds" by "\(\phi(a,b)\)" everywhere. I think this is getting better:

> There exists some common \(b\) such that \(\phi(a,b)\) and there exists some common \(c\) such that \(\psi(b,c)\) and \(\upsilon(c,d)\) if, and only if, there exists some common \(c\) such that there exists some common \(b\) such that \(\phi(a,b)\), \(\psi(b,c)\), and \(\upsilon(c,d)\).

But it's still clunky. There are ways to improve the writing style to make it more digestible. For example:

> The following are equivalent:
> * There exists \(b\) such that \(\phi(a,b)\) and for some \(c\) we have \(\psi(b,c)\) and \(\upsilon(c,d)\).
> * There exists \(c\) such that for some \(b\) we have \(\phi(a,b)\), \(\psi(b,c)\), and \(\upsilon(c,d)\).

I spend a few hours a day writing math, thinking about these things. It's frustrating but also fun.

Matthew's original formula is not bad if one is trying to understand exactly what rules of logic we're using to show the equivalence! We're using ['regular logic'](http://www.brics.dk/LS/98/2/BRICS-LS-98-2.pdf), which is the fragment of logic governing \(\exists\) and \(\wedge\). This is the fragment of logic one needs to work with composition of relations. It holds in any topos, but also in even more general settings, like the category of vector spaces.

However, if one is not in the mood for logic, it's often less stressful to read text where the logical connectives are written as words.


Well, if we want to give an unclunky, simplified version of the above rule, then it would be something like,

"Gluing on the common elements \(b\) with the feasibility relations \(\phi(a,b)\) and \(\psi(b,c)\), and then gluing on the common elements \(c\) with \(\upsilon(c,d)\), is logically the same as gluing on the common elements \(c\) with the feasibility relations \(\psi(b,c)\) and \(\upsilon(c,d)\), and then gluing on the common elements \(b\) with the feasibility relation \(\phi(a,b)\)."


That's nice!

The main moral is that _the associative law for composition of feasibility relations follows from a kind of associative law built into logic_.

> **Puzzle 181.** Suppose that \(\Phi : X \nrightarrow Y, \Psi : Y \nrightarrow Z\) are feasibility relations between preorders. Prove that
>
> \[ (\Psi\Phi)(x,z) = \bigvee_{y \in Y} \Phi(x,y) \wedge \Psi(y,z) \]
>
> where \(\wedge\) is the meet in the poset \(\textbf{Bool}\), and \(\bigvee\) is the join. How is this formula related to matrix multiplication?

\(\mathbf{Bool}\) is isomorphic to [Boolean algebra](https://en.wikipedia.org/wiki/Boolean_algebra). We can relabel \(\mathbf{Bool}\) as \(\{0,1\}\) where \(\text{True} = 1\) and \(\text{False} = 0\). The meet of \(\mathbf{Bool}\) is the same as multiplication \(\cdot\), with \(0 \cdot 0 = 0\), \(1 \cdot 0 = 0\), \(0 \cdot 1 = 0\), and \(1 \cdot 1 = 1\). Logically, \(xy\) asks whether both \(x\) and \(y\) are true (i.e., the operation AND). The join of \(\mathbf{Bool}\) becomes the same as addition: \(0 + 0 = 0\), \(1 + 0 = 1\), \(0 + 1 = 1\), and \(1 + 1 = 1\). Logically, \(x + y\) asks whether either or both of \(x\) and \(y\) are true (i.e., the operation OR).

Let \( A_{yx} = \Phi(x,y)\), \( B_{zy} = \Psi(y,z) \), and \(C_{zx} = (\Psi\Phi)(x,z) \) where \(A, B\), and \(C\) are matrices whose entries are either 0 or 1. Thus the formula

$$ (\Psi\Phi)(x,z) = \bigvee_{y \in Y} \Phi(x,y) \wedge \Psi(y,z) $$

is isomorphic to matrix multiplication in Boolean algebra:

$$ C_{zx} = \sum_y B_{zy} A_{yx} $$

Logically, this formula states that \(C_{zx}\) is true if, for at least one \(y\), both \(\Phi(x,y)\) and \(\Psi(y,z)\) are true.
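Scott's dictionary between the composition formula and Boolean matrix multiplication can be checked mechanically. Here is a small Python sketch (the toy relations are made up purely for illustration):

```python
# Toy example (hypothetical data): composing feasibility relations is
# matrix multiplication over {0, 1} with OR as addition and AND as
# multiplication, following Scott's conventions A[y][x] = Phi(x, y),
# B[z][y] = Psi(y, z), C[z][x] = (Psi Phi)(x, z).

X = ['x1', 'x2']
Y = ['y1', 'y2', 'y3']
Z = ['z1']

Phi = {('x1', 'y1'): 1, ('x1', 'y2'): 0, ('x1', 'y3'): 1,
       ('x2', 'y1'): 0, ('x2', 'y2'): 1, ('x2', 'y3'): 0}
Psi = {('y1', 'z1'): 0, ('y2', 'z1'): 1, ('y3', 'z1'): 1}

def compose(Phi, Psi):
    # (Psi Phi)(x, z) = OR over y of (Phi(x, y) AND Psi(y, z))
    return {(x, z): int(any(Phi[x, y] and Psi[y, z] for y in Y))
            for x in X for z in Z}

# The same computation as C = B A over the rig ({0,1}, OR, AND):
A = [[Phi[x, y] for x in X] for y in Y]   # |Y| rows, |X| columns
B = [[Psi[y, z] for y in Y] for z in Z]   # |Z| rows, |Y| columns
C = [[int(any(B[i][k] and A[k][j] for k in range(len(Y))))
      for j in range(len(X))]
     for i in range(len(Z))]

PsiPhi = compose(Phi, Psi)
assert C == [[PsiPhi[x, z] for x in X] for z in Z]
```

The `any(...)` call is the join \(\bigvee_y\), and the `and` inside it is the meet \(\wedge\), so the assertion is exactly \(C_{zx} = \sum_y B_{zy} A_{yx}\) computed in \(\{0,1\}\).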


Scott - great! You've explained how my formula is really just matrix multiplication.

\(\textbf{Bool}\) has a 'multiplication' \(\wedge\) which distributes over the 'addition' \(\vee\), so it's a lot like a [ring](https://en.wikipedia.org/wiki/Ring_(mathematics)#Definition); the only thing missing is subtraction, so we call it a **rig**, which is a joking shorthand for 'ri**n**g without **n**egatives'. We can multiply matrices whose entries lie in any rig, and if we do this for \(\textbf{Bool}\) we get exactly the formula for composing relations. Composing feasibility relations is a special case.

My one persnickety objection is that people don't call \(\textbf{Bool}\) 'Boolean algebra'. The phrase 'Boolean algebra' means a couple of things:

1. There's a _branch of math_ called [Boolean algebra](https://en.wikipedia.org/wiki/Boolean_algebra), which that Wikipedia article discusses: "Boolean algebra is the branch of algebra in which the values of the variables are the truth values true and false, usually denoted 1 and 0 respectively."

2. Mathematicians have defined a _special kind of rig_ called a [Boolean algebra](https://en.wikipedia.org/wiki/Boolean_algebra_(structure)), and \(\textbf{Bool}\) is _an example of_ a Boolean algebra, in fact the most important and fundamental example.

As a mathematician I hear my friends talk about 2 all the time but never 1.

I said a Boolean algebra is a special kind of rig; the link I gave lists the rules, but here's a high-powered description. It's a poset that has finite meets and joins, where \(\vee\) and \(\wedge\) distribute over each other, and such that every element \(x\) has an element \(\neg x\) with \(x \vee \neg x = \top\) and \(x \wedge \neg x = \bot\).

The poset of subsets of any set \(X\), called the **power set** \(P(X)\), is the classic example of a Boolean algebra. If \(X\) has just one element we get \(\mathbf{Bool}\).

> My one persnickety objection is that people don't call \(\textbf{Bool}\) 'Boolean algebra'.

That's fair... I remember studying something like a combination of 1) and 2) in a discrete math class a few years ago. But the description of \(\mathbf{Bool}\) as a rig seems straightforward. That would be the minimum structure needed for the matrix multiplication formula to make sense (or at least be well-defined).


People call \(\text{Bool}\) _a_ Boolean algebra, just not 'Boolean algebra'. We'll see matrix multiplication for many rigs in our course pretty soon.
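Since matrix multiplication over other rigs is coming up soon, here is a hedged preview in Python (the helper names are mine, not from the course): generic matrix multiplication parametrized by a rig, instantiated both for \(\mathbf{Bool} = (\{0,1\}, \vee, \wedge)\) and for the \(\mathbf{Cost}\)-style rig \(([0,\infty], \min, +)\):

```python
# Sketch (hypothetical helper names): matrix multiplication whose entries
# lie in an arbitrary rig, given by its addition, multiplication, and
# additive unit ("zero").

import math

def mat_mul(B, A, plus, times, zero):
    """C = B A computed in the rig (plus, times, zero)."""
    rows, inner, cols = len(B), len(A), len(A[0])
    C = [[zero] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            acc = zero
            for k in range(inner):
                acc = plus(acc, times(B[i][k], A[k][j]))
            C[i][j] = acc
    return C

def bool_mul(B, A):
    # Bool rig: OR is addition, AND is multiplication, 0 is the zero.
    return mat_mul(B, A, lambda a, b: a or b, lambda a, b: a and b, 0)

def cost_mul(B, A):
    # Cost-style rig: min is "addition", + is "multiplication",
    # infinity is the zero.
    return mat_mul(B, A, min, lambda a, b: a + b, math.inf)

# Composing relations = Bool matrix multiplication:
assert bool_mul([[0, 1, 1]], [[1, 0], [0, 1], [1, 0]]) == [[1, 1]]

# Cheapest two-leg trip = Cost matrix multiplication:
assert cost_mul([[2, math.inf]], [[1, 5], [0, math.inf]]) == [[3, 7]]
```

The Bool instance reproduces the composition formula above; the Cost instance is the same algorithm with \(\bigvee \mapsto \min\) and \(\wedge \mapsto +\).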

**Puzzle 181.** This is not a solution but an appendix. I drew out how Boolean matrix multiplication works based on the formula:

$$ (\Psi\Phi)(x,z) = \bigvee_{y \in Y} \Phi(x,y) \wedge \Psi(y,z) $$

It's color coded (red for the meet, blue for the join) so that you can see the relationship between all three versions (1. formula, 2. feasibility relations, 3. matrices).

![boolean matrix multiplication](http://aether.co.kr/images/matrix_multiplication.svg)

You can easily see how this would work with \(\mathbf{Cost}\). The Boolean version gives the backbone for interaction with the bottom element, and the rest is just number multiplication.


A small point, but if we define the matrices to have the Boolean \(\Phi(x, y)\) at position row \(y\) and column \(x\) (i.e. the transpose of the above), then our matrix multiplication rule matches up nicely with composition, i.e. we get \(\text{M}(\Psi\Phi) = \text{M}(\Psi)\text{M}(\Phi)\).


Anindya – That would be very satisfying, although we would have to throw away the usual definition of adjacency matrices. But that's just convention, so here is the other version you suggested. I'm not sure whether adjacency matrices were set up the way they are for some other reason, but it's nice having them match up with composition conventions. Thanks!

![boolean matrix multiplication 2](http://aether.co.kr/images/matrix_multiplication_2.svg)


The thing is that if we had a linear map \(\phi : \mathbb{R}^3 \rightarrow \mathbb{R}^2\), we would represent it as a matrix with 2 rows and 3 columns.

so by analogy the feasibility relation \(\Phi : X \nrightarrow Y\) should be represented by a matrix with \(|Y|\) rows and \(|X|\) columns.

(fwiw I always imagine matrices as taking inputs through the "top side" and spitting out results through the "left side" – but I'm sure everyone has their own way of visualising this...)


[Bartosz Milewski](https://bartoszmilewski.com/2017/07/07/profunctor-optics-the-categorical-view/) gives composition of profunctors as a coend,

\[ (\Phi\circ\Psi)(x,z) = \int^{y\text{ in }Y}\Phi(x,y)\otimes\Psi(y,z), \]

but for \(\mathbf{Bool}\)-profunctors we have

\[ (\Phi\circ\Psi)(x,z) = \exists_{y\text{ in }Y}.\Phi(x,y) \land \Psi(y,z), \]

and above we've also seen that this composition of profunctors can be viewed as matrix multiplication,

\[ (\Phi\circ\Psi)(x,z) = \sum_{y\text{ in }Y}\Phi_x^y \times \Psi_y^z . \]

Are coends (\(\int^c\)), sums (\(\sum_c\)), and existential quantifiers (\(\exists_c\)) all related in some way? (I suspect they're all examples of folds of some sort.)
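One low-tech way to see part of the kinship Keith is asking about, at least in the \(\mathbf{Bool}\) case: the existential quantifier is literally an "or over \(y\)", i.e. a saturating sum. A small Python check (toy relations and names invented for illustration):

```python
# Toy check (hypothetical data): over Bool, the composite computed with an
# existential quantifier and the composite computed with a 0/1 "matrix" sum
# agree, because 'exists y. P(y)' is just 'OR over y of P(y)'.

Y = range(3)
Phi = lambda x, y: (x + y) % 2 == 0   # a made-up Bool-profunctor X -|-> Y
Psi = lambda y, z: y <= z             # a made-up Bool-profunctor Y -|-> Z

def via_exists(x, z):
    # (Phi ; Psi)(x, z) = exists y. Phi(x, y) and Psi(y, z)
    return any(Phi(x, y) and Psi(y, z) for y in Y)

def via_sum(x, z):
    # The same composite as a saturating sum over y in 0/1 arithmetic.
    return sum(int(Phi(x, y)) * int(Psi(y, z)) for y in Y) > 0

assert all(via_exists(x, z) == via_sum(x, z)
           for x in range(3) for z in range(3))
```

This only illustrates the \(\exists\)-versus-\(\sum\) leg of the triangle; relating both to coends is the more substantial story John takes up below.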


Well, sums and existential quantifiers are almost the same thing in type theory.

My guess is that sums and existentials _are_ coends of some sort, because then the formula given would be valid in all cases.


Anindya – Yeah, that makes sense. Thanks.


Keith wrote:

> [Bartosz Milewski](https://bartoszmilewski.com/2017/07/07/profunctor-optics-the-categorical-view/) gives composition of profunctors as a coend,
>
> \[ (\Phi\circ\Psi)(x,z) = \int^{y\text{ in }Y}\Phi(x,y)\otimes\Psi(y,z), \]
>
> but for \(\mathbf{Bool}\)-profunctors we have
>
> \[ (\Phi\circ\Psi)(x,z) = \exists_{y\text{ in }Y}.\Phi(x,y) \land \Psi(y,z). \]

Yes, good point! As you suggest, these are two special cases of something a bit more general.

Bartosz is composing profunctors between _categories_, and above you are composing profunctors between _\(\mathbf{Bool}\)-enriched categories_. In Lecture 63 we will see how to compose profunctors between _categories enriched over commutative quantales_. These include \(\mathbf{Bool}\)-enriched categories as a special case... but not categories!

So, to get a generalization that covers the two examples you just gave, we'd need to go a bit higher, and consider profunctors between _categories enriched over closed cocomplete symmetric monoidal categories_. We may not reach that level of generality in this course. But that's the right level of generality for fully understanding enriched profunctors. Sums, existential quantifiers, colimits and coends are all special cases of 'weighted colimits'. Basically they're all ways to think about _addition_.

This seems like a fun introduction to these issues, but learning this stuff takes real work:

* Fosco Loregian, [This is the (co)end, my only (co)friend](https://arxiv.org/abs/1501.02503).

Nice reference to the Doors song!
