
## Comments

John #97 wrote:

> What is the smallest DAG that is not the Hasse diagram of a poset?

I think I'm confused about why not every DAG, when interpreted as a Hasse diagram, describes some poset.

Let \((V, E)\) be a DAG and construct \(\leq\) as follows: for all \(u, v \in V\), \(u \leq v\) iff there exists a path from \(u\) to \(v\). (Note that this includes the trivial path from \(v\) to \(v\).) Now I say that \((V, \leq)\) is a poset.

**Reflexivity:** satisfied by construction.

**Transitivity:** If \(u \leq v\) and \(v \leq w\), then there exists a path from \(u\) to \(v\) and a path from \(v\) to \(w\). But you can concatenate those paths to get a path from \(u\) to \(w\), hence \(u \leq w\) as well.

**Antisymmetry:** Suppose that \(u \leq v\) and \(v \leq u\). This means that there exists a path from \(u\) to \(v\) and a path from \(v\) to \(u\). But if \(u\) and \(v\) are distinct then we have a cycle, which contradicts the assumption that \((V, E)\) is a DAG. Hence we must have \(u = v\).

Am I messing something up?

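The path-reachability construction above can be checked mechanically. Here is a minimal Python sketch (the function names are my own, not from the thread) that builds the reachability relation of a DAG and verifies the three poset axioms:

```python
from itertools import product

def reachability_preorder(vertices, edges):
    """u <= v iff there is a (possibly trivial) path from u to v."""
    reach = {(v, v) for v in vertices}       # trivial paths give reflexivity
    reach |= set(edges)
    changed = True
    while changed:                           # iterate to a transitive closure
        changed = False
        for (a, b), (c, d) in product(list(reach), list(reach)):
            if b == c and (a, d) not in reach:
                reach.add((a, d))
                changed = True
    return reach

def is_poset(vertices, leq):
    reflexive = all((v, v) in leq for v in vertices)
    transitive = all((a, d) in leq
                     for (a, b) in leq for (c, d) in leq if b == c)
    antisymmetric = all(a == b for (a, b) in leq if (b, a) in leq)
    return reflexive and transitive and antisymmetric

V = {'a', 'b', 'c'}
E = {('a', 'b'), ('b', 'c'), ('a', 'c')}     # a DAG on three vertices
print(is_poset(V, reachability_preorder(V, E)))  # -> True
```

As the proof in the comment argues, acyclicity is what makes antisymmetry hold; feeding the same checker a graph with a cycle makes `is_poset` return `False`.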

**Puzzle 7.** Let \(\leq\) be a reflexive and transitive relation. The pairs in our relation are the morphisms of a category, and the objects are the elements of the underlying set. Composition is defined as $$(x, y) \circ (y, z) = (x, z)$$ for all \((x,y)\) and \((y,z) \in \leq\).

This is well defined: if \((x,y), (y,z) \in \leq\), then \((x,z) \in \leq\) because our relation is transitive. We also have, for every object \(x\), that \((x,x) \in \leq\) because our relation is reflexive. These satisfy, for example, $$(x,x) \circ (x, y) = (x, y) = (x, y) \circ (y, y)$$ That is, we have left and right identities for every object.

There is at most one morphism between any two objects, because the pairs are elements of a set, and a set contains each of its elements at most once. This completes the proof.

Also, question on style. Does this read as being too verbose?


> What are all the total orders that are also equivalence relations?

They are *trivial* in the sense that \(\forall x, y.\ x \leq y\).

**Proof.** Fix \(x\) and \(y\) as arbitrary elements of the preorder. We want to show \(x \leq y\).

In a total order, either \(x \leq y\) or \(y \leq x\). In the first case we are done. In the case that \(y \leq x\), we have \(x \leq y\) by symmetry. \(\Box\)


Patrick #101, John #97: I could be confused myself, but I think that e.g. the DAG on 3 vertices a,b,c with arrows a->b, b->c and a->c isn't a Hasse diagram because the arrow a->c is transitively redundant with the other arrows, and a Hasse diagram would not display it.


Unfortunately the book does not make it clear that \(a \rightarrow c\) in Jerry's example is actually prohibited in a Hasse diagram (according to Wikipedia and Wolfram MathWorld, at least), not just redundant. I'm grateful to this example for spurring me to get clarity on the actual rules.

[It appears that the process of removing "Hasse-redundant" edges in a DAG is called *transitive reduction*. The opposite (adding an arrow \(x\rightarrow y\) whenever there is a path from \(x\) to \(y\)) is called *transitive closure*.]

Ah, yep, of course! I was assuming the transitive reduction without stating it. Thanks Jerry and Dan.


@JohnBaez #100. Thanks for pointing out the problem with my 2nd example. Yep, now that I think about it, you are correct. The borrower example does not have the transitivity property. And yes, I agree with you also that for each concept it would help to have counterexamples.


Jerry Wedekind wrote:

> e.g. the DAG on 3 vertices a,b,c with arrows a->b, b->c and a->c isn't a Hasse diagram because the arrow a->c is transitively redundant with the other arrows, and a Hasse diagram would not display it.

Right! And I believe this answers my question in #97: namely, I think this graph with 3 vertices and 3 arrows is the *smallest* DAG that's not the Hasse diagram of a poset.

By the way, I wish you hadn't deleted your earlier guess that every DAG was the Hasse diagram of a poset, because it was an interesting guess. It's often better to be wrong in an interesting way than right in a boring way!

For example, here is a nice puzzle your original guess leads to:

**Puzzle.** Which DAGs are Hasse diagrams of posets? How can we take a DAG and tell if it's the Hasse diagram of a poset?

(If every DAG were the Hasse diagram of a poset, we'd be well on our way toward proving that the category of DAGs was equivalent to the category of posets. But they're not: there's a functor from the category of posets to the category of DAGs, sending each poset to its Hasse diagram, and my puzzle is starting to study this functor. We can do this without knowing anything about functors.)

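One way to approach the puzzle: a DAG is the Hasse diagram of a poset exactly when no edge is implied by a longer path, i.e. the DAG equals its own transitive reduction. Here is a small Python sketch of that test (the function names are mine, and it assumes the input is already acyclic), applied to Jerry's three-vertex example:

```python
def transitive_closure(vertices, edges):
    """reach[u] = set of vertices reachable from u by a nonempty path."""
    reach = {v: set() for v in vertices}
    for u, v in edges:
        reach[u].add(v)
    changed = True
    while changed:
        changed = False
        for u in vertices:
            for v in list(reach[u]):
                new = reach[v] - reach[u]
                if new:
                    reach[u] |= new
                    changed = True
    return reach

def is_hasse_diagram(vertices, edges):
    """An acyclic graph is a Hasse diagram iff no edge u -> v is also
    realized by a longer path u -> w -> ... -> v."""
    reach = transitive_closure(vertices, edges)
    for u, v in edges:
        if any(v in reach[w] for w in reach[u] if w != v):
            return False
    return True

V = {'a', 'b', 'c'}
print(is_hasse_diagram(V, {('a', 'b'), ('b', 'c')}))              # -> True
print(is_hasse_diagram(V, {('a', 'b'), ('b', 'c'), ('a', 'c')}))  # -> False
```

The second call is the smallest counterexample from the discussion: the edge a->c is redundant with the path a->b->c.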

Alex Kreizberg solved Puzzle 7 in comment #102. Nice!

> Also, question on style. Does this read as being too verbose?

That judgement really varies from reader to reader. We've got over 150 people here, in theory. Some are expert mathematicians and can skip enormous numbers of steps; others want to see detailed proofs; others are probably such beginners that they get lost in the details of a proof.

I'll just add this "high-level, conceptual" argument. Once we've defined composition and identities we need to check 3 laws to be sure we have a category: associativity and the right and left unit laws. These laws are all equations between two morphisms from some object \(x\) to some object \(x\). In a preorder there's at most one morphism from \(x\) to \(x\). Therefore these equations must be true!

What's nice about this argument is that we don't even need to look at the laws in detail!

Here's why preorders are so much easier than other categories: in a preorder, whenever we have to check that two morphisms from some object \(x\) to some object \(y\) are equal, we're instantly done! In the jargon of category theory, "all diagrams commute".


Puzzle 4: Derivability of propositions is a preorder.

It is only a total ordering when the logic in question is complete.


In a complete logic, every proposition is derivable from every other proposition?


@Joseph, 'completeness' means that everything true (in some semantics) is provable (in some proof system). (It's a property of a pair consisting of a semantics and a proof system.) The proof systems in which everything is derivable from everything are the inconsistent proof systems!

In particular, I don't think @Justin's second line is right. For derivability to be a total order means that if \(p,q\) are propositions with \(p \implies q\) and \(q \implies p\) derivable then \(p = q\). And equality of propositions is normally meant as syntactic identity, i.e. they literally consist of the same sequence of symbols when written down. I can't imagine any proof system with this property.


You're correct. The relation of derivability is total, but it is not a total ordering, because it would not be antisymmetric. I confused the two in my comment.

If I'm reading correctly, this is also known as a total preorder? https://en.wikipedia.org/wiki/Weak_ordering#Strict_weak_orderings


Joseph Moeller says "Given a preorder, there is a general construction which turns it into a partial order." and later adds "the difference between partial orders and preorders is that partial orders demand equality where preorders only demand equivalence. Equality is sometimes called "evil" in category theory."

If I understand, the general construction for turning a preorder into a partial order is to consider partitions generated by the equivalence classes of the preorder. Is that correct?

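The construction being asked about — quotienting a preorder by the equivalence classes where \(x \leq y\) and \(y \leq x\) — can be sketched in a few lines of Python (the function name and the three-element example are my own):

```python
def poset_from_preorder(elements, leq):
    """Quotient a preorder by x ~ y iff x <= y and y <= x,
    yielding a partial order on the equivalence classes."""
    # group each element into its equivalence class
    classes = {}
    for x in elements:
        classes[x] = frozenset(y for y in elements
                               if (x, y) in leq and (y, x) in leq)
    # the induced order on classes: [x] <= [y] iff x <= y
    class_leq = {(classes[x], classes[y]) for (x, y) in leq}
    return set(classes.values()), class_leq

# a preorder on {p, q, r} with a 2-cycle p <= q <= p, and both below r
elements = {'p', 'q', 'r'}
leq = {('p','p'), ('q','q'), ('r','r'),
       ('p','q'), ('q','p'), ('p','r'), ('q','r')}
objs, order = poset_from_preorder(elements, leq)
print(len(objs))  # -> 2: the class {p, q} and the class {r}
```

The cycle between \(p\) and \(q\) collapses to a single point, and the induced order on the classes is antisymmetric, so the quotient is a genuine poset.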

Puzzle 4

Some observations before giving examples:

1. Reflexivity just means all relations pointing to self get negated.

2. Antisymmetry means a couple of things which essentially mean the same thing:

A. All cycles get negated.

B. Binary relation is one sided or points in one direction i.e. an arrow

C. All arrows must point from start to end.

3. Totality is not required therefore :

A. Multiple starts and ends are possible i.e. superpositioning is welcome

B. Or in other words there are no restrictions on number of relations per object but not between pairs

C. Graphs of posets need not be complete

4. Transitivity requires :

A. The relation must be composable so the relation must be able to pass down into multiple generations. i.e. inclusions

The process involved in creating a general order:

1. choosing

2. measuring or labeling

3. comparing

4. sorting

Real World Examples:

1. The obvious example of any measurement using the relation "is less than" i.e. counting, weighing, etc

2. Any timelines with superposition using the relation "occurs before" i.e. Planning the execution of a project with multiple scenarios

3. Any classification structuring that uses the relation "is a part of" i.e. taxonomy of life (Life,Domain,Kingdom, ...)

Mathematical Examples :

I am not a mathematician so I don't know any other mathematical objects and relations other than the ones already mentioned. So I guess I'll give a counterexample I thought might have been a partial order.

Stern-Brocot tree: I thought this was a partial order since it's a binary tree that points in one direction, but it disobeys transitivity. I think you can make it into a partial order if you add \((\frac01,\frac02,\frac03,...)\), \((\frac11,\frac22,\frac33,...)\), \((\frac10,\frac20,\frac30,...)\) and add more lines in.

Questions:

1. Must posets be connected? Are sums of posets posets?

2. Is total order the most restrictive type of order? Why can't we add more requirements?

3. There are different combinations of these 5 properties (the 4 above + symmetry) that we haven't talked about. Are they valid mathematical objects? For example, is there an order with reflexivity + totality only?

4. On Wikipedia, it says a poset can have at most one relation per pair. Why is this?


I apologise @Justin, I also mixed up 'total' with 'not pre-'. Nevertheless, in a standard proof system (such as for propositional logic) derivability is not even a total preorder. (Yes, this sounds like the right term.) Pick two different atomic propositions \(p,q\) - note that these are specific symbols in the logic's syntax, not standing for some arbitrary formulas. Then neither \(p \implies q\) nor \(q \implies p\) is derivable. (By the completeness theorem, a formula is derivable iff it is true for every instantiation of the atomic propositions.)

@Matthew Yes, this is exactly right.


In comment #101, Matthew Doty addressed this question:

> What are all the total orders that are also equivalence relations?

And proved that for these orders: \(\forall x, y.\ x \leq y\).

That's correct, but a stronger conclusion follows: the only total orders which are equivalence relations have either zero or one elements.

For a contradiction, suppose there were two distinct elements \(x, y\) in such an order. Then by trichotomy, either \(x \le y\) or \(y \le x\) -- but not both. Suppose it was \(x \le y\). Then, since it's an equivalence relation, we have that \(y \le x\) -- but this contradicts the trichotomy.

The point is that any total order which is symmetric cannot have multiple elements. Total orders arrange the elements in a "line," which precludes symmetry in all but these trivial cases.


Michael Hong wrote:

> 1. Must posets be connected?

Let S be a set, which is ordered only by the identity relation -- i.e., all that we have is \(x \le x\), for all \(x\). That's a poset. And it's as disconnected as it gets: there are no paths from one element to another.

> Are sums of posets posets?

Depends upon how you define "sum."

- The disjoint union of two posets is a poset.
- The Cartesian product of two posets is a poset.

The union of two posets may not be a poset, if the sets overlap, and cycles get created in the graph, when taking the union of the edges.

For example, suppose that S has two elements \(x\) and \(y\). Suppose that in one partial ordering on S, we have \(x \le y\), and in another, we have \(y \le x\). Then in the union of these orderings, we have a cycle between \(x\) and \(y\) -- which shows that the union is not a poset.

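The two-element example can be verified directly. A quick Python check (the names are mine) that each order is antisymmetric on its own but their union is not:

```python
def is_antisymmetric(leq):
    """Antisymmetry: x <= y and y <= x force x == y."""
    return all(a == b for (a, b) in leq if (b, a) in leq)

# two partial orders on the same two-element set {x, y}
order1 = {('x', 'x'), ('y', 'y'), ('x', 'y')}   # x <= y
order2 = {('x', 'x'), ('y', 'y'), ('y', 'x')}   # y <= x

print(is_antisymmetric(order1))           # -> True
print(is_antisymmetric(order2))           # -> True
print(is_antisymmetric(order1 | order2))  # -> False: a cycle x <= y <= x
```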

@MikeHong

1. Posets need not be connected. The discrete poset, for example, is not connected.

2. It depends on what you mean by "most restrictive". Certainly everything is connected; if we are talking about posets [partial orders] that is all you can do, but if you mean preorders [which Spivak, Fong do] you can double the number of arrows by adding in all of the opposites.

3. Yes, most combinations of these properties have names.

4. "a poset can only have at most one relation per pair". {16311#Comment_16305}

Michael Hong wrote:

> 4. In wikipedia, it says a poset can only have at most one relation per pair. Why is this?

Two relations per pair would mean both \(x \le y\) and \(y \le x\). By antisymmetry, then \(x = y\). Then the "pair" is just \(x\), and there's only one relation here, namely \(x = x\).


**Puzzle 7.** Why does any set with a reflexive and transitive relation \(\le\) yield a category with at most one morphism from any object \(x\) to any object \(y\)? That is: why are reflexivity and transitivity *enough*? To answer this, let's go through the construction of the category, to see where reflexivity and transitivity are used, and why they suffice.

Let S be a set with a reflexive and transitive relation. Define a category whose objects are the elements of S, and which will have one "designated" morphism from \(x\) to \(y\) whenever \(x \le y\). If there is a designated morphism from \(x\) to \(y\), and another from \(y\) to \(z\), that means \(x \le y\) and \(y \le z\), and so, using *transitivity*, we know that \(x \le z\), which enables us to define the composition of the two morphisms as the designated morphism from \(x\) to \(z\).

Using *reflexivity*, we have that \(x \le x\), and so by our construction there will always be a designated morphism from \(x\) to \(x\). From the definition of composition just given, it follows that the designated morphism from \(x\) to \(x\) satisfies the requirements for being an identity morphism.

Now we just need to ensure that all the requirements for a category are satisfied by this construction.

First, we should make sure that this system of morphisms, as defined, is closed under composition. This boils down to showing that the composition of any path of designated morphisms is still a designated morphism. That can easily be proved by induction on the length of the path.

Since the designated morphisms form a closed system, there is nothing besides them, and we drop the qualification of being "designated," and just refer to them as the morphisms of the category.

Second, we need to show that the composition law, as defined, is associative. But it is easy to see that for \(x \rightarrow y \rightarrow z \rightarrow w\), no matter how we parenthesize the compositions, we will end up with the designated morphism from \(x\) to \(w\).

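This construction is concrete enough to test exhaustively on a small example. Here is a Python sketch (my own naming) that takes the morphisms to be the pairs \((x, y)\) with \(x \le y\) on a three-element set, and checks closure under composition, the unit laws, and associativity:

```python
# objects: the elements of a small set; morphisms: the pairs (x, y) with x <= y
elements = {1, 2, 3}
leq = {(x, y) for x in elements for y in elements if x <= y}

def compose(f, g):
    """Compose f: x -> y with g: y -> z into the designated morphism x -> z."""
    (x, y1), (y2, z) = f, g
    assert y1 == y2, "morphisms are not composable"
    return (x, z)

def identity(x):
    return (x, x)

# closure under composition: composites of designated morphisms are designated
assert all(compose(f, g) in leq for f in leq for g in leq if f[1] == g[0])

# left and right unit laws
assert all(compose(identity(x), (x, y)) == (x, y) == compose((x, y), identity(y))
           for (x, y) in leq)

# associativity, for every composable triple of morphisms
assert all(compose(compose(f, g), h) == compose(f, compose(g, h))
           for f in leq for g in leq for h in leq
           if f[1] == g[0] and g[1] == h[0])

print("category axioms hold")
```

Note that the unit and associativity checks succeed for the reason given in the thread: both sides of each equation are morphisms between the same pair of objects, and there is at most one such morphism.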

Michael Hong wrote:

> Are sums of posets posets?

David Tanzer wrote:

> The union of two posets may not be a poset, if the sets overlap, and cycles get created in the graph, when taking the union of the edges.

There can be other problems too when the sets overlap: you could violate transitivity in the union by having \(x\leq y\) in one poset and \(y\leq z\) in another, but be missing \(x\leq z\) in both. So not only is the union of two posets not necessarily a poset, but the union of two preorders is not necessarily a preorder.


What is the problem with creating cycles? Is it not true that...

- For posets, cycles imply the elements are equal.
- For preorders, cycles imply elements are equivalent but may still be distinct.

I guess the question is really what would constitute a sum of posets? The types of sums you describe are not posets, but if we take a join as described in the lecture then a poset is formed.


Fredrick Eisele wrote:

> What is the problem with creating cycles? Is it not true that?... For posets cycles imply the elements are equal. For preorders cycles imply elements are equivalent but may still be distinct.

Yes, nontrivial cycles (cycles with not all elements equal) cannot exist in posets, but are perfectly fine in preorders. This creates a problem when trying to take the union of two poset structures on overlapping sets; for example, consider the set \(\{x,y\}\) with Poset Structure 1: \(\{x\leq x, x\leq y, y\leq y\}\) and Poset Structure 2: \(\{x\leq x, y\leq x, y\leq y\}\). Each of these is perfectly fine by itself, but when we take their union, we get the relation \(\{x\leq x, x\leq y, y\leq x, y\leq y\}\), which is not a poset structure because \(x\leq y\leq x\) but \(x\) and \(y\) are not equal.


@Owen-Biesel What prevents x and y from being equal?


Question I just thought of: Given preorders \(X\) and \(Y\), define a new one \(X \oplus Y\) to be the transitive closure of \(X \cup Y\), i.e. the 'smallest' preorder that contains both \(X\) and \(Y\). What properties does this satisfy? (Also, does it already appear in the quite long literature on order theory?) Same question for posets, where the definition of \(X \oplus Y\) also involves taking the relevant quotient to collapse cycles.
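For concreteness, here is one way \(X \oplus Y\) could be computed for finite preorders presented as relations (my own sketch and encoding, not anything from the thread):

```python
# My own sketch: a finite preorder as (elements, relation), with the
# relation a set of (a, b) pairs meaning a <= b.

def transitive_closure(elements, rel):
    """Reflexive-transitive closure: the smallest preorder on
    `elements` containing `rel`."""
    closure = set(rel) | {(a, a) for a in elements}
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                # If a <= b and b <= d, add a <= d.
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# X = {a, b} with a <= b, and Y = {b, c} with b <= c, overlapping at b.
X = {('a', 'a'), ('a', 'b'), ('b', 'b')}
Y = {('b', 'b'), ('b', 'c'), ('c', 'c')}
X_oplus_Y = transitive_closure({'a', 'b', 'c'}, X | Y)
```

Note how \(a \leq c\) appears in \(X \oplus Y\) even though it is in neither input relation.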


Jules Hedges just wrote:

> Question I just thought of: Given preorders \(X\) and \(Y\), define a new one \(X \oplus Y\) to be the transitive closure of \(X \cup Y\), i.e. the 'smallest' preorder that contains both \(X\) and \(Y\).

If you're using \(X\) and \(Y\) as abbreviations for preorders \((X,\le_X)\) and \((Y, \le_Y)\) (which is wise) and using \(X \cup Y\) to mean the _disjoint_ union of \(X\) and \(Y\) (which is unwise - I would call it \(X + Y\)), then this thing you're talking about is the coproduct of \((X,\le_X)\) and \((Y, \le_Y)\) in the category of preorders.

If you're taking \(X\) and \(Y\) to be subsets of some larger set that you didn't mention (which is unwise), each equipped with their own separate \(\le\) relation, and taking \(X \cup Y\) to be their union inside that set, things get a lot more complicated - interesting, but complicated.

(A true category theorist would never write \(X \cup Y\) without having mentioned beforehand that \(X\) and \(Y\) are subsets of some set \(S\), so I'm struggling to interpret your question. I can even think of another interpretation, but I'll refrain from explaining it.)


David Tanzer wrote:

> Depends upon how you define "sum."
> The disjoint union of two posets is a poset.
> The Cartesian product of two posets is a poset.
> The union of two posets may not be a poset, if the sets overlap, and cycles get created in the graph, when taking the union of the edges.

Thanks for the clarification. I had disjoint union in mind, but your answer gave me a better perspective into not only this question but also my second question.

> Two relations per pair would mean both \(x \le y\) and \(y \le x\). By antisymmetry, then \(x = y\). Then the "pair" is just \(x\), and there's only one relation here, namely \(x = x\).

What if we added a weighting property where you can have double arrows in the same direction, \(x \le \le y\)? Does this somehow disobey any of the properties of a poset?


[Fredrick Eisele](https://forum.azimuthproject.org/discussion/comment/16304/#Comment_16304) wrote:

> 3. Yes, most combinations of these properties have names.

Sorry, should have done a quick search on the matter...

https://en.wikipedia.org/wiki/List_of_order_structures_in_mathematics

- **Cyclic orders**, orderings in which triples of elements are either clockwise or counterclockwise
- **Lattices**, partial orders in which each pair of elements has a greatest lower bound and a least upper bound. Many different types of lattice have been studied; see map of lattices for a list.
- **Partially ordered sets (or posets)**, orderings in which some pairs are comparable and others might not be
- **Preorders**, a generalization of partial orders allowing ties (represented as equivalences and distinct from incomparabilities)
- **Semiorders**, partial orders determined by comparison of numerical values, in which values that are too close to each other are incomparable; a subfamily of partial orders with certain restrictions
- **Total orders**, orderings that specify, for every two distinct elements, which one is less than the other
- **Weak orders**, generalizations of total orders allowing ties (represented either as equivalences or, in strict weak orders, as transitive incomparabilities)
- **Well-orders**, total orders in which every non-empty subset has a least element
- **Well-quasi-orderings**, a class of preorders generalizing the well-orders

Looking at this list, it seems transitivity is the non-trivial bare minimum property required for an order, which gives just a preorder. This is by definition of what an order is.

What if the totality property stands by itself? This would lead to directed connected graphs, but the relations don't necessarily imply an ordering, except of course in the special cases where transitivity also holds.

Sorry for the rambling, but I guess I am having a hard time realizing the scope of consequences each property has on the structure, because none of it is spelled out for you. Thanks for your insight and reply!


@JohnBaez I mean \(\cup\) as in ordinary union of ordinary sets, which might or might not have some elements in common. See posts #122 and #124 by @OwenBiesel. (You could say that they are both subsets of a set theoretic universe, but that's serious overkill.) Perhaps it's bad style for a category theorist, but it's entirely mainstream terminology.


@JulesHedges - yes, it's entirely mainstream and also bad style for a category theorist. A category theorist would point out that in ZFC everything is a set, including the number \(\pi\) or the function \(+ : \mathbb{R} \times \mathbb{R} \to \mathbb{R}\), so one can legitimately talk about the set \( \pi \cup + \), yet mathematicians have the good sense to never do this. Category theory enforces this particular kind of good sense more stringently.

But never mind; I'm getting distracted by my desire to spread the category-theoretic way of thinking - back to your actual question!

You have two sets \(X,Y \) with preorders \(\le_X, \le_Y\) on them, and you want to study the smallest preorder on their union \(S = X \cup Y\) that contains both \(\le_X\) and \(\le_Y\). That's a fine thing to do. If I wanted to study this process in detail, I'd break it up into two easier and well-studied steps:

1. The set of all preorders on a given set \(S\) is a poset under inclusion. This lets us talk about "the smallest preorder on \(S\) such that blah-di-blah is true." I believe this exists no matter what the condition is. I.e., I believe this poset has arbitrary least upper bounds. There are nice ways to think about this, and theorems about it.

2. Given \(X \subseteq S\) and a preorder \(\le_X\) on \(X\), since \(\le_X\) is a binary relation on \(X\) we have $$\le_X \subseteq X \times X.$$ We also have $$X \times X \subseteq S \times S,$$ so \(\le_X\) is also a binary relation on \(S\) and indeed a preorder. So, preorders on \(X \subseteq S\) give preorders on \(S\). More generally, given any function \(f : X \to S\) and a preorder on \(X\), we can "push it forward along \(f\)" and get a preorder on \(S\). There are nice ways to think about this too, and theorems about it.

Thus, given sets \(X,Y \subseteq S \) with preorders \(\le_X, \le_Y\) on them, we first push these preorders forward via 2 to get two preorders on \(S\), and then take their least upper bound as in 1.

This is a long-winded way of analyzing your idea, but the point is that there's really quite a lot to say about steps 1 and 2, depending on what one happens to be curious about.

If I replaced the word "preorder" by "poset" everywhere in what I just said, it would all remain true.
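The two-step recipe above can be sketched for finite sets (my own encoding; `pushforward` and `smallest_preorder` are hypothetical names, not notation from the thread):

```python
# My own sketch of the two steps; relations are sets of (a, b) pairs
# meaning a <= b.

def pushforward(f, rel):
    """Step 2: push a relation on X forward along a function f : X -> S."""
    return {(f(a), f(b)) for (a, b) in rel}

def smallest_preorder(S, pairs):
    """Step 1: the least upper bound, i.e. the smallest preorder on S
    containing `pairs` (its reflexive-transitive closure)."""
    rel = set(pairs) | {(s, s) for s in S}
    changed = True
    while changed:
        changed = False
        for (a, b) in list(rel):
            for (c, d) in list(rel):
                if b == c and (a, d) not in rel:
                    rel.add((a, d))
                    changed = True
    return rel

# X = {1, 2} with 1 <= 2 and Y = {2, 3} with 2 <= 3, inside S = {1, 2, 3};
# both inclusions into S are just identity functions here.
S = {1, 2, 3}
le = smallest_preorder(S, pushforward(lambda x: x, {(1, 2)})
                          | pushforward(lambda x: x, {(2, 3)}))
```

The resulting preorder on \(S\) contains \(1 \le 3\), which neither input relation had.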


> **Puzzle 4.** List some interesting and important examples of posets that haven't already been listed in other comments in this thread.

In the paper, 'Count nouns, mass nouns and their transformations: a unified category-theoretic semantics' by Marie La Palme Reyes et al, mass nouns are modeled as a poset. In fact, they use a generalization of posets: sup-lattices.


Keith - neat! So you mean for example that you can have "more mud"?

But sup-lattices aren't a generalization of posets; they are a specially nice kind of poset. A poset is a **sup-lattice** if for any subset there exists a smallest element that is greater than or equal to everything in that subset. For example, for two elements \(x,y\) there exists a smallest element \(z\) such that \(x \le z\) and \( y \le z\). This element is called the **sup** or **supremum** or **join** of \(x\) and \(y\), and it's denoted \(x \vee y\).

The poset of all partitions of a set is a nice example of a sup-lattice, and we're already starting to do some exercises that involve this sup-lattice:

- [Exercise 2 - Chapter 1](https://forum.azimuthproject.org/discussion/1872/exercise-2-chapter-1)
- [Exercise 3 - Chapter 1](https://forum.azimuthproject.org/discussion/1876/exercise-3-chapter-1)
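In the partition lattice, the join of two partitions is the finest partition that coarsens both. A small sketch of how it could be computed (my own, assuming a partition is given as a list of blocks):

```python
# My own sketch: a partition is a list of blocks (sets). The join is
# obtained by repeatedly merging blocks that overlap.

def join(partition1, partition2):
    blocks = [set(b) for b in partition1] + [set(b) for b in partition2]
    merged = []
    for block in blocks:
        # Merge this block with every already-merged block it touches.
        overlapping = [m for m in merged if m & block]
        for m in overlapping:
            merged.remove(m)
            block = block | m
        merged.append(block)
    return {frozenset(b) for b in merged}
```

For example, the join of \(\{\{1,2\},\{3\},\{4\}\}\) and \(\{\{1\},\{2,3\},\{4\}\}\) glues 1, 2, 3 into one block while leaving 4 alone.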


Actually, do "generalization" and "specialization" form posets, with 'is more general than (internally to a context)' and 'is more specialized than (again, internal to a context)' as the posets' \(\leq\) operator?


Keith Peterson wrote:

> Actually, does "generalization" and "specialization" form posets with 'is more general than (internally to a context)' and 'more specialized than (again internal to a context) the posets' \(\leq\) operator?

Yes! What a great question! Category theory has a tendency to eat itself and this is an example. In our discussion of Lecture 3, [Scott Finnie has already started drawing](https://forum.azimuthproject.org/discussion/comment/16340/#Comment_16340) a poset of concepts specializing the concept of "preorder":

<img src = "https://raw.githubusercontent.com/sfinnie/CategoryTheoryCourseNotes/master/posets/img/orderVennDiagram.gif">

Category theory has a lot to say about this kind of thing. One point is that it's better to treat specializations as forming a _category_ rather than a mere poset, because sometimes concept A can specialize concept B in more than one way. Using posets, we can answer "does concept A specialize concept B?", but not discuss the various ways that A might specialize B.

Question: I just noticed that the book sometimes (as in Remark 1.16) uses a double arrowhead (like "->>", except the horizontal line touches both angle vertices) when describing a function, e.g. "f: A ->> B" to denote a function f from set A to set B.

From context, it looks like this is an optional notation to indicate a surjective function (I say optional because the surjective function in Exercise 1.15 is described using a single arrowhead). Is that right?


Regarding the question as to the simplest DAG that's not a Hasse Diagram (#96), what about the graph with \(V = \{x, y\}, E = \{\{x, y\}, \{x, y\}\}\)? i.e. two vertices and two edges going in the same direction connecting them?

It's a valid DAG but the two edges should be disallowed in the Hasse Diagram by transitive reduction and the reflexivity property, or just by the fact that there is only one "kind of thing" (\(\leq\)) from which to create edges.


I found this statement in section 1.1.1 confusing:

> There is no operation on the booleans true, false that will always follow suit with the joining of systems: our observation is inherently lossy.

[David Chudzicki's answer above](https://forum.azimuthproject.org/discussion/comment/15946/#Comment_15946) helped:

> I guess the sense in which we lose information is just this: If you apply the function before joining, you "lose information" about what you would have gotten by joining first and then applying the function.

I understand David's articulation, and the general idea that the join operation is, essentially, "information additive". It's the specific wording in the book that I'm still unsure about: "there is no operation on the booleans true, false that will always follow suit with the joining of systems".

The book text uses join as an example of an operation. So does the text mean:

1. There is no alternative operation to "join" such that \(\Phi(A) = false\) and \(\Phi(B) = false\) and \(\Phi(A\ op\ B) = true\), where "op" is the alternative operation?

That seems trivially false: the Boolean \(xor\) operator would give rise to the same result. I'd be grateful if someone could clarify the wording - thanks in advance.


Scott wrote:

> I found this statement in section 1.1.1 confusing:
>
> > There is no operation on the booleans {true, false} that will always follow suit with the joining of systems: our observation is inherently lossy.
>
> [David Chudzicki's answer above](https://forum.azimuthproject.org/discussion/comment/15946/#Comment_15946) helped.

Yes. I find the talk of 'systems' a bit fuzzy: Fong and Spivak are really talking about 'partitions' and the 'join' operation \(\vee\) on partitions. Read explanations and pictures of these things here:

- [Exercise 2 - Chapter 1](https://forum.azimuthproject.org/discussion/1872/exercise-2-chapter-1)
- [Exercise 3 - Chapter 1](https://forum.azimuthproject.org/discussion/1876/exercise-3-chapter-1)

For any partition \(P\) of some set and any two elements of that set, we can ask if those two elements are in the same 'block' (defined in my comments) of the partition. We get an answer \(F(P)\) that is true or false.

I think what David and the book are saying is this. There is no binary operation \(\ast\) on the booleans {true, false} such that

$$ F(P \vee P') = F(P) \ast F(P') $$

for all partitions \(P, P'\).
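Since there are only \(2^4 = 16\) binary operations on the booleans, this claim can be brute-forced on a small example. A sketch (my own encoding, not from the book or thread): here the observation \(F\) asks whether 1 and 3 are in the same block of a partition of \(\{1,2,3\}\).

```python
# My own brute-force check: no binary boolean operation * satisfies
# F(P v P') = F(P) * F(P') for all partitions P, P' of {1, 2, 3}.
from itertools import product

def join(p1, p2):
    """Join of two partitions (lists of blocks): merge overlapping blocks."""
    blocks = [set(b) for b in p1] + [set(b) for b in p2]
    merged = []
    for block in blocks:
        overlapping = [m for m in merged if m & block]
        for m in overlapping:
            merged.remove(m)
            block = block | m
        merged.append(block)
    return merged

def F(p):
    """The observation: are 1 and 3 in the same block?"""
    return any({1, 3} <= set(b) for b in p)

# All five partitions of {1, 2, 3}:
partitions = [
    [{1}, {2}, {3}],
    [{1, 2}, {3}],
    [{1, 3}, {2}],
    [{2, 3}, {1}],
    [{1, 2, 3}],
]

# All 16 binary operations on {False, True}, as lookup tables:
tables = [dict(zip(list(product([False, True], repeat=2)), vals))
          for vals in product([False, True], repeat=4)]

def works(table):
    return all(F(join(p, q)) == table[(F(p), F(q))]
               for p, q in product(partitions, repeat=2))
```

Checking `works` over all 16 tables finds no operation that succeeds: for instance, \(P = \{\{1,2\},\{3\}\}\) and \(P' = \{\{2,3\},\{1\}\}\) both observe false yet their join observes true, while two copies of the discrete partition both observe false and join to false, so \(F \ast F\) would have to be both true and false.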


Michael Hong wrote:

> Sorry for the rambling but I guess I am having a hard time realizing the scope of consequences each property has on the structure because none of it is spelled out for you.

There's a lot of math, so it takes a long time to learn. There's a lot to say about each of the different kinds of orders you mentioned - very fun stuff, too! But going through all this stuff could take weeks, so I'll regretfully avoid it.

In this class we're focusing on preorders and partial orders because they make a really good warmup for category theory. The first chapter of _Seven Sketches_ is a good intro to preorders and partial orders, and the second chapter is about 'symmetric monoidal' preorders and partial orders. Everything in the first chapter has a generalization to categories; everything in the second chapter has a generalization to symmetric monoidal categories. That's why Fong and Spivak are telling us this stuff.

**Puzzle 4.** There is a preorder for the binary partitions of ordered sets. These sets can be categorized based on the four properties we have been examining. There are generative effects on those partitions based on which properties are active.

![Figure](https://docs.google.com/drawings/d/e/2PACX-1vSDw8Gkciwpii_BSqRqShpkatYooyE6JSrp93BWsWJx5Hoo3k4nmRLTN1RAYGYGtpnL4N_dAv01hM4w/pub?w=628&h=948)

I have marked where the preorder and poset lie in that Hasse diagram.

Frederick wrote:

> I suppose there is a preorder over the partitions of sets based on the four properties we have been examining.

I forget what four properties you're talking about. The most common way to put a preorder on partitions of a set is the one I described in [my first comment on Exercise 3](https://forum.azimuthproject.org/discussion/comment/16368/#Comment_16368). This preorder is discussed extensively in _[Seven Sketches](https://arxiv.org/pdf/1803.05316)_, starting around Example 1.37. We'll probably talk about it a lot.

Joel wrote:

> Regarding the question as to the simplest DAG that's not a Hasse diagram (#96), what about the graph with \(V=\{x,y\}, E=\{\{x,y\},\{x,y\}\}\)? i.e. two vertices and two edges going in the same direction connecting them?

Interesting observation!

That's a nice thing that's not a Hasse diagram, but it's also not a DAG. A **[DAG](https://en.wikipedia.org/wiki/Directed_acyclic_graph)** is a directed acyclic graph, which Wikipedia defines as a "finite directed graph with no directed cycles", but then you need to look up [directed graph](https://en.wikipedia.org/wiki/Directed_graph) to see exactly what that is! In the usual definition, a directed graph does not allow more than one edge going from a vertex \(x\) to a vertex \(y\). So, your counterexample is not a DAG.

More precisely, a **directed graph** is a pair \((V,E)\) where \(V\) is a set whose elements are called **vertices** or **nodes**, and \(E\) is a subset of \(V \times V\): that is, a set of ordered pairs of vertices, called **directed edges**.

There are many different kinds of graphs, and if you want to allow more than one directed edge from a vertex \(x\) to a vertex \(y\), you want a **directed multigraph** or **[quiver](https://en.wikipedia.org/wiki/Quiver_(mathematics))**. These are the most important kind of graph in category theory, since every category is, for starters, a quiver: its objects are the vertices and its morphisms are the edges.

Basically, whenever someone says "graph" you should ask them exactly what kind they mean. But everyone agrees on the definition of DAG.


Introducing category theory by means of order theory is a natural way to ease into the basics - functor, co/product, opposite, adjunction - but this has always seemed a bit odd to me. Mathematically, if not categorically, an element of a relation is quite distinct from a genuine morphism: a function actually _does_ something - \(f: X \to Y\) evaluates terms of type \(X\) - while \(X \leq Y\) is simply an existential assertion.

In general, I want to better understand the distinction between operation and relation - it seems to be a very primitive form of algebraic-geometric duality. Intuitively there is a simple correspondence - associativity and transitivity, unitality and reflexivity - but it seems there is more subtlety to be expounded. Does categorical logic provide a more nuanced perspective on this topic, or where else should I look? Thank you.


Christian: one approach to your question, taken up in Chapter 2 of _[Seven Sketches](https://arxiv.org/abs/1803.05316)_, is enriched category theory.

A set with either 0 or 1 elements is just a truth value: 0 means "false" and 1 means "true". A function between such sets is necessarily unique if it exists at all, and it's just a way of talking about implication: false implies false, false implies true, and true implies true... but true does not imply false, because there's no function from a 1-element set to a 0-element set.

So, let \(\mathrm{Boole}\) be the **monoidal category of truth values**, namely the category with two objects "true" and "false", with implications as morphisms, and with "and" as its tensor product. Now consider categories enriched over \(\mathrm{Boole}\). These are just preorders!

That is, instead of a _set_ of morphisms from one object \(x\) to another object \(y\), a preorder has a mere _truth value_ of morphisms from \(x\) to \(y\): either T, in which case we say \(x \le y\), or F, in which case we don't say this. Composition becomes

$$ x \le y \textrm{ and } y \le z \textrm{ implies } x \le z .$$

See how the tensor product is "and" here, and the morphism is "implies".
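The "preorders are Boole-enriched categories" slogan is easy to check mechanically. Here is a small Python sketch (my own example, using the divisibility preorder), where the hom-"set" is a boolean-valued function and enriched composition is exactly transitivity:

```python
from itertools import product

# A preorder as a Boole-enriched category: hom(x, y) is a truth value.
# Example: divisibility on a few small numbers.
objects = [1, 2, 3, 4, 6, 12]

def hom(x, y):
    return y % x == 0   # True iff x <= y in the divisibility order

# Enriched composition: hom(x,y) AND hom(y,z) implies hom(x,z).
# "and" plays the role of the tensor product, "implies" the morphism.
for x, y, z in product(objects, repeat=3):
    if hom(x, y) and hom(y, z):
        assert hom(x, z), "transitivity failed"

print("composition (transitivity) holds")
```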

All this stuff gets massively generalized in homotopy type theory. The sequence

$$ \textrm{truth values}, \textrm{sets}, \dots $$ goes on forever! We're just dipping our toe into the great sea. (For more, see my [Lectures on n-Categories and Cohomology](http://math.ucr.edu/home/baez/cohomology.pdf).)


Meanwhile, Christian, you and I are busily writing a paper on semantics in computer science, based on enriched category theory. We consider categories enriched over \(\textrm{Graph, Cat, Poset}\) and \(\textrm{Set}\), and we exploit the chain of monoidal functors

$$ \mathrm{Graph} \to \mathrm{Cat} \to \mathrm{Poset} \to \mathrm{Set} $$ but we haven't yet explored the final rather degenerate step

$$ \mathrm{Set} \to \mathrm{Boole} $$ Please look at what that does. It sends set-enriched categories (i.e. categories) to Boole-enriched categories (i.e. preorders). But what are Boole-enriched Lawvere theories like?

By the way: the full chain is more like

$$ \mathrm{Graph} \to \mathrm{Cat} \to \mathrm{Preord} \to \mathrm{Poset} \to \mathrm{Set} \to \mathrm{Boole} $$

There's a step I hadn't bothered to mention before, where you take a category and _first_ squash it down to a preorder by saying there's at most one morphism between objects, and _then_ squash that down to a poset by identifying isomorphic objects.

If we were getting paid by the word I'd definitely stick that in. Maybe we should mention it, just for completeness.


Hi, I'm confused about the proof of proposition 1.88, where they show that if \(Q\) is a preordered set with all meets, then any monotone map \(g: Q \to P\) that preserves meets is a right adjoint. They do this by constructing a left adjoint \(f: P \to Q\) by defining \(f(p) := \bigwedge Q_p\), where \(Q_p = \{ q \in Q | p \leq g(q) \}\).

I'm confused about this: what should happen when \(Q_p\) is empty for some \(p \in P\)?

For example, take \(g_0\) to be the unique function \(g_0: \emptyset \to \{0\}\), which is vacuously monotone and vacuously preserves meets. In this case, we cannot even define a function \(f_0: \{0\} \to \emptyset\) in the other direction, let alone a left adjoint.

Slightly less trivially, we could take the inclusion \(g_1: \{0\} \rightarrow \{0, 1\}\), where the preorder on these sets is the usual one for natural numbers. \(g_1\) is monotone and preserves meets, and there is only one function \(f_1 : \{0, 1\} \to \{0\}\), which sends everything to \(0\). But \(f_1\) is not left adjoint to \(g_1\), since \(1 \nleq g_1(0) = 0\) but \(f_1(1) = 0 \leq 0\).

Have I missed something?
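For what it's worth, the second counterexample is small enough to check by brute force. A quick Python sketch (my own names \(g\), \(f\) for \(g_1\), \(f_1\)) testing the adjunction condition \(f(p) \leq q \iff p \leq g(q)\):

```python
# Counterexample check for Proposition 1.88 when Q_p is empty.
# Q = {0} includes into P = {0, 1}, with the usual order on numbers.
Q, P = [0], [0, 1]
g = lambda q: q     # the inclusion g1 : {0} -> {0, 1}
f = lambda p: 0     # the only possible map f1 : {0, 1} -> {0}

# f is left adjoint to g iff  f(p) <= q  <=>  p <= g(q)  for all p, q.
adjoint = all((f(p) <= q) == (p <= g(q)) for p in P for q in Q)
print(adjoint)   # False: f(1) = 0 <= 0, but 1 <= g(0) = 0 fails
```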

`Hi, I'm confused about the proof of proposition 1.88, where they show that if \\(Q\\) is a preordered set with all meets, then any monotone map \\(g: Q \to P\\) that preserves meets is a right adjoint. They do this by constructing a left adjoint \\(f: P \to Q\\) by defining \\(f(p) := \bigwedge Q_p\\), where \\(Q_p = \\{ q \in Q | p \leq g(q) \\}\\). I'm confused about this: what should happen when \\(Q_p\\) is empty for some \\(p \in P\\)? For example, take \\(g_0\\) to be unique function \\(g_0: \emptyset \to \\{0\\}\\), which is vacuously monotone and vacuously preserves meets. In this case, we cannot even define a function \\(f_0: \\{0\\} \to \emptyset\\) in the other direction, let alone a left adjoint. Slightly less trivially, we could take the inclusion \\(g_1: \\{0\\} \rightarrow \\{0, 1\\}\\), where the preorder on these sets is the usual one for natural numbers. \\(g_1\\) is monotone and preserves meets, and there is only one function \\(f_1 : \\{0, 1\\} \to \\{0\\}\\), which sends everything to \\(0\\). But \\(f_1\\) is not left adjoint to \\(g_1\\), since \\(1 \nleq g_1(0) = 0\\) but \\(f_1(1) = 0 \leq 0\\). Have I missed something?`

[John #108](https://forum.azimuthproject.org/discussion/comment/16253/#Comment_16253) said:

> By the way, I wish you hadn't deleted your earlier guess that every DAG was the Hasse diagram of a poset, because it was an interesting guess. It's often better to be wrong in an interesting way than right in a boring way!

Fair enough, I will err on the side of not deleting my wrong guesses (you may regret this John ;;) )

Alex has a very important question:

**Puzzle.** In a poset, what's the meet, or greatest lower bound, of the empty set? (It may not exist, but what is it if it exists?)

It's a nerve-racking question when you first meet it (pun intended), but if you keep your wits about you and use logic you can solve it.

Similarly:

**Puzzle.** What's the least upper bound, or join, of the empty set?

I'll do meet. Following Definition 1.60 in the book, \(p\) is the meet of \(\emptyset\) if

1. for all \(a \in \emptyset\), \(p \leq a\), which imposes no restrictions at all, since there is no such \(a\);

2. for all \(q\) such that \(q \leq a\) for all \(a \in \emptyset\), it is true that \(q \leq p\). Well, the antecedent is true of every \(q\), since no restriction is placed on it. So \(q \leq p\) for all \(q\) in the entire poset, and \(p\) must satisfy this if possible. For example, if the poset is a power set \(PX\) with the inclusion relation, \(p\) is the entire set \(X\).
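The power-set example is easy to see concretely: meets are intersections, and starting the intersection "at the top" makes the empty meet come out as \(X\) itself. A short Python sketch (my own `meet` helper, assuming the inclusion order on subsets of a fixed \(X\)):

```python
# In the powerset poset (subsets of X ordered by inclusion), the meet
# of a family of subsets is their intersection.  The meet of the
# *empty* family is then the top element, X itself.
X = frozenset({1, 2, 3})

def meet(subsets):
    result = X            # start at the top; shrink by intersection
    for s in subsets:
        result &= s
    return result

print(meet([]) == X)                                 # True
print(meet([frozenset({1, 2}), frozenset({2, 3})]))  # frozenset({2})
```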
