
Yesterday I wrote a blog article on categories that show up in probability theory, as a kind of warmup for finishing off a paper with Tobias Fritz.

This made me want to organize some old notes on this topic. So, I just took some material from the nLab and made it into an Azimuth page:

I have mixed feelings about this material. On the one hand it's rather esoteric and doesn't yet have a "killer app" to justify it. On the other hand, it brings entropy into contact with mathematical ideas that pure mathematicians like, which should someday be useful. And it's cute.

## Comments

In this part:

> **We begin with a sadly familiar problem:** Suppose you live in a town with a limited number of tolerable restaurants. Every Friday you go out for dinner. You randomly choose a restaurant according to a certain probability distribution $P$. If you go to the $i$th restaurant, you then choose a dish from the menu according to some probability distribution $Q_i$. *How surprising will your choice be, on average?*

[...]

> **Glomming together probabilities**
>
> But the interesting thing about this problem is that it involves an operation which I'll call 'glomming together' probability distributions. First you choose a restaurant according to some probability distribution $P$ on the set of restaurants. Then you choose a meal according to some probability distribution $Q_i$. If there are $n$ restaurants in town, you wind up eating meals in a way described by some probability distribution we'll call
>
> $$ P \circ (Q_1, \dots, Q_n) $$
>
> A bit more formally: Suppose $P$ is a probability distribution on the set $\{1,\dots, n\}$ and $Q_i$ are probability distributions on finite sets $X_i$, where $i = 1, \dots, n$. Suppose the probability distribution $P$ assigns a probability $p_i$ to each element $i \in \{1,\dots, n\}$, and suppose the distribution $Q_i$ assigns a probability $q_{ij}$ to each element $j \in X_i$.

I want to extend the $Q_i$ to all the $X_k$ with zeroes when $k \neq i$, so that each $Q_i$ is a distribution over the same set $U$, the disjoint union of the $X_i$, and $j$ runs over $U$ in the definition of the $q_{ij}$. Then what you call glomming is a mixture of distributions. People often use mixtures, but I've never heard of glomming!

The formula with the extra term reminds me of the "law of total variance"; see http://statisticalmodeling.wordpress.com/2011/06/16/the-variance-of-a-mixture/.
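The "rather simple formula" for the entropy of a glommed distribution is presumably the chain rule $H(P \circ (Q_1, \dots, Q_n)) = H(P) + \sum_i p_i H(Q_i)$, which holds precisely because the supports are disjoint. A minimal Python check of this identity (the particular distributions below are made up for illustration):

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a probability distribution given as a list."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def glom(P, Qs):
    """Glom distributions: outcome j of restaurant i gets probability P[i] * Qs[i][j].

    Because each Q_i lives on its own set X_i, the results for different i
    never overlap -- this is exactly a mixture with disjoint supports."""
    return [P[i] * q for i in range(len(P)) for q in Qs[i]]

P = [0.5, 0.3, 0.2]                   # distribution over restaurants
Qs = [[0.5, 0.5], [0.9, 0.1], [1.0]]  # menu distributions, disjoint supports

glommed = glom(P, Qs)
chain_rule = entropy(P) + sum(p * entropy(Q) for p, Q in zip(P, Qs))

assert abs(sum(glommed) - 1.0) < 1e-12
assert abs(entropy(glommed) - chain_rule) < 1e-12
```

For mixtures whose components share support, the glommed probabilities add together inside the logarithm, and this clean decomposition no longer holds as stated.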

"Glom" is just American slang for "stick together", not a technical term.

You're right that "glomming" is a special case of a mixture of distributions. But I suspect that the rather simple formula for the entropy of a probability distribution formed by 'glomming' gets more complicated for mixtures of probability distributions that don't have disjoint supports.

Yes, that formula for the variance of a mixture is somehow related! I'm not sure exactly how to fit them in a common framework....
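The law of total variance mentioned above says that for a mixture indexed by $I$, $\mathrm{Var}(X) = \mathbb{E}[\mathrm{Var}(X \mid I)] + \mathrm{Var}(\mathbb{E}[X \mid I])$: a within-component term plus a between-component term, structurally parallel to the entropy formula. A self-contained sketch verifying this on a small discrete mixture (the component distributions are arbitrary examples):

```python
# Two hypothetical discrete component distributions over real values,
# and mixture weights P -- chosen only for illustration.
Q1 = {0: 0.5, 1: 0.5}
Q2 = {0: 0.1, 2: 0.9}
P = [0.4, 0.6]

def mean(d):
    return sum(x * p for x, p in d.items())

def var(d):
    m = mean(d)
    return sum(p * (x - m) ** 2 for x, p in d.items())

# The mixture assigns Pr(x) = sum_i P[i] * Q_i(x); supports may overlap.
mixture = {}
for w, Q in zip(P, (Q1, Q2)):
    for x, p in Q.items():
        mixture[x] = mixture.get(x, 0.0) + w * p

within = sum(w * var(Q) for w, Q in zip(P, (Q1, Q2)))                    # E[Var(X|I)]
between = sum(w * (mean(Q) - mean(mixture)) ** 2 for w, Q in zip(P, (Q1, Q2)))  # Var(E[X|I])

assert abs(var(mixture) - (within + between)) < 1e-12
```

Note the analogy: entropy of a disjoint-support mixture decomposes as $H(P)$ plus an average of component entropies, while variance decomposes as a between-component variance plus an average of component variances.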


I think it's very useful to have this material gathered on a page. Some of it I'd missed, and other parts I'd certainly like to reread. Whether Zurek's ideas or non-extensive entropies in any shape have usable connections to the kinds of models Azimuth people want seems like a very good question.
