
Yesterday I wrote a blog article on categories that show up in probability theory, as a kind of warmup for finishing off a paper with Tobias Fritz.

This made me want to organize some old notes on this topic. So, I just took some material from the nLab and made it into an Azimuth page:

I have mixed feelings about this material. On the one hand it's rather esoteric and doesn't yet have a "killer app" to justify it. On the other hand, it brings entropy into contact with mathematical ideas that pure mathematicians like, which should someday be useful. And it's cute.

## Comments

In this part:

> Suppose $P$ is a probability distribution on the set $\{1,\dots, n\}$ and $Q_i$ are probability distributions on finite sets $X_i$, where $i = 1, \dots, n$. Suppose the probability distribution $P$ assigns a probability $p_i$ to each element $i \in \{1,\dots, n\}$, and suppose the distribution $Q_i$ assigns a probability $q_{i j}$ to each element $j \in X_i$.

I want to extend the $Q_i$ to all the $X_k$ with zeroes when $k \neq i$, so that each $Q_i$ is a distribution over the same set $U$, the disjoint union of the $X_i$, and $j$ runs over $U$ in the definition of the $q_{i j}$. Then what you call glomming is a mixture of distributions. People often use mixtures, but I've never heard of glomming!

The formula with the extra term reminds me of the 'law of total variance'; see [http://statisticalmodeling.wordpress.com/2011/06/16/the-variance-of-a-mixture/](http://statisticalmodeling.wordpress.com/2011/06/16/the-variance-of-a-mixture/).
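To make the identification concrete, here is a small numerical sketch (the restaurant numbers are made up, not from the post): glomming $P$ with the zero-extended $Q_i$ is exactly the mixture $\sum_i p_i Q_i$ on the disjoint union $U$.

```python
from math import isclose

# Hypothetical numbers: 2 restaurants, menus of sizes 2 and 3.
P = [0.4, 0.6]                        # distribution over restaurants
Q = [[0.5, 0.5],                      # menu distribution at restaurant 1
     [0.2, 0.3, 0.5]]                 # menu distribution at restaurant 2

# Extend each Q_i by zeros to the disjoint union U of the menus;
# glomming is then the mixture sum_i p_i * (extended Q_i), which on U
# just assigns probability p_i * q_ij to dish j of restaurant i.
glommed = [p_i * q_ij for p_i, Q_i in zip(P, Q) for q_ij in Q_i]

print(glommed)                        # approximately [0.2, 0.2, 0.12, 0.18, 0.3]
assert isclose(sum(glommed), 1.0)     # still a probability distribution on U
```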


"Glom" is just American slang for "stick together", not a technical term.

You're right that "glomming" is a special case of a mixture of distributions. But I suspect that the rather simple formula for the entropy of a probability distribution formed by 'glomming' gets more complicated for mixtures of probability distributions that don't have disjoint supports.
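A quick numerical sketch (with made-up distributions) illustrates the contrast: for disjoint supports the entropy of the glommed distribution is exactly $H(P) + \sum_i p_i H(Q_i)$, while for a mixture with overlapping supports that expression survives only as an upper bound, with concavity giving the lower bound.

```python
from math import log, isclose

def H(dist):
    """Shannon entropy in nats; zero entries contribute nothing."""
    return -sum(p * log(p) for p in dist if p > 0)

P  = [0.4, 0.6]
Q1 = [0.5, 0.5]
Q2 = [0.2, 0.3, 0.5]

# Disjoint supports: glom on the disjoint union of the two menus.
glom = [P[0] * q for q in Q1] + [P[1] * q for q in Q2]
# The simple 'glomming' entropy formula holds exactly:
assert isclose(H(glom), H(P) + P[0] * H(Q1) + P[1] * H(Q2))

# Overlapping supports: mix two distributions on the SAME 3-point set.
R1 = [0.2, 0.3, 0.5]
R2 = [0.6, 0.3, 0.1]
mix = [P[0] * a + P[1] * b for a, b in zip(R1, R2)]
avg = P[0] * H(R1) + P[1] * H(R2)
# The simple formula fails; only the standard inequalities survive:
assert avg <= H(mix) <= H(P) + avg
```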

Yes, that formula for the variance of a mixture is somehow related! I'm not sure exactly how to fit them into a common framework....
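For what it's worth, the law of total variance for a mixture can be checked on a toy example (all numbers below are hypothetical): the mixture's variance splits into the average of the within-component variances plus the variance of the component means, the 'extra term'.

```python
from math import isclose

# Mixture: component i is chosen with weight p_i, then X is drawn from it.
P    = [0.4, 0.6]                       # mixing weights (hypothetical)
vals = [[1.0, 2.0], [0.0, 3.0, 5.0]]    # values X can take in each component
Q    = [[0.5, 0.5], [0.2, 0.3, 0.5]]    # conditional distributions of X

means = [sum(q * v for q, v in zip(Qi, Vi)) for Qi, Vi in zip(Q, vals)]
variances = [sum(q * (v - m) ** 2 for q, v in zip(Qi, Vi))
             for Qi, Vi, m in zip(Q, vals, means)]

grand_mean = sum(p * m for p, m in zip(P, means))
within  = sum(p * s for p, s in zip(P, variances))        # E[Var(X|I)]
between = sum(p * (m - grand_mean) ** 2
              for p, m in zip(P, means))                  # Var(E[X|I])

# Direct variance of the mixture distribution:
direct = sum(p * q * (v - grand_mean) ** 2
             for p, Qi, Vi in zip(P, Q, vals)
             for q, v in zip(Qi, Vi))
assert isclose(direct, within + between)   # law of total variance
```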


I think it's very useful to have gathered this material on a page. Some of it I'd missed, and other parts I'd certainly like to reread. Whether Zurek's ideas and non-extensive entropies in any shape have any usable connections to the kinds of models Azimuth people want seems like a very good question.
