
I was invited to write a proposal for a workshop at the National Institute for Mathematical and Biological Synthesis, so I did, and you can see the proposal here:

If you can see ways to improve it, please let me know! I think it would be a great workshop, but I don't feel very good at talking to biologists, so they might not see it the way I do. I will try to get some help from my co-organizers, but so far all the writing is mine.

## Comments

I'll look forward to some reconciliation, or not, of least effort and maximum entropy production; I think this has stopped me getting a grasp on what MEP means. Great stuff :).

Typos:

p. 2 "Rieper has study". "England used thermodynamic argument".


> relative entropy as a tool for studying the 'neural code',

I don't know what the neural code is. Maybe "relative entropy as a tool for studying neural encodings".

> information theory as a tool for optimizing experimental design.

This is not obviously related to biological systems.

Something you could add is cellular communication via signal molecules. I don't know much about it myself, but by putting 'signal molecules noise' into Google Scholar, you can find things like [Information transduction capacity of noisy biochemical signaling networks](http://www.sciencemag.org/content/334/6054/354.short).

> The entropy of a population distribution is not merely a predictive tool in population biology

I would interpret 'a population distribution' as 'the distribution of genotypes (or phenotypes) in a population'. I would interpret 'population biology' as meaning population genetics (or possibly demographics). I think from what you say later:

> the fraction of organisms of the ith type (which could mean species, some other taxon, etc.)

that you mean something else. To a biologist, a population consists of a single species.

Typo:

> some of big remaining questions.
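As an aside, the relative entropy mentioned above (the Kullback-Leibler divergence) is simple to compute. Here's a minimal sketch in Python; the two distributions are made-up illustrations, not data from any actual neural experiment:

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in nats.

    p, q: sequences of probabilities over the same outcomes.
    Assumes q[i] > 0 wherever p[i] > 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: an observed distribution p vs. a model's prediction q
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(relative_entropy(p, q))  # nonnegative; zero exactly when p == q
```

It measures how badly the model `q` fits the observations `p`, which is one way such tools get used to study encodings.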

Thanks for your comments, Graham!

> I don't know what the neural code is.

I was basing my remark on some neurobiologists who talk, perhaps somewhat facetiously, of "cracking the neural code". That's why I put "neural code" in quotes. But anyway, I've decided to downplay this whole aspect of the program, so I'm removing this bit.

> > information theory as a tool for optimizing experimental design.
>
> This is not obviously related to biological systems.

Good point. I have two biologist friends, my co-organizer Marc Harper and also Chris Lee, who are working on this. Later I explain that they're trying to automate the process of experimental design. So there's a connection to biology, even though we could apply this idea to other fields. But I'll remove this remark here - it discomfited you, so it's probably bad!

> Something you could add is cellular communication via signal molecules.

That could be good, but I'm only listing topics where we've got a few people lined up who can speak on them.

> > The entropy of a population distribution is not merely a predictive tool in population biology
>
> I would interpret 'a population distribution' as 'the distribution of genotypes (or phenotypes) in a population'. I would interpret 'population biology' as meaning population genetics (or possibly demographics).

Okay. For starters, I'll change "population biology" to "ecology". I was referring to [the work of John Harte](http://johncarlosbaez.wordpress.com/2013/02/21/maximum-entropy-and-ecology/), who is using maximum entropy to predict the number of organisms of different species in an ecosystem. He calls his work "ecology", so that must be the right word.

For seconders: what do you call a bunch of numbers $p_i$ when $p_i$ is the fraction of organisms that belong to the $i$th type - where "type" can mean either genotype or phenotype, _or_ species, _or_ genus, or any other sort of group? I was calling this bunch of numbers a "population distribution". You're telling me that term only applies when we've got a bunch of organisms of different genotypes or phenotypes within a given species.

I could call it a "probability distribution": $p_i$ is the probability that a randomly chosen organism is of the $i$th type. This term would clarify why people do things like compute the entropy

$$ - \sum_i p_i \log p_i $$

But I think this term would be confusing in other ways. I could try to avoid calling it anything, but then it's hard to say the sentence I'm trying to say here without it sounding very dry:

> The entropy of the list of numbers $p_i$ is not merely a predictive tool in population biology...
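For concreteness, here's a minimal Python sketch of computing this entropy from counts of organisms of each type; the counts are invented, not from any real ecosystem:

```python
import math

def entropy(counts):
    """Shannon entropy -sum_i p_i log p_i (in nats), where p_i is the
    fraction of organisms of the i-th type and counts[i] is the
    number of organisms of that type."""
    total = sum(counts)
    ps = [c / total for c in counts]
    return -sum(p * math.log(p) for p in ps if p > 0)

# Hypothetical ecosystem: 50, 30, 20 organisms of three species
print(entropy([50, 30, 20]))
```

The entropy is largest when all types are equally common and zero when only one type is present, which is why it doubles as a biodiversity measure.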

Okay, right now I'll go for the vaguer statement:

> Entropy is not merely a predictive tool in population biology…

Anyone who cares can see an [updated version of the workshop proposal](http://math.ucr.edu/home/baez/nimbios.pdf). I've included some more speakers. I'm getting dangerously attached to the idea that this workshop will actually take place! Luckily John Harte said the Santa Fe Institute is "dying to have workshops like this" - so if this proposal doesn't fly at NIMBioS, I can try to sell it to them.

Jim wrote:

> I'll look forward to some reconciliation or not of least effort and maximum entropy production; I think this has stopped me getting a grasp on what MEP means...

I'm actually quite skeptical that the principle of maximum entropy production is true except in some limited circumstances, since I've never seen a convincing demonstration of it. But Roderick Dewar has papers on this:

* R. C. Dewar, Maximum entropy production and non-equilibrium statistical mechanics, in _Non-Equilibrium Thermodynamics and Entropy Production: Life, Earth and Beyond_, eds. A. Kleidon and R. Lorenz, Springer, New York, 41-55.

* R. C. Dewar, Information theoretic explanation of maximum entropy production, the fluctuation theorem and self-organized criticality in non-equilibrium stationary states, _J. Phys. A (Mathematical and General)_ **36** (2003), 631-641. [Summary here](http://ms.mcmaster.ca/~grasselli/Dewar.pdf).

* R. C. Dewar, [Maximum entropy production as an inference algorithm that translates physical assumptions into macroscopic predictions: don't shoot the messenger](http://www.mdpi.com/1099-4300/11/4/931/pdf), _Entropy_ **11** (2009), 931-944.

So I'm eager to hear him defend this principle against the attacks of a rowdy crowd! I promised not to shoot him.

Thanks for the links. I won't shoot the messenger :). I've wondered why the Santa Fe Institute doesn't seem to have figured on Azimuth yet.


Hi! This is a cool topic!


Hi, Jacob. Yes, it's cool.

Jim wrote:

> I've wondered why the Santa Fe Institute doesn't seem to have figured on Azimuth yet.

The ecologist John Harte said:

> The Santa Fe Institute is eager to host workshops exactly like this one and so if the venue is undetermined and you were interested in Santa Fe as a host (they do a splendid job of hosting workshops) we could do it there.

Workshops sponsored by the National Institute for Mathematical and Biological Synthesis need to be held in Knoxville, Tennessee, but if they don't approve this workshop proposal I may try to hold a similar workshop at Santa Fe.

[David Wolpert](http://www.santafe.edu/about/people/profile/David%20Wolpert), who works at SFI and LANL, may have some interests that overlap with the Azimuth Project. LANL's [Center for Nonlinear Studies](http://cnls.lanl.gov/External/) has various interests in complexity, information theory, networks, quantum theory, theoretical biology, etc., and runs theme workshops, but I don't think they solicit workshops from outside; they'd have to be organized by LANL staff.

Thanks, Nathan!


I don't want to blog about this just now, but Annette Ostling, a workshop invitee who works on biodiversity, has an interesting paper that mentions a stochastic Petri net model called "Hubbell's original zero-sum model":

* J. P. O'Dwyer, J. K. Lake, A. Ostling, V. M. Savage, and J. L. Green, [An integrative framework for stochastic, size-structured community assembly](http://www-personal.umich.edu/~aostling/papers/ODwyer2009.pdf), _PNAS_ **106** (2009), 6170-6175.

Well, they don't _say_ it's a stochastic Petri net, but it is, so all the theorems in my book with Jacob apply! Unfortunately the new generalized model described in this paper is not a stochastic Petri net model: there are a bunch of species, and each one has births and deaths... but all new individuals are born into the species that happens to have the smallest population! That seems strange to me.
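To show the kind of model being discussed, here's a minimal Gillespie-style sketch in Python of a multi-species birth-death process. The per-capita rates are invented for illustration; this is the generic stochastic-Petri-net version, not Hubbell's actual zero-sum model:

```python
import random

def simulate(pops, birth_rate, death_rate, steps, seed=0):
    """Gillespie-style simulation of independent birth-death processes.

    pops: initial population counts, one entry per species.
    Each individual gives birth at rate birth_rate and dies at rate
    death_rate, so a species with n individuals has total birth rate
    n*birth_rate and total death rate n*death_rate.
    """
    rng = random.Random(seed)
    pops = list(pops)
    for _ in range(steps):
        # Rates of every possible event, species by species
        rates = [(i, n * birth_rate, n * death_rate) for i, n in enumerate(pops)]
        total = sum(b + d for _, b, d in rates)
        if total == 0:
            break  # everything is extinct
        # Pick one event with probability proportional to its rate
        r = rng.uniform(0, total)
        for i, b, d in rates:
            if r < b:
                pops[i] += 1   # a birth in species i
                break
            r -= b
            if r < d:
                pops[i] -= 1   # a death in species i
                break
            r -= d
    return pops

print(simulate([50, 30, 20], birth_rate=1.0, death_rate=1.0, steps=1000))
```

The generalized model in the paper breaks this independence: births would all be routed to whichever species currently has the smallest population, which is exactly the coupling that stops it from being a stochastic Petri net.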

**YAY!** My proposal for a workshop on [Information and Entropy in Biological Systems](http://math.ucr.edu/home/baez/entropy_nimbios.pdf) at the [National Institute for Mathematical and Biological Synthesis](http://www.nimbios.org/) was accepted! Thanks to everyone who helped out. More later.