
Information and entropy in biological systems

I was invited to write a proposal for a workshop at the National Institute for Mathematical and Biological Synthesis, so I did, and you can see the proposal here:

If you can see ways to improve it, please let me know! I think it would be a great workshop, but I don't feel very good at talking to biologists, so they might not see it the way I do. I will try to get some help from my co-organizers, but so far all the writing is mine.

Comments

  • 1.

    I'll look forward to some reconciliation or not of least effort and maximum entropy production; I think this has stopped me getting a grasp on what MEP means. Great stuff :).

    Typos:

    p. 2 "Rieper has study". "England used thermodynamic argument".

  • 2.

    > relative entropy as a tool for studying the `neural code',

    I don't know what the neural code is. Maybe 'relative entropy as a tool for studying neural encodings.'

    > information theory as a tool for optimizing experimental design.

    This is not obviously related to biological systems.

    Something you could add is cellular communication via signal molecules. I don't know much about it myself, but by putting 'signal molecules noise' into google scholar, you can find things like [Information transduction capacity of noisy biochemical signaling networks](http://www.sciencemag.org/content/334/6054/354.short).

    > The entropy of a population distribution is not merely a predictive tool in population biology

    I would interpret 'a population distribution' as 'the distribution of genotypes (or phenotypes) in a population'. I would interpret 'population biology' as meaning population genetics (or possibly demographics). I think from what you say later:

    > the fraction of organisms of the ith type (which could mean species, some other taxon, etc.)

    that you mean something else. To a biologist, a population consists of a single species.

    Typo:

    > some of big remaining questions.

  • 3.
    edited August 2013

    Thanks for your comments, Graham!

    > I don’t know what the neural code is.

    I was basing my remark on some neurobiologists who talk, perhaps somewhat facetiously, of "cracking the neural code". That's why I put "neural code" in quotes. But anyway, I've decided to downplay this whole aspect of the program, so I'm removing this bit.

    > > information theory as a tool for optimizing experimental design.
    >
    > This is not obviously related to biological systems.

    Good point. I have two biologist friends, my co-organizer Marc Harper and also Chris Lee, who are working on this. Later I explain that they're trying to automate the process of experimental design. So there's a connection to biology, even though we could apply this idea to other fields. But I'll remove this remark here - it discomfited you, so it's probably bad!

    > Something you could add is cellular communication via signal molecules.

    That could be good, but I'm only listing topics where we've got a few people lined up who can speak on them.

    > > The entropy of a population distribution is not merely a predictive tool in population biology
    >
    > I would interpret ’a population distribution’ as ’the distribution of genotypes (or phenotypes) in a population’. I would interpret ’population biology’ as meaning population genetics (or possibly demographics).

    Okay. For starters, I'll change "population biology" to "ecology". I was referring to [the work of John Harte](http://johncarlosbaez.wordpress.com/2013/02/21/maximum-entropy-and-ecology/), who is using maximum entropy to predict the number of organisms of different species in an ecosystem. He calls his work "ecology", so that must be the right word.

    For seconders: what do you call a bunch of numbers $p_i$ when $p_i$ is the fraction of organisms that belong to the $i$th type - where "type" can mean either genotype or phenotype, *or* species, *or* genus, or any other sort of group?

    I was calling this bunch of numbers a "population distribution". You're telling me that term only applies when we've got a bunch of organisms of different genotypes or phenotypes within a given species.

    I could call it a "probability distribution": $p_i$ is the probability that a randomly chosen organism is of the $i$th type. This term would clarify why people do things like compute the entropy

    $$ - \sum_i p_i \log p_i $$

    But I think this term would be confusing in other ways.

    I could try to avoid calling it anything, but then it's hard to say the sentence I'm trying to say here without it sounding very dry:

    > The entropy of the list of numbers $p_i$ is not merely a predictive tool in population biology...
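To make the quantity concrete, here is a minimal Python sketch of the entropy $-\sum_i p_i \log p_i$ of such a list of fractions $p_i$ (the counts below are made-up illustrative numbers, not from any real dataset):

```python
# Minimal sketch: entropy -sum_i p_i log p_i of a "population distribution",
# where p_i is the fraction of organisms belonging to the i-th type.
# The counts are hypothetical, purely for illustration.
from math import log

counts = [50, 30, 15, 5]                  # organisms of each type
total = sum(counts)
p = [c / total for c in counts]           # fractions p_i, summing to 1

entropy = -sum(pi * log(pi) for pi in p if pi > 0)  # in nats
print(entropy)
```

Whether we call $p_i$ a "population distribution" or a "probability distribution", the computation is the same.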

  • 4.
    edited August 2013

    Okay, right now I'll go for the vaguer statement:

    > Entropy not merely a predictive tool in population biology…

    Anyone who cares can see an [updated version of the workshop proposal](http://math.ucr.edu/home/baez/nimbios.pdf). I've included some more speakers. I'm getting dangerously attached to the idea that this workshop will actually take place! Luckily John Harte said the Santa Fe Institute is "dying to have workshops like this" - so if this proposal doesn't fly at NIMBioS, I can try to sell it to them.

  • 5.
    edited August 2013

    Jim wrote:

    > I’ll look forward to some reconciliation or not of least effort and maximum entropy production; I think this has stopped me getting a grasp on what MEP means...

    I'm actually quite skeptical that the principle of maximum entropy production is true except in some limited circumstances, since I've never seen a convincing demonstration of it. But Roderick Dewar has papers on this:

      • R. C. Dewar, Maximum entropy production and non-equilibrium statistical mechanics, in *Non-Equilibrium Thermodynamics and Entropy Production: Life, Earth and Beyond*, eds. A. Kleidon and R. Lorenz, Springer, New York, 41-55.

      • R. C. Dewar, Information theoretic explanation of maximum entropy production, the fluctuation theorem and self-organized criticality in non-equilibrium stationary states, *J. Phys. A (Mathematical and General)* **36** (2003), 631-641. [Summary here](http://ms.mcmaster.ca/~grasselli/Dewar.pdf).

      • R. C. Dewar, [Maximum entropy production as an inference algorithm that translates physical assumptions into macroscopic predictions: don't shoot the messenger](http://www.mdpi.com/1099-4300/11/4/931/pdf), *Entropy* **11** (2009), 931-944.

    so I'm eager to hear him defend this principle against the attacks of a rowdy crowd! I promised not to shoot him.

  • 6.
    edited August 2013

    Thanks for the links. I won't shoot the messenger :). I've wondered why the Santa Fe Institute doesn't seem to have figured on Azimuth yet.

  • 7.

    Hi! This is a cool topic!

  • 8.
    edited August 2013

    Hi, Jacob. Yes, it's cool.

    Jim wrote:

    > I’ve wondered why the Santa Fe institute doesn’t seem to have figured on Azimuth yet.

    The ecologist John Harte said:

    > The Santa Fe Institute is eager to host workshops exactly like this one and so if the venue is undetermined and you were interested in Santa Fe as a host (they do a splendid job of hosting workshops) we could do it there.

    Workshops sponsored by the National Institute for Mathematical and Biological Synthesis need to be held in Knoxville, Tennessee, but if they don't approve this workshop proposal I may try to hold a similar workshop at Santa Fe.

  • 9.
    edited August 2013

    [David Wolpert](http://www.santafe.edu/about/people/profile/David%20Wolpert), who works at SFI and LANL, may have some interests that overlap with the Azimuth Project. LANL's [Center for Nonlinear Studies](http://cnls.lanl.gov/External/) has various interests in complexity, information theory, networks, quantum theory, theoretical biology, etc., and runs theme workshops, but I don't think they solicit workshops from outside; they'd have to be organized by LANL staff.

  • 10.

    Thanks, Nathan!

  • 11.

    I don't want to blog about this just now, but Annette Ostling, a workshop invitee who works on biodiversity, has an interesting paper that mentions a stochastic Petri net model called "Hubbell’s original zero-sum model":

      • J. P. O’Dwyer, J. K. Lake, A. Ostling, V. M. Savage and J. L. Green, [An integrative framework for stochastic, size-structured community assembly](http://www-personal.umich.edu/~aostling/papers/ODwyer2009.pdf), *PNAS* **106** (2009), 6170-6175.

    Well, they don't *say* it's a stochastic Petri net, but it is, so all the theorems in my book with Jacob apply! Unfortunately the new generalized model described in this paper is not a stochastic Petri net model: there are a bunch of species, and each one has births and deaths... but all new individuals are born into the species that happens to have the smallest population! That seems strange to me.
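For what it's worth, here is a toy Python sketch (my own construction, not the paper's actual model) of that birth rule: at each step one randomly chosen individual dies, and the replacement is born into whichever species currently has the smallest population, so the total community size is conserved (zero-sum):

```python
# Toy sketch of the birth rule described above (hypothetical numbers,
# not the model from O'Dwyer et al.): one death per step, chosen
# uniformly over individuals, and one birth into the currently
# smallest species, so the total population stays fixed.
import random

random.seed(1)
pops = [40, 30, 20, 10]   # made-up species abundances, total 100

for _ in range(1000):
    # death: pick an individual uniformly at random (weight by abundance)
    i = random.choices(range(len(pops)), weights=pops)[0]
    pops[i] -= 1
    # birth: into the species with the smallest current population
    j = min(range(len(pops)), key=lambda k: pops[k])
    pops[j] += 1

print(pops)
```

Running this, the rule strongly pulls the abundances toward each other, which is why it seems strange as a biological assumption: nothing in ordinary birth-death dynamics singles out the rarest species for every birth.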

  • 12.
    edited October 2013

    YAY!

    My proposal for a workshop on [Information and Entropy in Biological Systems](http://math.ucr.edu/home/baez/entropy_nimbios.pdf) at the [National Institute for Mathematical and Biological Synthesis](http://www.nimbios.org/) was accepted! Thanks to everyone who helped out. More later.
