
I've been soliciting blog articles, and they're starting to come in!

Here are some on stochastic methods which we can expect to see soon:

- Marc Harper, Blog - relative entropy in evolutionary dynamics.
- Marc Harper, Blog - stationary stability in finite populations.

I should be editing the first of these today! In fact I should be doing it right now! It will be the next article on the blog, that's for sure.

Taken together, these two should give a nice introduction to new work on evolutionary game theory with a strong emphasis on *relative entropy* and *information geometry*. One thing I want to explain clearly is how relative entropy can serve as a Lyapunov function for evolutionary games. This includes answering "what the @#!% is a Lyapunov function and why the &#%@ should I care???"
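
As a toy illustration of that claim (my own sketch, not taken from Marc's post): for the replicator dynamics of a 3-strategy game with payoff matrix $a_{ij} = 1$ for $i \neq j$ and $a_{ii} = 0$, the uniform mixture $q = (1/3, 1/3, 1/3)$ is evolutionarily stable, and the relative entropy $D(q\|x) = \sum_i q_i \ln(q_i/x_i)$ decreases along every trajectory:

```python
import math

def replicator_step(x, dt):
    # For this payoff matrix, (Ax)_i = 1 - x_i and the mean
    # fitness is x.Ax = 1 - sum(x_j^2).
    mean = 1.0 - sum(xi * xi for xi in x)
    return [xi + dt * xi * ((1.0 - xi) - mean) for xi in x]

def rel_entropy(q, x):
    return sum(qi * math.log(qi / xi) for qi, xi in zip(q, x))

q = [1/3, 1/3, 1/3]          # evolutionarily stable state
x = [0.7, 0.2, 0.1]          # initial population mixture
history = []
for _ in range(2000):
    history.append(rel_entropy(q, x))
    x = replicator_step(x, dt=0.01)

# D(q||x) never increases along the trajectory: a Lyapunov function
assert all(a >= b for a, b in zip(history, history[1:]))
```

The particular game and initial condition here are made up for the demonstration; the monotone decrease of $D(q\|x)$ is the general phenomenon the post will explain.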

The overall goal is applying robust concepts like entropy to better understand the behavior of biological and ecological systems.

- Manoj Gopalkrishnan, Blog - Lyapunov functions for complex-balanced systems.

This post will explain a very interesting entropy-related Lyapunov function for certain chemical reaction networks. It's almost done, but it needs some editing and it still needs pictures.
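
To give a flavor of this (a minimal toy case of my own, not Manoj's example): for the reversible reaction $A \rightleftharpoons B$ with mass-action rates $k_1, k_2$ and equilibrium concentrations obeying $k_1 c_A = k_2 c_B$, the free-energy-like function $G(x) = \sum_i \left[ x_i \ln(x_i/c_i) - x_i + c_i \right]$ decreases along every trajectory:

```python
import math

k1, k2 = 2.0, 1.0            # made-up rate constants for A -> B and B -> A
cA, cB = 1.0, 2.0            # equilibrium: k1*cA == k2*cB, total mass 3

def G(xA, xB):
    # Entropy-related Lyapunov function for this reaction network
    return (xA * math.log(xA / cA) - xA + cA
            + xB * math.log(xB / cB) - xB + cB)

xA, xB, dt = 2.5, 0.5, 0.001
values = []
for _ in range(2000):
    values.append(G(xA, xB))
    flux = k1 * xA - k2 * xB          # net mass-action rate of A -> B
    xA, xB = xA - dt * flux, xB + dt * flux

# G never increases: it plays the role of free energy
assert all(a >= b for a, b in zip(values, values[1:]))
```

This two-species network is detailed balanced, a special case of the complex-balanced networks the post will treat.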

Again, the overall goal is to apply entropy to better understand living systems. And since some evolutionary games can be reinterpreted as chemical reaction networks, this post should be closely related to what Marc is doing! But there's some mental work to make the connection — for me, at least. It should be *really cool* when it all fits together!

Very roughly, this is about a method for determining which economic scenarios are more likely. The likely scenarios get fed into things like the IPCC climate models, so this is important. I met Vanessa Schweizer at that workshop on *What is climate change and what to do about it?*, and she suggested that I could help her and Alastair and Matteo Smerlak work on this topic. That sounded really interesting, so I solicited some blog articles to prime the pump!

Since this stuff is stochastic, it may be related to the other posts above. I don't know how related it is. Matteo should have some opinions on that.

## Comments

I'm glad that Lyapunov functions are quite central to Marc's and Manoj's blogs, since they were the only thing I did sort of understand beforehand!

I think I'd start an explanation with a ball rolling around in a bowl. Add some syrup to stop the ball having enough kinetic energy to worry about. Then the gravitational potential (or just the height) of the ball can serve as a Lyapunov function. Without the syrup, things become more complicated, but at least you expect the potential plus kinetic energy of the ball to monotonically decrease. Without the bowl, who knows where the ball might go.

It seems quite intuitive that:

1. Lyapunov functions are great things if you can get them because they give a guide to the global properties of the system.
2. You don't expect to find them in general, and even when one exists, it may be difficult to find.
3. Free energy or entropy, or analogues, are a good place to start looking.
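
The ball-in-a-bowl picture can be sketched numerically (my own illustration, with a made-up quadratic bowl $V(x) = x^2$):

```python
def V(x): return x * x
def dV(x): return 2.0 * x

# Syrupy (overdamped) limit: velocity is proportional to -dV/dx,
# so the height V itself is a Lyapunov function.
x, dt = 1.5, 0.01
heights = []
for _ in range(1000):
    heights.append(V(x))
    x -= dt * dV(x)          # no inertia, just syrup

assert all(a >= b for a, b in zip(heights, heights[1:]))

# Without syrup but with some friction, V alone oscillates as the ball
# rolls back and forth, while the total energy E = v^2/2 + V still
# decreases monotonically.
x, v, friction = 1.5, 0.0, 0.5
energies = []
for _ in range(1000):
    energies.append(0.5 * v * v + V(x))
    v += dt * (-dV(x) - friction * v)   # semi-implicit Euler step
    x += dt * v

assert all(a >= b for a, b in zip(energies, energies[1:]))
```

With no friction at all, the total energy would stay constant and the ball would circle the bottom of the bowl forever, which is why the syrup matters for the Lyapunov story.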

Graham: you might check out my edited version of Marc Harper's post on relative entropy in evolutionary dynamics. You'll be glad to know that even before reading your comment, I greatly expanded the discussion of Lyapunov functions and started talking about balls rolling down hills.

I did not talk about the need for syrup, nor did I mention that free energy or entropy are good places to start looking for such functions.

Before, in parts 9-13 of the [information geometry series](http://math.ucr.edu/home/baez/information/information_geometry_9.html), I explained how relative entropy being a Lyapunov function for evolutionary dynamics is connected to the 2nd law of thermodynamics. However, it's probably worth another mention!

The issue of needing lots of friction to get our "rolling ball" intuition to apply to *first-order* differential equations is extremely important, but I'm not sure this blog article is the place for it.