
In the strategy discussion, I wrote:

All told, I propose an Azimuth quest with the following focus:

- Pursuit of applications of Petri nets to stochastic as well as deterministic epidemiology

This is wide open.

Here is one idea I had, which I posted to the Azimuth blog:

Modeling each country separately leaves holes in the overall model of a pandemic: for example, the curve goes down, travel restrictions are lifted, and then the curve goes back up because of what's happening in other countries. Compartmental models use ODEs and assume a well-mixed population. What about a multi-level approach, where each country or well-mixed region has a compartmental model with its own parameters, and there are transitions between the corresponding compartments in different countries, reflecting flows due to travel? This looks like a potential application of composition of open networks. Perhaps a good composition rule could produce an aggregated, abstracted compartmental model for the whole globe, or help us in other ways to understand the dynamics of the whole.

Posted to:

- How scientists can help fight Covid-19, John Baez, Azimuth blog, March 31.

Spelled out, the suggestion is to apply the open Petri net framework that John, Jade and Blake have been developing to the composition of global pandemic networks from smaller regional networks.

- Open Petri nets, John C. Baez, Jade Master, Aug 2018.

- A compositional framework for reaction networks, John C. Baez, Blake S. Pollard, April 2017.
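As a toy numerical sketch of the multi-level idea — not the open Petri net framework itself — here are two regional SIR models glued along a travel flow. All parameters are invented for illustration:

```python
# Sketch: two SIR compartmental models, one per region, coupled by a
# travel flow between like compartments. Parameters are illustrative only.

def coupled_sir_step(state, beta, gamma, travel, dt):
    """One Euler step of two regional SIR models coupled by travel.

    state: dict region -> [S, I, R]
    travel: fraction of each compartment exchanged per unit time
    """
    new = {}
    for r, (S, I, R) in state.items():
        N = S + I + R
        infect = beta[r] * S * I / N
        recover = gamma[r] * I
        new[r] = [S - dt * infect, I + dt * (infect - recover), R + dt * recover]
    # travel: diffuse each compartment between the two regions
    a, b = list(new)
    for k in range(3):
        flow = dt * travel * (new[a][k] - new[b][k])
        new[a][k] -= flow
        new[b][k] += flow
    return new

# Region A starts with an outbreak; region B is initially uninfected.
state = {"A": [990.0, 10.0, 0.0], "B": [1000.0, 0.0, 0.0]}
beta = {"A": 0.4, "B": 0.3}
gamma = {"A": 0.1, "B": 0.1}
for _ in range(2000):                      # integrate to t = 100
    state = coupled_sir_step(state, beta, gamma, travel=0.01, dt=0.05)

print(state)   # infection reaches region B only through the travel coupling
```

Setting `travel=0` decouples the regions and region B's epidemic never starts, which is the point of the composition: the global behavior is not visible in either regional model alone.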

## Comments

Here is a huge epidemic-modeling source-code library called FRED, which strikes me as overly ambitious:

"Framework for Reconstructing Epidemiological Dynamics", https://github.com/PublicHealthDynamicsLab/FRED/

This style of program could benefit from a rule-based architecture or a stochastic Petri net, where the busy-work of manipulating data structures would be vastly reduced.

If I haven't mentioned it before, this application of signalling Petri nets looks interesting: "Graphical & Computational Modelling of Biological Pathways", https://youtu.be/1IPOIE0PvQY
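To make the "vastly reduced busy-work" claim concrete, here is a minimal stochastic Petri net simulator (Gillespie's direct method). This is a sketch, not FRED's architecture; the SIR net and rates are illustrative:

```python
import random

# Minimal stochastic Petri net simulator (Gillespie's direct method).
# A transition consumes tokens from its input places and produces tokens
# in its output places; propensities follow mass-action kinetics.

def gillespie(marking, transitions, t_max, rng):
    """marking: dict place -> token count.
    transitions: list of (rate, inputs, outputs), where inputs/outputs
    are dicts place -> multiplicity."""
    t = 0.0
    while t < t_max:
        # mass-action propensity of each transition
        props = []
        for rate, ins, _ in transitions:
            a = rate
            for p, m in ins.items():
                for k in range(m):
                    a *= max(marking[p] - k, 0)
            props.append(a)
        total = sum(props)
        if total == 0:          # no transition enabled: the net is dead
            break
        t += rng.expovariate(total)
        # pick a transition with probability proportional to its propensity
        x = rng.random() * total
        for (rate, ins, outs), a in zip(transitions, props):
            x -= a
            if x <= 0 and a > 0:
                for p, m in ins.items():
                    marking[p] -= m
                for p, m in outs.items():
                    marking[p] += m
                break
    return marking

rng = random.Random(1)
# SIR as a stochastic Petri net: infection S + I -> 2I, recovery I -> R
net = [
    (0.002, {"S": 1, "I": 1}, {"I": 2}),
    (0.1,   {"I": 1},         {"R": 1}),
]
final = gillespie({"S": 990, "I": 10, "R": 0}, net, t_max=200.0, rng=rng)
print(final)
```

The entire model is the two-line `net` description; adding compartments or reactions means adding tuples, with no data-structure plumbing.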


I'm concerned about being able to model the boundaries of countries, as they are fractal.


I mentioned that these projects are overly ambitious. Besides the FRED project above, there is the program by Neil Ferguson of Imperial College, which is what the UK used for decision making. The two are very similar in scope. One fellow has the source code and is compiling and executing the models:

> "I’ve decided to run 2 extreme of the scale runs first with NR=10. Each one takes about 4 hours on my iMac ! The first failed because I had the output file open in Excel !"

I will stay away from this. My preferred approach would be to apply statistical mechanics instead of performing Monte Carlo simulations on an overly detailed model.

Bill Shipley, [Science, 3 November 2006, Vol. 314, no. 5800, pp. 812–814](https://science.sciencemag.org/content/314/5800/812):

> "Curiously, given the historical dominance of the demographic Lotka-Volterra equations, Volterra recognized the difficulties of this approach and even considered a statistical mechanistic approach (22). Very few authors have followed his lead (23–31)."
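One way to see what an ensemble (statistical-mechanics-style) view adds over a single detailed run: simulate a toy chain-binomial epidemic (Reed-Frost) many times. Individual trajectories split into early extinctions and major outbreaks, a feature no single Monte Carlo run reveals. Parameters here are invented:

```python
import random

# Reed-Frost chain-binomial epidemic: a simple stochastic model whose
# ensemble behavior (distribution of final sizes) is the object of
# interest, not any single trajectory. Illustrative only.

def reed_frost(S, I, p, rng):
    """One stochastic trajectory; p = per-contact infection probability.
    Returns the final number of susceptibles."""
    while I > 0:
        q = (1 - p) ** I   # chance a given susceptible escapes this generation
        new_I = sum(1 for _ in range(S) if rng.random() > q)
        S, I = S - new_I, new_I
    return S

rng = random.Random(0)
S0, I0, p = 200, 1, 0.01          # effective R0 ~ S0 * p = 2
final_sizes = [S0 + I0 - reed_frost(S0, I0, p, rng) for _ in range(500)]

small = sum(1 for n in final_sizes if n < 20)      # early extinctions
large = sum(1 for n in final_sizes if n >= 20)     # major outbreaks
print(small, large, sum(final_sizes) / len(final_sizes))
```

With a single initial case the outcome distribution is bimodal, so the ensemble mean alone is misleading too — which is roughly the argument for working with the distribution analytically rather than with one very detailed simulation.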