
Petri nets - research concept 1 - modeling a global pandemic as a composition of regional nets

edited May 18 in Petri Nets

In the strategy discussion, I wrote:

All told, I propose an Azimuth quest with the following focus:

  • Pursuit of applications of Petri nets to stochastic as well as deterministic epidemiology

This is wide open.

Here is one idea I had, which I posted to the Azimuth blog:

Modeling each country separately leaves holes in the overall model for a pandemic. For example, the curve may go down and travel restrictions get lifted, and then it goes back up due to what’s happening in other countries. Compartmental models use ODEs and assume a well-mixed population. What about a multi-level approach, where each country or well-mixed region has a compartmental model with its own parameters? Then there could be transitions between the compartments in different countries, reflecting flows due to travel. This looks like a potential application of composition of open networks. Perhaps a good composition rule could produce an aggregated, abstracted compartmental model for the whole globe, or help us in other ways to understand the dynamics of the whole.
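
To make the multi-level idea concrete, here is a minimal sketch in Python. All parameters are purely illustrative and the region names are placeholders; each region gets its own SIR rates, and a small symmetric travel term exchanges people between like compartments of the two regions.

```python
# Minimal sketch (not a calibrated model): two well-mixed regions, each with
# its own SIR parameters, coupled by symmetric travel between like compartments.
import numpy as np
from scipy.integrate import solve_ivp

BETA   = np.array([0.30, 0.45])  # illustrative infection rates for regions A, B
GAMMA  = np.array([0.10, 0.10])  # illustrative recovery rates
TRAVEL = 0.01                    # fraction of each compartment exchanged per day

def rhs(t, y):
    # y = [S_A, I_A, R_A, S_B, I_B, R_B], each region normalized to 1
    y = y.reshape(2, 3)
    S, I, R = y[:, 0], y[:, 1], y[:, 2]
    N = y.sum(axis=1)
    dS = -BETA * S * I / N
    dI =  BETA * S * I / N - GAMMA * I
    dR =  GAMMA * I
    dydt = np.stack([dS, dI, dR], axis=1)
    # Travel: flow between the same compartment in the two regions.
    dydt += TRAVEL * (y[::-1] - y)
    return dydt.ravel()

y0 = np.array([0.99, 0.01, 0.0,   # region A starts with a small outbreak
               1.00, 0.00, 0.0])  # region B starts fully susceptible
sol = solve_ivp(rhs, (0, 200), y0, max_step=1.0)
print("final infected fractions:", sol.y[[1, 4], -1])
```

Even with no initial cases, region B catches the epidemic through the travel term, which is exactly the hole that separate per-country models leave out.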

Spelled out, the suggestion is to apply the open Petri net framework that John, Jade and Blake have been developing to the composition of a global pandemic network from smaller regional networks.
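
Just to fix ideas, here is a toy Python sketch of the composition step: two regional SIR nets are glued by taking the union of their places and transitions, and travel transitions are then added between paired places. The encoding, names and rates are all my own illustration, not the actual categorical machinery of that framework.

```python
# Toy illustration of gluing two regional Petri nets and adding travel transitions.
from dataclasses import dataclass, field

@dataclass
class PetriNet:
    places: set = field(default_factory=set)
    # transition name -> (inputs, outputs, rate); inputs/outputs map place -> multiplicity
    transitions: dict = field(default_factory=dict)

def sir_net(region, beta, gamma):
    """A regional SIR net with its own rate constants."""
    S, I, R = (f"{c}_{region}" for c in "SIR")
    return PetriNet(
        places={S, I, R},
        transitions={
            f"infect_{region}":  ({S: 1, I: 1}, {I: 2}, beta),
            f"recover_{region}": ({I: 1}, {R: 1}, gamma),
        },
    )

def compose(net1, net2, travel):
    """Union of places and transitions, plus a pair of travel transitions
    for each pair of places listed in `travel`."""
    glued = PetriNet(net1.places | net2.places,
                     {**net1.transitions, **net2.transitions})
    for (p, q), rate in travel.items():
        glued.transitions[f"{p}_to_{q}"] = ({p: 1}, {q: 1}, rate)
        glued.transitions[f"{q}_to_{p}"] = ({q: 1}, {p: 1}, rate)
    return glued

world = compose(
    sir_net("A", beta=0.30, gamma=0.10),
    sir_net("B", beta=0.45, gamma=0.10),
    travel={("S_A", "S_B"): 0.01, ("I_A", "I_B"): 0.01, ("R_A", "R_B"): 0.01},
)
print(sorted(world.transitions))
```

The interesting question is then what composition rule would let us coarse-grain such a glued net back down to a single aggregated compartmental model for the whole globe.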

Comments

  • 1.

    Here is a huge epidemic modeling source code library called FRED, which is overly ambitious:

    "Framework for Reconstructing Epidemiological Dynamics" https://github.com/PublicHealthDynamicsLab/FRED/

    This style of program could benefit from a rule-based architecture or a stochastic Petri net, where all the busy-work of manipulating data structures would be vastly reduced (see the sketch at the end of this comment).

    If I haven't mentioned it before, this application of signalling Petri nets looks interesting: "Graphical & Computational Modelling of Biological Pathways", https://youtu.be/1IPOIE0PvQY

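    As a rough, untested sketch of what I mean (illustrative rates and population sizes, nothing taken from FRED): a stochastic SIR Petri net driven by Gillespie's algorithm keeps the transitions as plain data, so adding compartments or regions means adding rows to a table rather than new bookkeeping code.

    ```python
    import random

    # Each transition: (rate constant, tokens consumed, tokens produced).
    # Mass-action infection with N = 1000 and recovery rate 0.1 -- illustrative only.
    TRANSITIONS = [
        (0.3 / 1000, {"S": 1, "I": 1}, {"I": 2}),  # infection: S + I -> 2I
        (0.1,        {"I": 1},         {"R": 1}),  # recovery:  I -> R
    ]

    def propensity(rate, consumed, state):
        a = rate
        for place, k in consumed.items():
            for j in range(k):
                a *= max(state[place] - j, 0)
        return a

    def gillespie(state, t_max=200.0):
        t = 0.0
        while t < t_max:
            props = [propensity(r, c, state) for r, c, _ in TRANSITIONS]
            total = sum(props)
            if total == 0:
                break                       # nothing can fire any more
            t += random.expovariate(total)  # exponential waiting time
            pick = random.uniform(0, total)
            for a, (_, consumed, produced) in zip(props, TRANSITIONS):
                pick -= a
                if pick <= 0:               # fire this transition
                    for p, k in consumed.items():
                        state[p] -= k
                    for p, k in produced.items():
                        state[p] += k
                    break
        return state

    print(gillespie({"S": 990, "I": 10, "R": 0}))
    ```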
  • 2.

    I'm concerned about being able to model the boundaries of countries, as they are fractal.

  • 3.

    I mentioned that these projects are overly ambitious. Besides the FRED project above, there is the program by Neil Ferguson of Imperial College, which is what the UK used for decision making. The two are very similar in scope. One fellow has the source code and is compiling and executing the models:

    "I’ve decided to run 2 extreme of the scale runs first with NR=10. Each one takes about 4 hours on my iMac ! The first failed because I had the output file open in Excel !"

    I will stay away from this. My preferred approach would be to apply statistical mechanics instead of performing Monte Carlo simulations on an overly detailed model (see the note at the end of this comment).


    Bill Shipley, Science, 3 November 2006, Vol. 314, No. 5800, pp. 812–814, https://science.sciencemag.org/content/314/5800/812

    "Curiously, given the historical dominance of the demographic Lotka-Volterra equations, Volterra recognized the difficulties of this approach and even considered a statistical mechanistic approach (22). Very few authors have followed his lead (23–31)."

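    To sketch what I mean (this is standard stochastic Petri net / master equation material, with the usual SIR infection and recovery rates $\beta$ and $\gamma$; nothing here is new): instead of generating sample paths, one can evolve the probability distribution over states directly,

    $$\frac{d}{dt}P(S,I,R;t) = \beta (S+1)(I-1)\, P(S+1,I-1,R;t) + \gamma (I+1)\, P(S,I+1,R-1;t) - \big(\beta S I + \gamma I\big)\, P(S,I,R;t),$$

    and the lowest-order moment closure $\langle S I\rangle \approx \langle S\rangle\langle I\rangle$ recovers the familiar rate equations

    $$\frac{d\langle S\rangle}{dt} = -\beta \langle S\rangle\langle I\rangle, \qquad \frac{d\langle I\rangle}{dt} = \beta \langle S\rangle\langle I\rangle - \gamma \langle I\rangle, \qquad \frac{d\langle R\rangle}{dt} = \gamma \langle I\rangle.$$

    Higher-order closures would be one statistical-mechanics route to corrections beyond this mean-field picture, without simulating an agent-level model.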