

I have 4 people wanting to become grad students of mine here at UCR. I need to hurry up and apply for NSF grants to support these people - deadlines are coming up. There are various possible choices of where I could apply:

- Physics of living systems, October 24.
- Mathematical physics, October 31.
- Probability, October 23-November 7.
- Mathematical biology, November 1-15. ("May be selected as a secondary program", but they suggest applying to the related program with the earliest deadline.)

Looking at the proposals that have been accepted, Probability seems better than I might have thought - there are some proposals on Markov processes that seem not drastically different from what I'm thinking about.

Of course the problem is that I'm not a known 'big name' in any of these areas except mathematical physics. So, maybe I should write an application emphasizing statistical mechanics and submit it to Mathematical Physics.

I don't even know if it increases my chances, or how much it increases my chances, if I apply to more different grants! I feel like a complete newbie, though I did get an NSF grant under the category Physics at the Information Frontier, which helped support Mike Stay and Alex Hoffnung for several years.

## Comments

I knew about your papers before your blog, but then realized I'd known about your blogs long before :D I recall running into them while searching for stuff. In any case, you're probably a bigger name in these areas than you think! I even became a reviewer for a few journals in the area of chemical reaction networks, just after writing those blog posts for you!

I wait to invite your new students to Torino!

Jacob wrote:

> I wait to invite your new students to Torino!

Great, thanks! I've got four people starting to meet me regularly now. They officially become grad students of mine if and when they write a paper (with lots of help from me) and pass an oral exam on it. The business about writing a paper is my own personal requirement, to weed out people who are lazy or can't write.

Three are first-year students, so it may take them a couple of years. One is a fourth-year student, so we'll know soon about him.

Here's some more grant news:

> FQXi will announce the start of our next Large Grant program on Monday, October 15, 2012. Please check our website for information on Monday. We will also send out another Member email next week with all the details. We want to make sure this request for proposals (RFP) gets circulated far and wide, so if you have suggestions of people or places we should send the RFP when it's announced next week, please let me know. We'll have $3 million in grant funds available for this round of grants, more than ever before!

I've been told that this next grant program will focus on information theory, and that I'm one of the people they want to fund. So, I'm definitely going to apply.

Here is another call that might be relevant for funding a workshop:

> Proposals are sought for focused workshops and SQuaREs to be held at AIM in Palo Alto. The deadline for proposals is November 1 for activities to be held in 2013 or 2014.
>
> Funded workshops provide full support (travel, hotel, and meals) for up to 28 participants for a 5-day workshop.
>
> The SQuaREs program provides full funding for 4 to 6 people to work together for a week at AIM, with the possibility of returning for a second or third meeting in subsequent years.
>
> We are happy to provide assistance in all aspects of proposal preparation. Both proposal forms are relatively short and can be found at:
>
> [for workshops](http://www.aimath.org/research/workshopproposals.html)
> [for SQuaREs](http://www.aimath.org/research/squares.html)

The second is the NSF call for international collaborations. (This money can't be used for workshops, but it can be used to collaborate on new projects, provided the people collaborating have not already collaborated too much):

[link](http://www.nsf.gov/pubs/2012/nsf12573/nsf12573.htm)

Note that in this case, Jason Morton and I wrote a paper together and contacted the NSF about this call. They told us that we had already collaborated too much for this, as it is aimed at funding *new* international collaborations. If anyone has ideas for *new* international collaborations, it might be a quick way to get the ball rolling!

Thanks, Jacob!

Kavita Rajanna at the Foundational Questions Institute writes:

> As promised, I'm emailing with news of FQXi's new large grant program. The RFP for FQXi's Physics of Information program is now available online here: [http://fqxi.org/grants/large/initial](http://fqxi.org/grants/large/initial).
>
> FQXi will award one- and two-year grants, totaling $3 million, to topical, innovative, and unconventional research and outreach projects. We will award grants ranging from $50,000 to $200,000. All grants will have a start date of September 1, 2013 and run through August 31, 2014 (one-year grants) or August 31, 2015 (two-year grants). Initial applications must be submitted through our online application ([http://fqxi.org/grants/large/initial/application](http://fqxi.org/grants/large/initial/application)) by January 16, 2013.
>
> Possible ideas for a successful proposal include: What is the precise relationship between information and reality? Can information exist without matter, and vice versa, or are they two sides of the same coin? Are there fundamental limits to information processing by physical systems, such as computers or the universe as a whole? You can see more possible questions here: [http://fqxi.org/grants/large/initial/examples](http://fqxi.org/grants/large/initial/examples).
>
> In addition to research and outreach projects in physics, this RFP is open to those in related fields including cosmology, astrophysics, mathematics, computer science, complex systems, biology, and neuroscience. Please share this RFP far and wide with friends, colleagues, and any likely applicants. There is a share tool on the RFP site that can easily enable sharing through email or social media (again, that page is [http://fqxi.org/grants/large/initial](http://fqxi.org/grants/large/initial)).
>
> If you have any questions about the RFP or suggestions for possible RFP outreach, please be in touch: rajanna@fqxi.org.
>
> Best,
> Kavita

Just for fun, some of you might like to see a draft of the abstract for my grant proposal for the NSF's [Mathematical Physics](http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=503537) program. It's supposed to fund two grad students so they don't have to work as teaching assistants.

> **Quantum Techniques for Stochastic Physics**
>
> It has been known for some time that much of the mathematics underlying quantum theory applies to collections of classical objects interacting in a random (or 'stochastic') way, if we reinterpret the wavefunction as describing a probability distribution instead of an amplitude. Surprisingly, the machinery of annihilation and creation operators obeying their usual commutation relations is applicable to collections of objects ranging from molecules (treated, as chemists often do, in a stochastic rather than quantum-mechanical way) to biological organisms and queues (collections of entities 'waiting in line'). Work on this topic has thus been carried out by diverse communities of researchers, including chemists, biologists, and computer scientists. Sadly, these researchers have not adequately communicated their results to each other. Indeed, many have failed to note that the mathematical tools involved have been deeply studied within mathematical physics.
>
> During a recent 2-year visit to the Centre for Quantum Technologies, the PI began to address this problem by writing a textbook, _A Course on Quantum Techniques for Stochastic Mechanics_, with the help of Jacob Biamonte. Freely available on the arXiv in draft form, this book explains how techniques familiar in quantum physics can be extended to stochastic processes involving collections of classical objects. Not merely a summary of existing work, it also shows for the first time how various tools of quantum theory - for example Noether's theorem and coherent states - can be used to study stochastic multi-body processes. It contains a rich set of examples taken from chemistry, population biology and other subjects.
>
> The research funded by this grant will develop quantum techniques for stochastic physics still further, with the help of two graduate students. Starting with Horn, Jackson and Feinberg, mathematically minded chemists have proved a number of powerful theorems on the existence, uniqueness and stability of equilibrium states for stochastic many-body processes. The PI has shown that these theorems arise by applying Perron-Frobenius theory to a Markov process indirectly related to the actual problem at hand, and then using a nonlinear map to transfer information to the problem at hand. Some simple ideas from cohomology theory also play a role. By making the underlying mathematics explicit, it should be possible to extend these theorems to their full range of applicability; currently they only apply to a limited class of systems. There are also stochastic many-body systems exhibiting periodic behavior or bistability, at least in the limit of large numbers: some of these function as 'biological clocks' and 'switches' in living organisms. Characterizing these systems should have useful applications in biophysics.
>
> There are many aspects of this project that graduate students can assist with, especially since the PI is working with two students who have done undergraduate work in physics. This project will also help with their professional development.

If anyone here knows anything about getting grants, especially NSF grants, suggestions for how to improve this would be appreciated!
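To make the formalism in this abstract concrete, here is a toy numerical sketch of my own (not from the proposal): the master equation dp/dt = Hp for a linear birth-death process, with H built from truncated annihilation and creation matrices. The check is that H is 'infinitesimal stochastic' - its columns sum to zero away from the truncation cutoff - so total probability is conserved.

```python
import numpy as np

# Toy sketch (my own, not from the proposal): the master equation dp/dt = H p
# for the reactions X -> 2X (rate b per particle) and X -> 0 (rate d per
# particle), written with annihilation/creation matrices truncated at N particles.
N = 50
a = np.diag(np.arange(1.0, N + 1), k=1)   # annihilation: a|n> = n|n-1>
a_dag = np.diag(np.ones(N), k=-1)         # creation:     a_dag|n> = |n+1>
b, d = 1.0, 2.0

# Each reaction contributes (what happens) minus (rate of it happening):
H = b * (a_dag @ a_dag @ a - a_dag @ a) + d * (a - a_dag @ a)

# H is 'infinitesimal stochastic': columns sum to zero (except at the cutoff),
# so the master equation conserves total probability.
col_sums = H.sum(axis=0)
print(np.allclose(col_sums[:-1], 0.0))

# Evolve by forward Euler from a single particle; with d > b the population
# almost surely goes extinct, so probability piles up at n = 0.
p = np.zeros(N + 1)
p[1] = 1.0
dt = 1e-3
for _ in range(5000):                     # integrate to t = 5
    p = p + dt * (H @ p)
print(abs(p.sum() - 1.0) < 1e-6, p[0] > 0.9)
```

The specific rates and truncation level here are arbitrary choices for illustration; the same template works for any reaction network once each transition is written in the (what happens) minus (rate) form.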

And here's the draft of the abstract for my grant proposal for the NSF's [Probability](http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5555) program:

> **Information geometry and Markov processes**
>
> The PI will use techniques of information geometry to study a class of continuous-time Markov processes described by 'stochastic Petri nets', also known as 'chemical reaction networks'. This formalism describes collections of things that randomly interact and turn into things of different kinds. In applications to chemistry these things could be molecules of different kinds, while in applications to biology they could be organisms with different genotypes. As usual, these Markov processes obey a differential equation: the master equation. But in the limit of large numbers, the expected number of things of each kind obeys a differential equation of its own: the 'rate equation'.
>
> The stability of equilibrium solutions of the rate equation is often proved with the help of a version of the second law of thermodynamics: a function akin to entropy increases, and this can be used to show that any solution starting near equilibrium approaches equilibrium as time goes to infinity. However, this only holds for a certain special class of systems. The increase of (relative) entropy for reversible Markov processes is known to hold much more generally, and not just for ordinary Shannon entropy but also for many variants, such as Rényi entropies. Recent work on information geometry by Harremoës and others has shown that the 'majorization order' on probability measures is a powerful way to understand all these results.
>
> By fully exploiting this, the PI plans to obtain better results on the approach to equilibrium for the rate equation, and even more importantly, a better understanding of situations where the rate equation does not have a stable equilibrium solution. A good example is the 'Brusselator'. This is a chemical reaction where solutions of the rate equation do not approach an equilibrium, but instead display periodic behavior. However, when stochastic fluctuations are taken into account, as in the master equation, there is indeed an increase of entropy and approach to equilibrium. Understanding this requires relating geometrical structures on the space of probability measures, where solutions of the master equation live, to the geometry of the space where solutions of the rate equation live.
>
> The PI will carry out this research with the help of two mathematics graduate students funded by the grant. This project will also help with their professional development.
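The entropy statement this abstract leans on can be checked in a few lines. A toy sketch of my own (not from the proposal): for any continuous-time Markov process dp/dt = Qp with stationary distribution pi, the relative entropy D(p(t) || pi) is non-increasing, since the time-t transition maps are stochastic and relative entropy contracts under stochastic maps. The generator below is a made-up 3-state example.

```python
import numpy as np

# Toy check (my own illustration): for a continuous-time Markov process
# dp/dt = Q p with stationary distribution pi, the relative entropy
# D(p(t) || pi) never increases.
Q = np.array([[-2.0,  1.0,  0.5],
              [ 1.5, -3.0,  1.0],
              [ 0.5,  2.0, -1.5]])   # columns sum to zero

# Stationary distribution: the null vector of Q, normalized to sum to 1.
w, v = np.linalg.eig(Q)
pi = np.real(v[:, np.argmin(np.abs(w))])
pi /= pi.sum()

def rel_entropy(p, q):
    return float(np.sum(p * np.log(p / q)))

# Evolve a far-from-equilibrium distribution by forward Euler and record
# the relative entropy at each step.
p = np.array([0.9, 0.05, 0.05])
dt, entropies = 1e-3, []
for _ in range(5000):                 # integrate to t = 5
    entropies.append(rel_entropy(p, pi))
    p = p + dt * (Q @ p)

# D(p(t) || pi) decreases monotonically toward 0.
print(all(x >= y for x, y in zip(entropies, entropies[1:])))
```

Note this monotonicity needs no reversibility; reversibility is what buys the stronger statements (Rényi entropies, majorization) mentioned in the abstract.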

These grant proposals seem so boring and narrow compared to what I actually want to do! I also hate how the _save the planet_ theme is completely absent. But I'm thinking that for these particular NSF programs, it's best to describe a focused project without lofty extra-scientific goals.

Since my proposal on 'Quantum Techniques for Stochastic Physics' is due at midnight Eastern Standard Time on October 31, when the witches and goblins come out and evaluate NSF proposals, I've been hurriedly writing it up.

People interested in Petri nets or stochastic processes might be amused by the one-page [summary](http://math.ucr.edu/home/baez/baez_quantum_techniques_summary.pdf), and might learn a bit from my half-finished [project description](http://math.ucr.edu/home/baez/baez_quantum_techniques_description.pdf). The latter includes some nice references.

I've come a lot further on the [project description](http://math.ucr.edu/home/baez/baez_quantum_techniques_description.pdf) for the 'Quantum Techniques in Stochastic Physics' proposal. It looks pretty good, except there's not enough detail in the 'Details' section about what I'll do, as opposed to what's been done. Of course I expect that I'll do something much cooler than I can imagine now - that's usually how it goes, and I'd be disappointed otherwise. But grant proposals seem to want you to say not only what mountain you'll climb but what you'll find on top.

The proposal for Quantum Techniques in Stochastic Physics has been submitted and the [completed project description](http://math.ucr.edu/home/baez/baez_quantum_techniques_description.pdf) is available at the same place. It's still quite pale compared to what I actually plan to do, but I'm hoping it's the sort of thing they want.

Here are some references some of you might like to read. Some of these I haven't read yet, and need to get.

General references:

- P. Dodd and N. Ferguson, [A many-body field theory approach to stochastic models in population biology](http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0006855), _PLoS ONE_ **4** (2009).
- M. Feinberg, _[Lectures On Reaction Networks](http://www.che.eng.ohio-state.edu/~FEINBERG/LecturesOnReactionNetworks)_, 1979.
- P. Goss and J. Peccoud, Quantitative modeling of stochastic systems in molecular biology by using stochastic Petri nets, _[Proc. Natl. Acad. Sci. USA](http://www.pnas.org/content/95/12/6750.full.pdf+html)_ **98** (1998), 6750-6755. (Not free, I think.)
- D. Kartson, G. Balbo, G. Conte, S. Donatelli and G. Franceschinis, _Modelling with Generalized Stochastic Petri Nets_, Wiley, 1994. (I need to find out what a _generalized_ stochastic Petri net is!)
- W. Reisig and G. Rozenberg, eds., _Lectures on Petri Nets_, two volumes, Springer, Berlin, 1998. (I haven't looked through this yet!)
- D. Wilkinson, _Stochastic Modelling for Systems Biology_, Taylor and Francis, 2006. (A fun easy introduction including programming techniques.)

On bistability:

- G. Craciun, M. Feinberg and Y. Tang, [Understanding bistability in complex enzyme-driven reaction networks](http://www.pnas.org/content/103/23/8697.abstract), _Proc. Nat. Acad. Sci. USA_ **103** (2006), 8697-8702.
- M. Feinberg, Multiple steady states for chemical reaction networks of deficiency one, _Arch. Rational Mech. Analysis_ **132** (1995), 371-406.

Biological clocks:

- D. Forger, Signal processing in cellular clocks, _[Proc. Nat. Acad. Sci. USA](http://www.pnas.org/content/early/2011/02/25/1004720108.full.pdf+htm)_ **108** (2011), 4281-4285.

I will add these references to our [[Petri net]] article.

Today I wrote a one-page [summary](http://math.ucr.edu/home/baez/baez_information_geometry_summary.pdf) for my grant proposal on 'Information geometry and Markov processes'. I managed to work a bit more of the 'green' theme into this proposal, though only slightly: since it's for the NSF program on probability theory I didn't want to overdo it, but I emphasized the applications to evolutionary biology. Now I need to write the actual proposal - and fast, too!

Everything is done for these NSF grant proposals now!

- Quantum techniques for stochastic physics: [project description](http://math.ucr.edu/home/baez/baez_quantum_techniques_description.pdf) and [summary](http://math.ucr.edu/home/baez/baez_quantum_techniques_summary.pdf).
- Information geometry and Markov processes: [project description](http://math.ucr.edu/home/baez/baez_information_geometry_description.pdf) and [summary](http://math.ucr.edu/home/baez/baez_information_geometry_summary.pdf).

The second one touches on biology more explicitly.

Now it's time to work on the FQXi [Physics of Information](http://fqxi.org/tools/download/__details/2013-Request-for-Proposals.pdf) grant. To get myself motivated, and maybe teach some of you how to apply for grants - those of you who are even worse at it than me - let me talk about it here.

The 'initial proposal' is due January 16, with a 'full proposal by invitation only' due April 30. I've gotten previous grants from the FQXi, and they've told me I'm the kind of guy they want to fund with this one, so I figure I just need to make the initial proposal _reasonably_ good and then, with luck, get to the next stage.

I think it's always good to study the request for proposals very carefully and write a proposal that - as much as one honestly can - gives them _exactly what they want_. They write:

> The past century in fundamental physics has shown a steady progression away from thinking about physics, at its deepest level, as a description of material objects and their interactions, and towards physics as a description of the evolution of information about and in the physical world. It has become clear that information lies at the heart of statistical mechanics (and thermodynamics), quantum mechanics, and even general relativity. In turn, deep analysis of these physical theories has led to significant advances in thinking about information.
>
> Moreover, recent years have shown an explosion of research interest at the nexus of physics and information, driven by the "information age" in which we live, and more importantly by developments in quantum information theory and computer science. These studies, however, tend to focus on practical applications and fundamental theory but not _foundational_ questions. Moreover, it is often unclear whether different researchers are even talking about the same fundamental entities, or what the relations are between them.
>
> FQXi requests proposals for rigorous research on the Physics of Information both in physics and also in related fields including cosmology, astrophysics, mathematics, computer science, complex systems, biology, and neuroscience. Funded research will address this gap between research and technological progress on information science on one side, and active study of the true physical nature of information on the other. It will also, by bringing together the community in a focused and intense effort, seek to help develop a common understanding of the different types of information and the roles they play. Most importantly, the research supported by this program should have significant implications for our understanding of the major questions across many scientific disciplines, and address the deep or 'ultimate' nature of reality.

So, I need to downplay the more practical applications a bit and focus on how my work addresses the 'ultimate nature of reality'. Good. I have various projects going that I could conceivably fund with this grant:

1. A study of [quantropy](http://johncarlosbaez.wordpress.com/2011/12/22/quantropy/), which is like entropy but with amplitudes replacing probabilities. The path-integral approach to quantum mechanics says that a quantum system takes all possible paths, with different amplitudes, with those amplitudes chosen in such a way that 'quantropy' is extremized. This is something I discovered, though Garrett Lisi (and probably other people) noticed something very similar. I think there's something 'foundational' about this work, since it expands the very important analogy between statistical mechanics (with probabilities) and quantum mechanics (with amplitudes). This analogy spawns ideas like 'imaginary time as inverse temperature' which are very important and practical yet deeply mysterious... we should understand these better!

2. My work with Mike Stay on [algorithmic thermodynamics](http://johncarlosbaez.wordpress.com/2011/01/06/algorithmic-thermodynamics-part-2/) is also 'foundational' in a way, since it says algorithmic entropy really is a kind of entropy, not just analogous to entropy. This lets us take concepts from thermodynamics and apply them to computation in new ways. However, I don't see how to push this work forward right now, and I don't think I want to get a student working on it. So this is not so good.

3. My work on [quantum information geometry](http://math.ucr.edu/home/baez/information/information_geometry_4.html) is also a bit 'foundational'. I'm not sure, but it _might_ provide a deeper explanation of Öttinger's work on dissipative mechanics (in his book _Beyond Equilibrium Thermodynamics_).

4. What I'm really interested in, more than any of the above things, is the relation between thermodynamics, evolutionary biology, game theory, and machine learning. I described some aspects of this in my [Information Geometry and Markov Processes](http://math.ucr.edu/home/baez/baez_information_geometry_description.pdf) proposal for the NSF. This is further from what people normally call 'foundational physics' than ideas 1 and 2, but it's a much bigger and more promising pile of ideas. I could try to sell it as 'foundational' if I could convince people that traditional reductionism is not the right way to reach deep new insights these days: that biochemistry and ecosystems and economies offer more food for thought than black holes and dark matter and string theory! Since they list "complex systems, biology, and neuroscience" as acceptable topics, this might fly.

Of course what I'd actually fund is not the process of thinking and writing about these topics, but workshops, travel, or grad students. Of these the only one I really feel like funding is grad students - and some travel for grad students, since they, much more than me, would benefit from going to some conferences.

The request for proposals kindly includes 'lists of questions and topics that make suitable targets for research funded under this program.' They also say these lists aren't exhaustive! But here they are:

* What IS information? What is its relation to “Reality”?
* What sorts of information are definable and useful? What are the relations between them?
* What is the physical foundation of information? Are information and reality two sides of the same coin?
* Is some information principle prior to (or more fundamental than) the laws of physics? Or, are the laws of physics necessary to have information processing?
* Is information an emergent feature of the deeper laws and structure of the world, or is the macroscopic world emergent from information?
* How does nature (the universe and things therein) “store” and “process” information?
* Are there fundamental limits to information storage and processing?
* What new perspectives can quantum information storage and processing give us?
* What is the connection between information and probability?
* How is nature shaped and transformed by processing information? How do physical, chemical, biological, or “mental” systems process information?
* In isolated systems, is the total “amount” of information always preserved? Or is there a Second Law of information? If so, how does this law relate to the Second Law of thermodynamics?
* How does understanding information help us understand physics, and vice-versa?
* Is information “discrete”? How does a continuous or discrete description of a physical system relate to its information content and processing? Can information theory shed light on open “discrete” questions such as the nature of space?
* What can information tell us about foundational questions in quantum mechanics, such as the true status of the wavefunction, or what exactly defines a quantum system?
* What is the relationship between information and statistical and thermodynamic entropy? Can information tell us about the relationship between these entropies?
* Is there a role for information physics in the study of quantum gravity? What might information-theoretic reasoning reveal about black holes, the universe before the Big Bang, or physics at the Planck scale?

I don't really care about a lot of these questions. The most interesting ones to me are:

> How is nature shaped and transformed by processing information? How do physical, chemical, biological, or “mental” systems process information?

But I take an unapologetically thermodynamic attitude toward information, so I usually tend to talk more about entropy.

The request for proposals also says:

> This RFP is intended to fill a gap, not a shortfall, in conventional funding. We wish to enable research that, because of its speculative, non-mainstream, or high-risk nature, would otherwise go unperformed due to lack of available monies. Thus, although there will be inevitable overlaps, an otherwise scientifically rigorous proposal that is a good candidate for an FQXi grant will generally not be a good candidate for funding by the NSF, DOE, etc.—and vice versa.

I knew this already, but it's good to remember: _they want your research to be unconventional!_ They also say:

> **Can you fund PhD student research assistantships?**
> Yes. However, because of the unconventional nature of the FQXi mission, full-time student support is discouraged. Potential students cannot directly apply for a studentship.

However, I have managed to fund students before - not full-time, that eats up money too fast. Here are things they like to fund:

> **What kinds of programs and requests are eligible for funding?**
> * Student or postdoctoral salary and benefits for part of the academic year
> * Summer salary and teaching buyout for academics
> * Support for specific projects during sabbaticals
> * Assistance in writing or publishing books
> * Modest allowance for computer equipment, publication charges, or supplies, provided that these items are clearly justified in the proposal
> * Experimental equipment (Keep in mind that while FQXi is very interested in experimental proposals, the total available funding means that funding for large equipment purchases will be unlikely.)
> * Modest travel allowance
> * Development of large workshops, conferences, or lecture series (Note that small programs of this type, and others costing less than US $15K, are best supported by an FQXi Mini-Grant. Mini-Grant applications, however, are restricted to FQXi Members.)
> * Development of outreach or educational programs for laypeople that disseminate knowledge regarding foundational questions in physics and cosmology (The impact criterion, in this case, will be judged on the proposal’s ability to disseminate knowledge rather than develop it.)

Quick, non-considered ideas:

"Foundational" is an unfortunate term, in that it implies everything else "rests upon it" (although I suppose "fundamental" also implies that). Interpreted just as "unifying" or "all-pervasive" principles, there's a school of thought, particularly but not only associated with Stephen Wolfram, that information and more specifically patterns of processing are what really matter and are "universal"; while they generally have physical components, these are only relevant to the extent they "combine to produce" a particular information processing pattern. (Analogous to the way that to understand chess it's irrelevant whether I'm playing with you with wooden pieces or via an interactive board on a tablet computer.) So while information processing patterns aren't (necessarily) "the foundations" of reality, they're what you really should study to understand things. Since you're particularly interested in unifying things (e.g., QM and Petri nets), you might be able to push the argument further for trying to understand the common kinds of information processing going on in these things when you completely ignore the actual physical details. This isn't so much a specific suggestion as a meta-suggestion as to one thing you might use to frame a specific approach.

(I have several difficulties with Wolfram's position and methodology, but it's not completely without its merits.)

Thanks, David! Yes, I can easily make an argument like this as to the fundamental importance of 'information' as a unifying idea. Since they've called for proposals on information theory, they'll be predisposed to accept this idea. My main challenge is to lay out a specific, doable yet ambitious-sounding, unconventional yet not insane-sounding project.

On an infinitely more mundane note, I spent the afternoon writing a CV of the particular sort they want. Now I need:

* a project title
* a 500-word project description for this initial proposal
* a dollar figure including 15% overhead
* a 200-word budget description

Thanks for sharing these; it's educational!


Hi, Jacob!

Okay, here's a proposal that clocks in at 469 words. I don't really say what I'm going to _do_ (except figure out the mysteries of the universe), and I don't say why I need money to do it. So, I'll probably have to shorten it a bit and stick in a bit of that stuff. But I think it sounds sufficiently far-out and 'foundational' while still containing completely no bullshit:

> There is an extensive analogy between quantum mechanics and thermodynamics, which however remains deeply mysterious. In the path-integral approach to quantum mechanics, particles take different paths with amplitudes depending on the action of those paths. In the statistical mechanics approach to thermodynamics, a system in equilibrium occupies different states with probabilities depending on the energy of those states. The formulas are very similar, and this resemblance is widely used to take techniques from one subject and apply them to the other.
>
> But what does it all mean? Is it just a "trick", or something deeper? I believe it cannot be just a trick: it is trying to tell us something.
>
> We have many clues. For example, in this analogy, probabilities are analogous to amplitudes. This seems to go against the usual recipe of computing probabilities in quantum mechanics by taking the absolute values of amplitudes and then squaring them, though of course it must be compatible. Jacob Biamonte and I have written a book _Quantum Techniques in Stochastic Physics_, available online, which shows what we can do when we treat probabilities as directly analogous to amplitudes: there are interesting applications to biology and chemistry. I am now in the position to think more deeply about the relation between probabilities and amplitudes.
>
> Furthermore, in this analogy entropy is analogous to a little-known quantity which I call "quantropy". Statistical mechanics is governed by the principle of maximum entropy. In particular, a system in thermal equilibrium has probabilities of being in different states that are chosen to maximize entropy (subject to a constaint on the expected energy). I have shown that in quantum mechanics a similar principle holds: the amplitudes for a particle to take different paths are chosen to extremize "quantropy" (subject to a constraint on the expected action).
>
> What does all this have to do with information? Since entropy can be seen as "missing information", the maximum entropy principle in statistical mechanics can be seen as a principle of minimum information. As emphasized by Jaynes, this tells us that in situations of incomplete knowledge we should describe a system in a probabilistic way which involves the least possible information. A similar principle underlies "Solomonoff induction", a rigorous form of Occam's razor in which we choose the theory that minimizes algorithmic information. In a paper published with Mike Stay, I have shown that algorithmic information can really be seen as a _special case_ of information in the sense of statistical mechanics: that is, up to a minus sign, entropy.
>
> Putting this all together, it means we can describe the path-integral approach to quantum mechanics in terms of something closely akin to information, but different: "quantropy". (This is not to be confused with the entropy of a quantum system: for one thing, it is complex-valued!) I think this is a fascinating starting-point for further investigations.

There's probably a bunch of detail that's irrelevant for this short proposal, but will be helpful later.

I'll keep polishing this...
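The maximum entropy principle the proposal leans on is easy to check numerically. Here's a minimal sketch (toy energy levels of my own choosing, not from the proposal): the Gibbs distribution satisfies the identity S = ln Z + β⟨E⟩, and has strictly larger entropy than any nearby distribution with the same expected energy.

```python
import math

def entropy(p):
    # Shannon entropy in nats: S = -sum p_i ln p_i
    return -sum(x * math.log(x) for x in p if x > 0)

energies = [0.0, 1.0, 2.0]  # made-up three-level system
beta = 1.0

# Gibbs distribution p_i = exp(-beta E_i) / Z
Z = sum(math.exp(-beta * E) for E in energies)
gibbs = [math.exp(-beta * E) / Z for E in energies]
mean_E = sum(p * E for p, E in zip(gibbs, energies))

# Identity S = ln Z + beta <E>:
assert abs(entropy(gibbs) - (math.log(Z) + beta * mean_E)) < 1e-12

# Perturbation direction v with sum(v) = 0 and sum(v_i E_i) = 0, so that
# gibbs + eps*v keeps both normalization and expected energy fixed.
v = [1.0, -2.0, 1.0]
for eps in (0.01, -0.01, 0.03):
    perturbed = [p + eps * vi for p, vi in zip(gibbs, v)]
    assert all(x > 0 for x in perturbed)
    assert abs(sum(q * E for q, E in zip(perturbed, energies)) - mean_E) < 1e-12
    # Gibbs maximizes entropy on this constraint surface:
    assert entropy(perturbed) < entropy(gibbs)

print(round(entropy(gibbs), 4))
```

Any other direction preserving both constraints works equally well; for three levels the constraint surface is one-dimensional, so this single direction covers it.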

Next attempt:

> There is an extensive analogy between quantum mechanics and statistical mechanics, but it remains deeply mysterious. In quantum mechanics, a system takes all possible paths with the passage of time, each with an amplitude depending on the action of that path. In statistical mechanics, a system in thermal equilibrium occupies all possible states, each with a probability depending on the energy of that state. The formulas are very similar, and this resemblance is widely used to take techniques from one subject and apply them to the other.
>
> But what does it all mean? Is it just a "trick", or something deeper? I believe it cannot be just a trick: it is trying to tell us something.
>
> We have many clues. For example, in this analogy, probabilities are analogous to amplitudes. Jacob Biamonte and I are writing a book _Quantum Techniques in Stochastic Physics_, available online, which shows what we can do with this fact. We present interesting applications to biology and chemistry. I am now ready to think more deeply about the foundational aspects.
>
> Furthermore, in this analogy entropy is analogous to a little-known quantity which I call "quantropy". Statistical mechanics is governed by the principle of maximum entropy. A system in thermal equilibrium has probabilities of being in different states that are chosen to maximize entropy subject to a constaint on the expected energy. I have shown that in quantum mechanics a similar principle holds: the amplitudes for a system to take different paths are chosen to extremize quantropy subject to a constraint on the expected action. I use the word "extremize" instead of "minimize" because like amplitudes, quantropy is not a real number: it is complex.
>
> What does all this have to do with information? Since entropy can be seen as "missing information", the maximum entropy principle can be seen as a principle of minimum information. As emphasized by Jaynes, this says that in situations of incomplete knowledge we should describe a system in a way which involves the least amount of extraneous information. A similar principle underlies Solomonoff induction, a rigorous form of Occam's razor in which we choose the theory that minimizes algorithmic information. In a paper published with Mike Stay, I have shown that algorithmic information is truly a special case of information in the sense of statistical mechanics.
>
> Putting this all together, it means that the motion of quantum systems extremizes something closely akin to information, but different: "quantropy". This is a fascinating starting-point for further investigations. Are information and quantropy two aspects of a single unified concept? I hope get more insights in part by working through examples, and in part by further extending the network of analogies discussed here.

I'll add a little about what I need the money for when I find out how much various things cost.
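For reference, the parallel the draft describes can be written side by side; this is just the analogy as stated above, with λ playing the role of i/ħ (and with the usual caveat about choosing a branch of the complex logarithm):

```latex
% Statistical mechanics: probabilities maximize entropy
% subject to a constraint on the expected energy.
p_i = \frac{e^{-\beta E_i}}{Z(\beta)}, \qquad
Z(\beta) = \sum_i e^{-\beta E_i}, \qquad
S = -\sum_i p_i \ln p_i = \ln Z + \beta \langle E \rangle .

% Quantum mechanics: amplitudes extremize quantropy
% subject to a constraint on the expected action.
a_x = \frac{e^{-\lambda A(x)}}{Z(\lambda)}, \qquad
Z(\lambda) = \sum_x e^{-\lambda A(x)}, \qquad
Q = -\sum_x a_x \ln a_x = \ln Z + \lambda \langle A \rangle ,
\qquad \lambda = \tfrac{i}{\hbar} .
```

Both lines come from the same Lagrange-multiplier computation: varying $-\sum p \ln p$ with multipliers for normalization and for the expectation constraint forces $\ln p_i$ to be affine in $E_i$ (respectively, $\ln a_x$ affine in $A(x)$), which is exactly the exponential form above.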

Hi! It's really a very interesting abstract; in fact, there are lots of things there that I didn't know about and should know more about! I've actually started today to write a stochastic rate equation representing the Yule process into our formalism. This is very helpful for my own understanding. The Yule process gives the iconic power-law distribution found in modern network theory (e.g., terms like "the rich get richer", or cumulative advantage). I think one thing that we're certainly missing is a way to make a stochastic Hamiltonian quantum, and then a way to arrive at the stochastic version as perhaps the first order term in a series.

We talked about this in Singapore, but there was not much past doing some sort of ugly trick that involves basically making the stochastic Hamiltonian hermitian. If there was something deeper here, some connection between quantum physics and stochastic physics in our framework it might be very interesting.

By the way, in the abstract it says "hope get" so it seems you're off by a factor of "to" :D

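The Yule process Jacob mentions can be sketched in a few lines. This is a deliberately simplified version with made-up parameters: each new member founds a new group with some small probability, and otherwise joins an existing group with probability proportional to its size, so the group sizes develop a heavy tail ("the rich get richer").

```python
import random

def yule_sizes(steps, p_new=0.1, seed=42):
    """Simplified Yule / preferential-attachment process: each step adds one
    member, which founds a new group with probability p_new and otherwise
    joins an existing group chosen with probability proportional to its size."""
    rng = random.Random(seed)
    sizes = [1]   # group sizes
    owner = [0]   # one entry per member; a uniform pick gives a size-biased group
    for _ in range(steps):
        if rng.random() < p_new:
            sizes.append(1)
            owner.append(len(sizes) - 1)
        else:
            g = owner[rng.randrange(len(owner))]  # 'rich get richer'
            sizes[g] += 1
            owner.append(g)
    return sizes

sizes = yule_sizes(20000)
# Every step adds exactly one member:
assert sum(sizes) == 20001
# Heavy tail: the biggest group vastly exceeds the average group size.
print(max(sizes), sum(sizes) / len(sizes))
```

The `owner` list is the standard urn trick: picking a uniformly random member and looking up its group is exactly a size-biased choice, with no per-step scan over all groups.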

I don't know anything about the Yule process, so that's something for me to learn about!

> I think one thing that we're certainly missing is a way to make a stochastic Hamiltonian quantum, and then a way to arrive at the stochastic version as perhaps the first order term in a series.

Yes. There was a guy - I forget his name - who hounded me about this both on my blog and on Google+. He said it was pointless just _assuming_ we had a Markov process: we need to figure out how quantum processes approximately act like Markov processes. Of course we don't really "need to" do this, but it would be good to try.

Thanks for catching the typo.

I realize I should say I'm _going_ to do stuff about quantropy, instead of saying I _have_. I've done it on my blog, but I haven't published it. I need to polish it up and publish it - to academics, that counts as "doing it". Otherwise in grant proposals nobody could predict what they're going to do!

Any time anyone mentions power laws, I parrot-like repeat the point that not all significantly decreasing curves are actually power laws, and that there's a tendency among some applied physicists (not anyone here, but in the general literature) to assume they have got a power law and, on that assumption, fit some parameters to it without any further scrutiny, which is [not reliable methodology](www.stat.cmu.edu/~cshalizi/2010-10-18-Meetup.pdf).
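David's warning can be made concrete with a toy calculation (my own setup, exact exponential data, no fitting library): the survival function of an exponential law, which is certainly not a power law, still yields a respectable-looking straight-line fit on a log-log plot over one decade.

```python
import math

# Exact survival function of an exponential law, S(x) = exp(-x),
# sampled at x = 1..10 (one decade).
xs = list(range(1, 11))
u = [math.log(x) for x in xs]   # log x
y = [-float(x) for x in xs]     # log S(x) = -x, exactly

# Naive power-law "detection": least squares on the log-log plot.
n = len(xs)
mu, my = sum(u) / n, sum(y) / n
suy = sum((a - mu) * (b - my) for a, b in zip(u, y))
suu = sum((a - mu) ** 2 for a in u)
syy = sum((b - my) ** 2 for b in y)
slope = suy / suu               # would be reported as the "power-law exponent"
r2 = suy ** 2 / (suu * syy)

# The fit looks convincing (R^2 around 0.9) even though no power law is present.
print(f"apparent exponent {slope:.2f}, R^2 = {r2:.3f}")
```

A high R² on a log-log plot is therefore weak evidence by itself; goodness-of-fit tests against alternatives (log-normal, stretched exponential) are what actually discriminate.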

Final version of the FQXi short proposal, coming in at 487 words:

> There is an extensive analogy between quantum mechanics and statistical mechanics, but it remains deeply mysterious. In quantum mechanics, a system takes all possible paths with the passage of time, each with an amplitude depending on the action of that path. In statistical mechanics, a system in thermal equilibrium occupies all possible states, each with a probability depending on the energy of that state. This analogy is widely used to take techniques from one subject and apply them to the other.
>
> But what does it all mean? Is it just a “trick”, or something deeper? I believe it is trying to tell us something.
>
> We have many clues. For example, in this analogy, probabilities in statistical mechanics are analogous to amplitudes in quantum mechanics. Jacob Biamonte and I are writing a book _Quantum Techniques in Stochastic Physics_, available online, which shows what we can do with this fact.
>
> Furthermore, in this analogy entropy is analogous to a little-known quantity which I call “quantropy”. Statistical mechanics is governed by the principle of maximum entropy. A system in thermal equilibrium has probabilities of being in different states that are chosen to maximize entropy subject to a constraint on the expected energy. In preliminary calculations I have shown that in quantum mechanics a similar principle holds: the amplitudes for a system to take different paths are chosen to extremize quantropy subject to a constraint on the expected action. I use the word “extremize” instead of “maximize” since while entropy is real, quantropy is complex.
>
> What does all this have to do with information? Since entropy can be seen as “missing information”, the maximum entropy principle can be seen as a principle of minimum information. As emphasized by Jaynes, this says that in situations of incomplete knowledge we should describe a system in a way which involves the least amount of extraneous information. A similar principle underlies Solomonoff induction, a rigorous form of Occam’s razor in which we choose the theory that minimizes algorithmic information. In a paper "Algorithmic Thermodynamics", published with Mike Stay, I have shown that algorithmic information is truly a special case of information in the sense of statistical mechanics.
>
> Putting this all together, it means that the motion of quantum systems extremizes something closely akin to information, but different: “quantropy”. This is a fascinating starting-point for further investigations. Are information and quantropy two aspects of a single unified concept? I hope to get more insights into this by writing a series of papers which develop the ideas sketched here.
>
> To do this effectively, my proposal includes a small teaching load reduction and also part-time support for one graduate student. I find that working with a student on some topic, and the responsibility of supervising their thesis, speeds progress immensely.
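As a sanity check on the quantropy story, the identity Q = ln Z + λ⟨A⟩ (the complex analogue of S = ln Z + β⟨E⟩) can be verified numerically for a toy system. The actions below are made up, and are kept small so the principal branch of the complex logarithm causes no trouble:

```python
import cmath

# Toy set of "paths" with actions A(x); lam plays the role of i/hbar.
actions = [0.0, 0.3, 0.7, 1.1]
lam = 1j  # i/hbar with hbar = 1

Z = sum(cmath.exp(-lam * A) for A in actions)
amps = [cmath.exp(-lam * A) / Z for A in actions]   # complex "amplitudes"

# Complex analogues of expected energy and of entropy:
mean_A = sum(a * A for a, A in zip(amps, actions))
Q = -sum(a * cmath.log(a) for a in amps)            # "quantropy"

# Amplitudes sum to 1, and Q = ln Z + lam * <A>, mirroring S = ln Z + beta <E>.
print(abs(sum(amps) - 1), abs(Q - (cmath.log(Z) + lam * mean_A)))
```

For larger actions the principal branch of `cmath.log` can jump by 2πi, which is exactly the branch ambiguity in defining quantropy; the identity then holds only modulo 2πi times an amplitude-weighted integer.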

Just some typos...

> maximize entropy subject to a constaint

and

> I use the word “extremize” instead of “minimize” while entropy is real, quantropy is complex.

Bad grammar and I think you mean `maximise'.

Ugh, I read this just after submitting the proposal! Oh well. It'll help for the longer proposal assuming I make the first cut.


No way to edit the submitted proposal...


... but I'll edit it here.


Yay, my FQXi proposal made it through the first cut! Now I need to write a longer proposal by April 30th. (For some reason, "success" in academia means "more work".) Here's what they told me today:

> Dear John,
>
> On behalf of FQXi, Congratulations---Our review panel has now concluded, and we're happy to let you know that they selected your proposal as one of the finalists in our 2013 Grant competition, on Physics of Information. We now invite you to submit a Full Proposal based on your Initial one.
>
> Due to our limited budget and a large number of solid entries, competition was very tough, and we had to decline many promising entries. This initial review process avoids inflicting the community with a very small acceptance rate for proposals that take great effort to prepare.
>
> We can't provide individual feedback; however, in general, the panel looked closely at a proposal's relevance to the FQXi mission and the topic category, and to the estimated scientific impact-per-dollar. Successful proposals had to score highly in both categories. We recommend you continue to highlight these qualities.
>
> Please begin preparing your Full Proposal using the guidelines in the RFP, available at our website, where you will again submit your proposal:
>
> [http://fqxi.org/grants/large/full](http://fqxi.org/grants/large/full)
>
> (The site will be up and running early next week). The deadline for submission remains 11:59 PM EST, Tuesday 30 April.
>
> Thank you again for your time and effort. An increase in entries for each successive RFP indicates that more and more researchers view foundational questions as important and interesting. We're glad you see FQXi as a useful part of the effort to answer these questions.
>
> Sincerely,
> Brendan Foster & Kavita Rajanna

The deadline was extended, so I have until 14 May 2013 to write my more detailed FQXi proposal... and now I have to get cracking! Here is the deal:

> FULL PROPOSAL—DUE May 14, 2013—Must Include:
> * A 200-word project abstract, suitable for publication in an academic journal
> * A project summary not exceeding 200 words, explaining the work and its significance to laypeople
> * A detailed description of the proposed research, not to exceed 15 single-spaced 11-point pages, including a short statement of how the application fits into the applicant's present research program, and a description of how the results might be communicated to the wider scientific community and general public
> * A detailed budget over the life of the award, with justification and utilization distribution (preferably drafted by your institution’s grant officer or equivalent)
> * A list, for all project senior personnel, of all present and pending financial support, including project name, funding source, dates, amount, and status (current or pending)
> * Evidence of tax-exempt status of grantee institution, if other than a US university
> * Names of 3 recommended referees
> * Curricula Vitae for all project senior personnel, including:
>   * Education and employment history
>   * A list of references of up to five previous publications relevant to the proposed research, and up to five additional representative publications
>   * Full publication list
> * For past awardees only: A 250-word statement explaining what was done with previous funding and how that ties in to the current proposal (if at all)

Here is a 206-word abstract, too long by 6 words. Suggestions on how to cut it down would be great!

> There is a famous analogy between quantum mechanics and statistical mechanics. In quantum mechanics, a system takes all possible paths with the passage of time, each with an amplitude depending on the action of that path. In statistical mechanics, a system in thermal equilibrium occupies all possible states, each with a probability depending on the energy of that state.
>
> While famous, this analogy remains mysterious. I believe it is trying to tell us something about information.
>
> Occam's razor says the best theory is the one that uses the least information to fit the data. Since entropy is "missing information", this can be made precise by the principle of maximum entropy, which says a theory should maximize entropy subject to constraints given by the data. The probabilities in statistical mechanics can be computed using this principle.
>
> Calculations suggest a similar principle holds in quantum mechanics: the amplitudes for a system to take different paths extremize a quantity which I call "quantropy".
>
> Quantropy is not the same as the entropy of a quantum system. Nonetheless, we can ask: are information and quantropy two aspects of a single unified concept? After more carefully proving that quantum mechanics extremizes quantropy, I will be in the position to explore this question.

The last sentence is a bit weak - I don't want to just "be in the position" to do something, I want to do it. Maybe that's one way to cut some words.

Well, okay - 201 words:

> There is a famous analogy between quantum mechanics and statistical mechanics. In quantum mechanics, a system takes all possible paths with the passage of time, each with an amplitude depending on the action of that path. In statistical mechanics, a system in thermal equilibrium occupies all possible states, each with a probability depending on the energy of that state.
>
> While famous, this analogy remains mysterious. I believe it is trying to tell us something about information.
>
> Occam's razor says the best theory is the one that uses the least information to fit the data. Since entropy is "missing information", this can be made precise by the principle of maximum entropy, which says a theory should maximize entropy subject to constraints given by the data. The probabilities in statistical mechanics can be computed using this principle.
>
> Calculations suggest a similar principle holds in quantum mechanics: the amplitudes for a system to take different paths extremize a quantity which I call "quantropy".
>
> Quantropy is not the same as the entropy of a quantum system. Nonetheless, we can ask: are information and quantropy two aspects of a single unified concept? After more carefully proving that quantum mechanics extremizes quantropy, I will explore this question.

Sorry, this is boring but now I'm down to 188 words:

> There is a famous analogy between quantum mechanics and statistical mechanics. In quantum mechanics, a system takes all possible paths with the passage of time, each with an amplitude depending on the action of that path. In statistical mechanics, a system in thermal equilibrium occupies all possible states, each with a probability depending on the energy of that state.
>
> While famous, this analogy remains mysterious. I believe it is trying to tell us something about information.
>
> Occam's razor says the best model is the one that uses the least information to fit the data. Jaynes noted that since entropy is "missing information", Occam's razor has a precise statement in the principle of maximum entropy. The probabilities in statistical mechanics obey this principle.
>
> Calculations suggest a similar principle holds in quantum mechanics: the amplitudes for a system to take different paths extremize a quantity which I call "quantropy".
>
> Quantropy is not the same as the entropy of a quantum system. Nonetheless, we can ask: are information and quantropy two aspects of a single unified concept? After more carefully proving that quantum mechanics extremizes quantropy, I will explore this question.

This isn't really a comment on your abstract, just something your reference to Occam's razor brings to mind: in machine learning, one common way of talking about Occam's razor is the [minimum description length principle](http://en.wikipedia.org/wiki/Minimum_description_length), which basically says that the "best" model of some data (within some class of models) is the one that lets you describe the data in the fewest bits, meaning the sum of the bits required to specify the model plus the bits required to specify the data with respect to that model. As a silly example, suppose you've got a set of points in some high-dimensional space. You could specify some number of "basis vectors" and then the coefficients of each point in that basis; specifying both requires some bits, and the trade-off that gives the fewest total bits is the "Occam's razor" model. (There are all sorts of refinements, such as allowing some noise so that you don't have to reproduce the data absolutely exactly, etc.)

I can't immediately see how that connects with Jaynes' interpretation, or indeed whether they're considering Occam with respect to other criteria.

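To make the two-part trade-off concrete, here is a minimal sketch (my own toy illustration, not from the thread; the data and the 8-bit parameter cost are made-up numbers): choosing between a fair-coin model and a biased-coin model for a binary string, where the biased model costs extra bits to specify its parameter but describes the data far more cheaply.

```python
import math

def data_bits(bits, p_one):
    """Bits needed to encode the data under a Bernoulli(p_one) model,
    using ideal Shannon code lengths -log2(p)."""
    return sum(-math.log2(p_one if b else 1 - p_one) for b in bits)

def description_length(bits, p_one, param_bits):
    """Two-part MDL: bits to specify the model plus bits to specify
    the data given the model."""
    return param_bits + data_bits(bits, p_one)

# Toy data: 90 ones and 10 zeros.
data = [1] * 90 + [0] * 10

# Model A: a fair coin, with nothing to transmit about the model.
fair = description_length(data, 0.5, param_bits=0)

# Model B: a biased coin with p = 0.9, charging a made-up 8 bits
# to specify the parameter.
biased = description_length(data, 0.9, param_bits=8)
```

Here the biased model wins: its extra model bits are more than repaid by the cheaper encoding of the data, which is exactly the trade-off described above.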

Hi John,

Can I give some comments? (Even though they probably only show my misunderstanding.)

> In quantum mechanics, a system takes all possible paths with the passage of time, each with an amplitude depending on the action of that path. In statistical mechanics, a system in thermal equilibrium occupies all possible states, each with a probability depending on the energy of that state.

When I read "a system in thermal equilibrium occupies all possible states" I was a little bit confused. Am I correct that you mean something like "within an ensemble all possible states are occupied"?

> While famous, this analogy remains mysterious. I believe it is trying to tell us something about information.

I once read in an economics news article the sentence "the gold price is trying to tell us something" and I had the impression a prophet was starting to speak, so this has biased the way I read your sentence. Since you have twelve words left, what about something like: "I believe this analogy can lead us to a better understanding of information."


David Tweed wrote:

> in machine learning one common way of talking about Occam's razor is in terms of the [Minimum description length principle](http://en.wikipedia.org/wiki/Minimum_description_length) which basically states that the "best" model (for some class of models) data is the model in which you can describe the data in the fewest bits...

Nice!

They're actually quite closely related. There's a theorem saying the "minimum description length" differs by at most some constant from the negative of "algorithmic entropy", which can be computed, like ordinary Shannon entropy, using a formula of the form

$$ S = - \sum_i p_i \ln p_i $$

for certain probabilities $p_i$.

I'd like to give you a nice online reference for this result, but a lot of the easy-to-read intros to "algorithmic entropy" use this term _synonymously_ with minimum description length, which obscures the issue... so the best I can do is [my paper with Mike Stay](http://math.ucr.edu/home/baez/thermo.pdf).

> I can't immediately see how that connects with the Jaynes' interpretation, or indeed if they're considering Occam with respect to other criteria.

Once you see that minimum description length is _equal_ to the negative of a kind of entropy, up to an error bounded by an additive constant, the distinction between Jaynes' desire to maximize entropy and other people's desire to find very short descriptions of models starts to dissolve. One of my goals is to make it dissolve.

(As you can see, there's an annoying minus sign lurking in this subject, which can be confusing at times. This is why one guy's maximum is another gal's minimum.)

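The minus-sign bookkeeping is easy to see in a toy example (my own illustration, not from the paper): for a distribution with power-of-two probabilities, the average length of a matched prefix code equals the Shannon entropy exactly, so "fewest bits" and "entropy" really are the same number, up to sign conventions.

```python
import math

# A distribution over three symbols, and a prefix code matched to it.
p = {"a": 0.5, "b": 0.25, "c": 0.25}
code = {"a": "0", "b": "10", "c": "11"}   # a valid prefix code

# Shannon entropy in bits: S = -sum_i p_i log2(p_i)
S = -sum(q * math.log2(q) for q in p.values())

# Average codeword length under the distribution.
avg_len = sum(p[s] * len(code[s]) for s in p)
```

For dyadic probabilities like these the match is exact; in general the optimal average code length sits within one bit of the entropy.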

Frederik wrote:

> When I read "a system in thermal equilibrium occupies all possible states" I was a little bit confused. Am I correct that you mean something like "within an ensemble all possible states are occupied"?

Yes. It's just a way of saying "anything can happen... with some probability". Personally I don't like to talk about "ensembles", since then we're invoking a frequentist definition of probability, but I would happily do so here if I thought it would increase the number of people who'd understand the sentence. I'm not sure it would!

> Since you have twelve words left, what about something like: "I believe this analogy can lead us to a better understanding of information."

Okay - I sort of like the ominous prophetic overtones, but it's better to be clear.


I'm writing the full proposal for the FQXi grant now, and you can see a draft [here](http://math.ucr.edu/home/baez/fqxi_narrative_2013.pdf). You might like it if you're curious about minimum and maximum principles in physics, my attempt to boil them all down to Occam's razor, and the idea of quantropy. Here's how it starts:

> There is an extensive analogy between statistical mechanics and quantum mechanics. In statistical mechanics, a system in thermal equilibrium occupies all possible states, each with a probability depending on the energy of that state. In quantum mechanics, a system takes all possible paths with the passage of time, each with an amplitude depending on the action of that path.
>
> This analogy is famous and widely used in physics. But what does it really mean? Is it just a mathematical 'trick', or something deeper? I believe it is trying to tell us something about information.
>
> Occam's razor says that the best model of a system is the simplest one that fits the data we have. As noted by Jaynes and Solomonoff, the concept of 'simplicity' can be made quantitative using information theory. In these terms, Occam's razor says the best model specifies the least amount of information needed to fit the data. But since entropy is another name for unspecified information, we can say this another way: our model should maximize entropy subject to the constraints given by the data. This is the **principle of maximum entropy**.
>
> Statistical mechanics is governed by the principle of maximum entropy. A system in thermal equilibrium occupies states with precisely the probabilities that maximize entropy subject to the constraints given by what we know about the system.
>
> In preliminary calculations, I have found that the analogy between statistical mechanics and quantum mechanics extends to include a concept analogous to entropy, which I call 'quantropy'. In quantum mechanics, the amplitudes for a system to take different paths are precisely those that extremize quantropy.
>
> Quantropy is not the same as the entropy of a quantum system; it is something new. For example, it is a *complex* number instead of a *real* number. It governs *dynamical* rather than *static* systems. Nonetheless, we can ask: does the tight relation between information and entropy extend to include quantropy? Can we use quantropy to understand dynamics in quantum mechanics using a new generalization of Occam's razor? To answer these questions, the first step is to carefully work out the role quantropy plays in quantum mechanics.
>
> In what follows, I start by reviewing some variational principles in physics, and noting that the principles governing statics all follow from the principle of maximum entropy. Then I argue that the principles governing dynamics follow from the principle of stationary quantropy. Finally, I sketch a research program, to be funded by this grant, that could study this principle and explore its implications.
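The principle of maximum entropy is easy to check numerically in a toy case (my own illustration, with made-up energy levels, not part of the proposal): among all probability distributions on three states with a given mean energy, the Boltzmann distribution $p_i \propto e^{-\beta E_i}$ has the largest entropy.

```python
import math

E = [0.0, 1.0, 2.0]   # energy levels of a toy three-state system
beta = 1.0            # inverse temperature

# Boltzmann distribution p_i = exp(-beta*E_i)/Z, and the mean energy it fixes.
Z = sum(math.exp(-beta * e) for e in E)
gibbs = [math.exp(-beta * e) / Z for e in E]
U = sum(p * e for p, e in zip(gibbs, E))

def entropy(p):
    return -sum(q * math.log(q) for q in p if q > 0)

S_gibbs = entropy(gibbs)

# Scan over other distributions with the same mean energy.  With three
# states, the constraints sum(p) = 1 and sum(p*E) = U leave one free
# parameter, taken here to be t = p_2.
best_other = max(
    entropy([1 - U + t, U - 2 * t, t])
    for t in (k * 0.001 for k in range(1, 200))
    if U - 2 * t > 0
)
```

Every distribution on the grid with the same mean energy has strictly lower entropy than the Boltzmann one, which is the content of the maximum entropy principle in this tiny setting.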

I'm also supposed to write a 200-word summary of my project "suitable for a layperson". This one is too long but it may be clearer than the shorter one I'll have to write:

> Occam's razor says we should seek the simplest description of a situation that fits the data we have. While almost everyone likes this idea, it was hard to make it precise until the invention of information theory by Claude Shannon in 1948. Now we can say the simplest description is the one that takes the fewest bits of information.
>
> It wasn't until the late 1950's, with the work of Edwin Thompson Jaynes, that we realized that Occam's razor is built into thermodynamics: the study of heat, which underlies much of physics, chemistry and even biology. A system in thermal equilibrium makes its entropy, or disorder, as large as possible. But Jaynes realized that since entropy can be seen as "missing information", we can also say that the best description of a system in equilibrium is the one that takes the least information!
>
> This is fine if all we care about is equilibrium. But what about systems that are changing in interesting ways with time? Here an analogy between thermodynamics and quantum mechanics can help us out. Many of the formulas in these two subjects are analogous, but thermodynamics involves ordinary real numbers, while quantum mechanics uses "complex numbers": numbers involving the square root of -1.
>
> Using this analogy, we can define a new quantity, similar to entropy but different, which is - roughly speaking - maximized by quantum-mechanical systems as they change with time. Since it needs a name, we might as well call it "quantropy" until someone thinks of a better word. But the goal of this project is to better understand it, and learn to use it.
>
> While ordinary information is measured by a real number, quantropy is complex! What could it mean to have the square root of -1 bits of information? To figure this out will require new ideas and also plenty of work.
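For the curious, the "complex entropy" can be made concrete in a toy calculation (the action values here are made up, and this is only a sketch of the idea, not the actual research): form amplitudes $a_x \propto e^{iS_x/\hbar}$ over a finite set of "paths", normalize them so they sum to 1, and apply the entropy formula to them. The result is a genuinely complex number.

```python
import cmath

hbar = 1.0                      # units where hbar = 1
actions = [0.0, 0.7, 1.3, 2.1]  # made-up action values for four "paths"

lam = 1 / (1j * hbar)           # plays the role of beta = 1/kT in the analogy
Z = sum(cmath.exp(-lam * S) for S in actions)       # complex "partition function"
amps = [cmath.exp(-lam * S) / Z for S in actions]   # amplitudes, summing to 1

# Quantropy: the entropy formula applied to amplitudes instead of
# probabilities.  (Choosing a branch of the complex logarithm is a genuine
# subtlety; this toy just uses the principal branch.)
Q = -sum(a * cmath.log(a) for a in amps)
```

The amplitudes sum to 1 like probabilities, but they and the resulting quantropy are complex, which is exactly the puzzle the summary describes.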

Hi John,

Is it really common to explain complex numbers to laypersons?

> uses "complex numbers": numbers involving the square root of -1.

I think you should trust that laypersons interested in your proposal also know complex numbers.

> While ordinary information is measured by a real number, quantropy is complex! What could it mean to have the square root of -1 bits of information? To figure this out will require new ideas and also plenty of work.

Similarly, I'd suggest dropping the square root of -1 here as well, e.g.:

> While ordinary information is measured by a real number, quantropic information (quantropy) is measured by complex numbers! What could this mean? To figure this out will require new ideas and also plenty of work.

Just referring in general to Frederik's comments: it's unclear if the summary is for dissemination or for "laypersons on the funding panel". If it's dissemination, you might expect the reader to have a background, since they've decided to read it. If it's actually someone on the committee, then it's quite possible they might not have a background that includes complex numbers.


Frederik wrote:

> is it really common to explain complex numbers for laypersons?

Most people in the US don't know what complex numbers are, I think: they're not typically explained in high school, and while most universities require that every student take _some_ mathematics, this may not include complex numbers. I will be happy if it turns out all educated citizens know about complex numbers in Germany.

People on the panel who decide whether or not to fund this proposal will know about complex numbers and much more, since I've been on such a panel and it was mainly composed of mathematicians and physicists. This summary for layfolk is probably intended for publicity in case my proposal gets accepted.

Anyway, to cut my summary down to 200 words I needed to remove the stuff about complex numbers, which is not really the main idea anyway. I was trying to explain too many different things in 200 words! I wound up using this:

> Occam's razor says we should always seek the simplest description of a situation that fits the data. This idea was not very precise until Shannon invented information theory. Now we can say the simplest description is the one that takes the fewest bits of information.
>
> It wasn't until later that we realized that Occam's razor is built into thermodynamics, the study of heat. A system in thermal equilibrium makes its entropy as large as possible. We now know why: entropy is simply "missing information". So, the best description of a system in equilibrium is the one that takes the least information!
>
> But what about systems that are changing with time? Here an analogy between thermodynamics and quantum mechanics can help. Many of the formulas in these two subjects are similar. Using this fact we can define a new quantity, similar to entropy but different, that is maximized by quantum systems as they change with time. Since this new quantity needs a name, we can call it "quantropy" until someone thinks of a better word. But the goal of this project is to better understand quantropy. It may give Occam's razor, and information theory, a whole new significance.

My NSF proposal [Information geometry and Markov processes](http://math.ucr.edu/home/baez/baez_information_geometry_description.pdf) was rejected. Here are the three referees' reports.

> **Report 1**
>
> Rating: Good
>
> The PI proposes to use ideas and methods from information geometry and Markov process theory to study evolutionary processes in biology with a particular focus on understanding the relationship between increased fitness through natural selection and increased entropy. Rectifying the behaviors of various biological and physical populations described by standard evolution equations versus limiting rate equations certainly seems a compelling problem. However, it would help to have a more detailed proposal on how the PI intends to do this, perhaps with some stated intermediate goals.
>
> The PI lists 6 publications resulting from recent support (spanning 2007-2012), and, perhaps more notably, four students completing their PhDs and successfully moving on to research positions where they are independently productive.
>
> **Summary:** The questions posed are interesting and certainly worth studying. Although the proposal is fairly short on detail, I expect the PI will carry out the research successfully and do well training the chosen graduate students.

So, this suggests I should spell out exactly what I'm gonna do. I could have done that.

> **Report 2**
>
> Rating: Poor
>
> The PI proposes to investigate how the quantities shown to increasing under rate equations (e.g. in chemical reactions) is related to those under the master equation (such as entropy). The proposal is poorly prepared, with no precise mathematical problem formulated. Even the meaning of rate equation and master equation for Markov processes is not clearly explained.

The first sentence here is illiterate and makes no sense. Anyway, this suggests I should:

1. explain everything in detail for referees who don't understand the material, and
2. state precise theorems that I intend to prove (which I can only do after I've proved them, but that's okay, since grants should always be written for projects that you've already mostly finished).

> **Report 3**
>
> Rating: Fair
>
> It is not explained in sufficiently mathematical terms what theorems or techniques from information geometry the PI proposes to apply. It is also not explained mathematically what the rate and master equations say. It is difficult to say whether the proposed work would advance knowledge and understanding in mathematics.
>
> The proposed work did not mention possible concrete results - beyond saying that the relationship between quantities that increase for the rate equation and those that increase for the master equation would be investigated. It was not precisely explained what examples of such quantities are or why it is conjectured that there should be a relationship between them, what mathematical form such a relationship might take, or a clear plan for finding any relationships.
>
> The ideas sound nice but it is difficult to judge the mathematical merit of the proposal without more details.
>
> **Summary:** This proposal idea sounds interesting and tries to draw a connection between two equations describing the evolution of particle systems. However the proposal itself needs a lot more mathematical justification and concrete statements of potential theorems or conjectures.
>
> I rate this proposal as Fair and lacking in certain critical aspects.

Again, this suggests I should:

1. explain everything in detail, and
2. state precise theorems I plan to prove.

I had foolishly assumed people reviewing a proposal on Markov processes and information geometry would be familiar with the concepts I was discussing and would want a high-level description of what I planned to do, instead of grungy details. But I guess many people writing grant proposals are bullshitting to some extent, the referees want to eliminate proposals that don't seem bound to succeed, and they may not be experts on the concepts involved, so they want to see proof that I really know what I'm talking about and can do what I say I'm going to do.

The other NSF proposal, on [Quantum Techniques for Stochastic Physics](http://math.ucr.edu/home/baez/baez_quantum_techniques_description.pdf), is still pending. In this case I referred to a book I've written on the topic, which is available online, so perhaps that will help convince them I know what I'm doing. We'll see.

Anyway, when the next round of NSF proposal deadlines rolls around I can produce a proposal or two that's vastly more detailed.

> Anyway, when the next round of NSF proposal deadlines rolls around I can produce a proposal or two that's vastly more detailed.

Then your reviews will say that your proposal is technical and lacking in big ideas and conceptual clarity.

Implicit in Nathan's comment is one of the annoyances of the system: there'll probably be slightly different people reviewing your next proposal (even if the same big group of people are on the boards as a whole) who by Murphy's law will like/dislike different things....

The only concrete advice I have is that you need to carefully walk the line between giving details about the topic to respond to criticism like this, and making it seem like you already know the form of most of the answers (since in that case you probably don't justify funding).


Okay, thanks for the extra help Nathan and David! I can provide "big ideas and conceptual clarity" at the front and details near the back. Then the trick will mainly be making it sound like I know exactly what I'm going to prove but just haven't proved it yet.


My FQXi grant proposal (see [above](http://forum.azimuthproject.org/discussion/1075/grant-proposals/?Focus=9115#Comment_9115)) got turned down. Since the PDF file they sent me is not cut-and-pasteable, I'll just quote a bit:

> **Strengths**
>
> This is an interesting and original proposal, fitting very well into the topic of Physics of Information. The investigator proposes to use Occam's razor and information theory to understand the foundations of quantum mechanics. This idea is very interesting and original, and similar reasoning turned out to be very successful in improving our understanding of quantum theory in recent years. In addition, the proposal connects two previously unrelated topics, namely the path integral and entropy.
>
> The proposal presents a very clear and convincing work plan. The investigator has made very important and fascinating contributions to foundational physics in the past.
>
> **Weaknesses**
>
> The proposal is built on a formal analogy: that the Gibbs state and quantum time evolution can both be obtained by variation of some functional. However, a simple formal analogy does not in itself imply any kind of deep conceptual relation. Most physical quantities can somehow be expressed as being stationary with respect to some functional. Thus, it seems that there is a major risk that this research will not lead to substantially new conceptual insights. While high risk is fine, it would have been helpful to have a somewhat broader proposal, where different possible approaches could back up the possible failure of one of them.
>
> The formal analogy between quantum mechanics and statistical mechanics has been known and explored for a long time. Thus, it would have been helpful to have a somewhat more concrete explanation of why we should expect to obtain new insights into this old problem by studying a specific formal quantity.
>
> **Summary Evaluation**
>
> This is definitely an interesting and original proposal, which fits very well into the topic of Physics and Information. However, the panel felt it unlikely that studying the particular mathematical functional of quantropy can really give new results or conceptual insights into a problem that has been known and studied for a very long time.

I think they're being silly for treating "formal" and "mathematical" as derogatory terms, and for saying the proposal is "original" and connects "two previously unrelated topics", yet tackles an "old problem" that has been "known and studied for a very long time". I'm somewhat motivated to work harder on this problem just to prove they're idiots, though I guess that's not a very good reason to work on something.