
Hi all:

I am Pradeep. I work on information theory and statistics. I want to understand and apply resource theories to information-processing tasks that go beyond typical communication scenarios (in the sense of Shannon). I am also interested in the interaction between information theory and order theory, and I am really looking forward to seeing how a categorical approach can throw new light on this.

1.

Pradeep, you might be interested in the logical information theory that underlies the Shannon theory, the latter being a requantification of the logical theory for the purposes of coding and communications theory. See the paper linked in my Introduction.

2.

Thanks, David, for the reference. I am interested in the measure-theoretic foundations of information theory. I will post my comments once I read it.

By the way, would you like to comment on this work, which tries to define a notion of "shared information" based on set intersections: https://arxiv.org/abs/1004.2515?

3.
edited April 13

Welcome to the Azimuth Forum, Pradeep! I too am very interested in the intersection of category theory and information theory. While we're inflicting references on each other, I'll mention this one, which characterizes Shannon information:

* John Baez, Tobias Fritz and Tom Leinster, [A characterization of entropy in terms of information loss](http://arxiv.org/abs/1106.1791), [*Entropy*](http://www.mdpi.com/1099-4300/13/11/1945/) **13** (2011), 1945-1957.

and this one, which characterizes relative information:

* John Baez and Tobias Fritz, [A Bayesian characterization of relative entropy](http://arxiv.org/abs/1402.3067), [*Theory and Applications of Categories*](http://www.tac.mta.ca/tac/volumes/29/16/29-16abs.html) **29** (2014), 421-456.

I've also been thinking about information theory in biology:

* John Baez and Blake S. Pollard, [Relative entropy in biological systems](http://arxiv.org/abs/1512.02742), [*Entropy*](http://www.mdpi.com/1099-4300/18/2/46) **18** (2016), 46.
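The idea in the first paper is that every measure-preserving map between finite probability spaces loses some information, and Shannon entropy is pinned down by a few axioms (functoriality, convex linearity, continuity) on that loss. A minimal numerical sketch of the loss itself, with made-up function names and an illustrative distribution that are mine, not the paper's:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i, with the convention 0 ln 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

def pushforward(p, f, n_out):
    """Push a distribution p on {0,...,len(p)-1} forward along the map f: i -> f(i)."""
    q = [0.0] * n_out
    for i, x in enumerate(p):
        q[f(i)] += x
    return q

def information_loss(p, f, n_out):
    """Information lost by the measure-preserving map f: H(p) - H(f_* p)."""
    return entropy(p) - entropy(pushforward(p, f, n_out))

# Gluing two outcomes of a fair 4-outcome distribution together loses (1/2) ln 2.
p = [0.25, 0.25, 0.25, 0.25]
f = lambda i: min(i, 2)          # merge outcomes 2 and 3
loss = information_loss(p, f, 3)
```

The theorem says any continuous, convex-linear, functorial assignment of numbers to such maps is a constant multiple of this `H(p) - H(q)`.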
4.

Hi John,

Thanks for the references. I must admit I have been aware of these references for quite some time now, mostly because I try to follow Tobias Fritz's work; his office is in the adjacent section of our institute :-)

One of my principal objectives in joining this course is to better understand the works you mentioned. I work on channel preorders in information theory and want to see how categorical insights can help.
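A standard example of a channel preorder is the degradation order: a channel W is degraded with respect to V when W can be obtained by post-processing V's output through some third channel. A minimal sketch with row-stochastic matrices (the particular channels are illustrative, not taken from the thread):

```python
# A discrete memoryless channel is a row-stochastic matrix:
# rows index inputs, columns index outputs.
V = [[0.9, 0.1],
     [0.1, 0.9]]   # binary symmetric channel, crossover 0.1

D = [[0.8, 0.2],
     [0.2, 0.8]]   # post-processing channel, crossover 0.2

def cascade(V, D):
    """Compose channels: feed V's output into D, so W[x][z] = sum_y V[x][y] * D[y][z]."""
    return [[sum(V[x][y] * D[y][z] for y in range(len(D)))
             for z in range(len(D[0]))]
            for x in range(len(V))]

# W = D after V is degraded with respect to V, so V sits above W
# in the degradation preorder.
W = cascade(V, D)   # another BSC, crossover 0.9*0.2 + 0.1*0.8 = 0.26
```

Channel composition like this is exactly the kind of structure a categorical treatment organizes: channels as morphisms, with degradation as a preorder on morphisms sharing an input.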

5.
edited April 15

Pradeep - say hi to Tobias! I talk to him quite often by email.

I believe categorical insights can help us better understand information theory, but it's very early days for this research so there are still a lot of simple fundamental insights left to be discovered - like the theorem that Tobias, Tom and I proved, in which the concept of entropy "pops out" of simple ideas combining probability and category theory. So, it's not as if there's a vast body of tools waiting to be applied; we're just starting to build the tools.

6.

Hi John,

Indeed, these are exciting times for working at the interface of category theory and information theory. Tobias said he is going to join the class soon.
