
Introduction: Matias Bossa

in Chat

I'm a biomedical engineering researcher at the University of Zaragoza, where I use probabilistic models as well as other machine learning techniques to analyze medical data (images, biomarkers, etc.). I discovered category theory a year ago through Awodey's book, which I really enjoyed. I would like to use CT characterizations of probabilistic models (as in McCullagh's work) as well as neural networks (as in Fong and Spivak's work) to implement machine learning libraries in Haskell, as a way of learning both CT and functional programming.

1.

Hi, and welcome from another Spaniard also interested in probabilistic and neural applications. Reference request: what could one read to get a feel for the McCullagh work you mention?

2.

Hi. I was talking about this paper: https://projecteuclid.org/euclid.aos/1035844977
3.

Hi, Matias! It would be very interesting to apply more category theory and "network theory" to machine learning, the design of artificial neural networks, and also the study of biological neural networks.

4.

Yes, indeed. I found the paper "Backprop as Functor: A compositional perspective on supervised learning" by Fong, Spivak and Tuyéras very exciting (and surprisingly not too hard to follow). It seems a very appealing framework for connecting theoretical ideas to an actual implementation in a strongly typed FP language like Haskell. I'm not an expert in DNNs, but I think such a compositional, well-founded description would be a very useful tool for building algorithms that explore the space of neural architectures (I am aware there is some recent research in this direction).

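For anyone curious, the central construction in that paper can be sketched in a few lines of Haskell. This is a rough sketch, not the paper's own code, and the field names (`impl`, `update`, `request`) are my own labels for the paper's implementation, update and request maps:

```haskell
-- A learner from a to b with parameter space p, following the shape of
-- the "Learner" in Fong, Spivak and Tuyéras (names are illustrative).
data Learner p a b = Learner
  { impl    :: p -> a -> b       -- forward pass
  , update  :: p -> a -> b -> p  -- new parameters from a training pair
  , request :: p -> a -> b -> a  -- corrected input passed backwards
  }

-- Learners compose like functions; the second learner's request map
-- supplies the training signal for the first. This compositionality
-- is what makes the framework feel so natural in Haskell.
compose :: Learner q b c -> Learner p a b -> Learner (p, q) a c
compose g f = Learner
  { impl    = \(p, q) a -> impl g q (impl f p a)
  , update  = \(p, q) a c ->
      let b  = impl f p a
          b' = request g q b c
      in (update f p a b', update g q b c)
  , request = \(p, q) a c ->
      let b  = impl f p a
      in request f p a (request g q b c)
  }
```

A whole network is then just a `compose` chain of layers, with the combined parameter space built up as nested pairs.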
5.

[Jesus](https://forum.azimuthproject.org/discussion/comment/15968/#Comment_15968), I found this very recent [blog post](https://www.johndcook.com/blog/2018/04/14/categorical-data-analysis/) commenting on McCullagh's paper [What is a statistical model?](https://projecteuclid.org/euclid.aos/1035844977) and applied category theory (though without much new insight).

6.

Thank you Matías!
