Introduction: Matias Bossa

I'm a biomedical engineering researcher at the University of Zaragoza, where I use probabilistic models, as well as other machine learning techniques, to analyze medical data (images, biomarkers, etc.). I discovered Category Theory a year ago through Awodey's book, which I really enjoyed. I would like to use the CT characterization of probabilistic models (as in McCullagh's work) and of neural networks (as in the work of Fong and Spivak) to implement machine learning libraries in Haskell, as a way of learning both CT and functional programming.

Comments

  • 1.

    Hi and welcome from another Spaniard also interested in probabilistic and neural applications. Reference request: what could one read to get a feel for the McCullagh work you mention?

  • 2.

    Hi. I was talking about this paper: "What is a statistical model?", https://projecteuclid.org/euclid.aos/1035844977

  • 3.

    Hi, Matias! It would be very interesting to apply more category theory and "network theory" in machine learning, the design of artificial neural networks, and also the study of biological neural networks.

  • 4.

    Yes. Indeed, I found the paper "Backprop as Functor: A compositional perspective on supervised learning" by Fong, Spivak and Tuyéras very exciting (and surprisingly not too hard to follow). It seems a very appealing framework for connecting theoretical ideas to an actual implementation in a strongly typed FP language like Haskell; see the sketch below. I'm not an expert in DNNs, but I think such a compositional and well-founded description would be a very useful tool for building algorithms that explore the space of neural architectures (I am aware there is some recent research in this direction).

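    To make the connection to Haskell concrete, here is a minimal sketch of the paper's central notion, a "learner", and of how learners compose. The names (`Learner`, `implement`, `update`, `request`, `compose`) are my own rendering of the paper's definition, not taken from any existing library:

    ```haskell
    -- A learner from a to b, following Fong, Spivak and Tuyéras:
    -- a parameter space p together with an implementation (the
    -- forward pass), an update rule, and a "request" function that
    -- passes a corrected input back upstream.
    data Learner p a b = Learner
      { implement :: p -> a -> b        -- forward pass
      , update    :: p -> a -> b -> p   -- improve parameters from a training pair
      , request   :: p -> a -> b -> a   -- input that would better have produced b
      }

    -- Composition of learners: forward passes compose like ordinary
    -- functions, while update and request thread the downstream
    -- learner's request back into the upstream learner. This nesting
    -- is exactly the shape of backpropagation.
    compose :: Learner q b c -> Learner p a b -> Learner (p, q) a c
    compose g f = Learner
      { implement = \(p, q) a -> implement g q (implement f p a)
      , update    = \(p, q) a c ->
          let b = implement f p a
          in  ( update f p a (request g q b c)
              , update g q b c )
      , request   = \(p, q) a c ->
          let b = implement f p a
          in  request f p a (request g q b c)
      }
    ```

    Note that composing two learners pairs their parameter spaces; that is the structure the paper exploits to present supervised learning as a functor into a category of learners.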
  • 5.

    [Jesus](https://forum.azimuthproject.org/discussion/comment/15968/#Comment_15968), I found this very recent [blog post](https://www.johndcook.com/blog/2018/04/14/categorical-data-analysis/) commenting on McCullagh's paper [What is a statistical model?](https://projecteuclid.org/euclid.aos/1035844977) and applied category theory (not much insight there, really).

  • 6.

    Thank you Matías!
