
in Chat

## Comments

Hi, and welcome from another Spaniard also interested in probabilistic and neural applications. Reference request: what could one read to get a feel for the McCullagh work you mention?

Hi. I was talking about this paper: https://projecteuclid.org/euclid.aos/1035844977

Hi, Matias! It would be very interesting to apply more category theory and "network theory" in machine learning, the design of artificial neural networks, and also the study of biological neural networks.

Yes. Indeed, I found the paper "Backprop as Functor: A compositional perspective on supervised learning" by Fong, Spivak and Tuyéras very exciting (and surprisingly not too hard to follow). It seems a very appealing framework for connecting theoretical ideas to actual implementation in a strongly typed FP language like Haskell. I'm not an expert in DNNs, but I think such a compositional and well-founded description would be a very useful tool for building algorithms that explore the space of neural architectures (I am aware there is some recent research in this direction).
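To make the compositional idea concrete: in Fong, Spivak and Tuyéras's framework, a learner from A to B is a parameter space P together with an implementation map I : P × A → B, an update map U : P × A × B → P, and a request map r : P × A × B → A, and learners compose sequentially. Below is a minimal, hedged Python sketch of that composition rule (the names `Learner`, `compose` and `linear` are my own illustrative choices, not from the paper; the gradient-step formulas in `linear` are just one simple instantiation):

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Learner:
    # A learner A -> B in the sense of "Backprop as Functor":
    # a parameter, plus implement/update/request maps.
    param: Any
    implement: Callable[[Any, Any], Any]    # I : P x A -> B
    update: Callable[[Any, Any, Any], Any]  # U : P x A x B -> P
    request: Callable[[Any, Any, Any], Any] # r : P x A x B -> A

def compose(g: Learner, f: Learner) -> Learner:
    """Sequential composite g . f : A -> C, for f : A -> B and g : B -> C.

    The composite's parameter space is the product of the two parameter
    spaces; the request of g feeds the "desired output" back to f.
    """
    def implement(pq, a):
        p, q = pq
        return g.implement(q, f.implement(p, a))
    def update(pq, a, c):
        p, q = pq
        b = f.implement(p, a)
        return (f.update(p, a, g.request(q, b, c)),
                g.update(q, b, c))
    def request(pq, a, c):
        p, q = pq
        b = f.implement(p, a)
        return f.request(p, a, g.request(q, b, c))
    return Learner((f.param, g.param), implement, update, request)

def linear(w: float, eps: float = 0.1) -> Learner:
    """Toy 1-D linear learner a -> w*a with a gradient step on squared error
    (an illustrative instantiation, not the paper's general construction)."""
    return Learner(
        w,
        lambda p, a: p * a,
        lambda p, a, b: p + eps * (b - p * a) * a,  # nudge the weight
        lambda p, a, b: a + eps * (b - p * a) * p,  # nudge the input upstream
    )
```

For example, composing two linear learners with weights 2 and 3 gives a learner whose implementation at input 1.0 is 6.0, and one update step moves both weights toward a target output.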

[Jesus](https://forum.azimuthproject.org/discussion/comment/15968/#Comment_15968), I found this very recent [blog post](https://www.johndcook.com/blog/2018/04/14/categorical-data-analysis/) commenting on McCullagh's paper [What is a statistical model?](https://projecteuclid.org/euclid.aos/1035844977) and applied category theory (not really much insight, though).

Thank you Matías!