
# Introduction: Balachandran Chandrasekharan

Hi,

I write software for my experience design company (https://www.facebook.com/oleindia/). I first heard of Category Theory through the book *The Topos of Music*, about ten years ago. More recently, I was exposed to it when I started learning a bit of Haskell. I found Bartosz's blog quite interesting.

I would like to learn enough Category Theory to understand and work on Conal Elliott's ConCat ([Compiling to Categories](http://conal.net/papers/compiling-to-categories/compiling-to-categories.pdf)). Any tool or perspective that would help in making robust systems would be a win.

Bala.

1.
edited March 2018

Nice paper on compiling Haskell to FPGAs. I like the use of the Lambek-Scott translation as a principled approach.

2.

> Nice paper on compiling Haskell to FPGAs. I like the use of the Lambek-Scott translation as a principled approach.

Trying to understand that. Aren't they compiling to an HDL such as VHDL or Verilog, and that is what is being transformed into an FPGA (for example)?

Interesting that a language such as VHDL has multiple **categories of use** implicit in its design. It's able to take a model of a "circuit" and create a behavioral description of the circuit **or** a structural description of the circuit, as well as being able to accommodate generic monadic bindings in the type system. That's essentially the reason VHDL has been able to build up a huge library of reusable components, and why the semiconductor industry has been able to simulate, test, and synthesize/fabricate all the complex logic on a chip.

Perhaps VHDL is an early example of a model of modeling. The engineers knew they needed a model representation that could serve multiple applications, such as behavioral analysis, testing, and synthesis. That's why there is such complexity to the VHDL syntax and semantics: so much needs to be set up in declaring the entities and architectures for further instantiation.

I don't know, but is it possible that Haskell is simply doing a rewrite of some description directly to VHDL or Verilog?

3.

Unfortunately, without reading the paper I can offer little help; I have saved it for reference. I just know, or knew, part of the story, the translation to CCCs, having read Lambek and Scott's book. I'm more of the imperative-language camp, but I think simply typed lambda calculus is very near to Haskell (I did some Lisp in college to have a bridge concept). In the translation to CCCs an important part is that of higher-order functions (functions accepting or returning functions, as in the currying trick). The existence-of-exponentials condition in CCCs allows one to have objects with suitable types to host them. But I'm lost in the passage from CCCs to VHDL, which, as you know, is what gives one the FPGA/ASIC realization.
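To make the CCC vocabulary concrete, here is a minimal Haskell sketch (class and method names are my own simplification, not the paper's exact API; in particular I use plain `(->)` as the exponential object). Ordinary functions form one instance, and a point-free term such as `swapC` then denotes a morphism in *any* Cartesian category:

```haskell
import Prelude hiding (id, (.))

-- A minimal slice of the CCC vocabulary (after Lambek-Scott / Elliott);
-- a sketch only, not ConCat's actual class hierarchy.
class Category k where
  id  :: k a a
  (.) :: k b c -> k a b -> k a c

class Category k => Cartesian k where
  exl   :: k (a, b) a                    -- first projection
  exr   :: k (a, b) b                    -- second projection
  (&&&) :: k a b -> k a c -> k a (b, c)  -- pairing

class Cartesian k => Closed k where
  apply  :: k (a -> b, a) b              -- exponentials: evaluation morphism
  curryC :: k (a, b) c -> k a (b -> c)

-- Plain functions form one such category; hardware, AD, etc. give others.
instance Category (->) where
  id x = x
  (g . f) x = g (f x)

instance Cartesian (->) where
  exl = fst
  exr = snd
  (f &&& g) x = (f x, g x)

instance Closed (->) where
  apply (f, x) = f x
  curryC f a b = f (a, b)

-- \p -> (snd p, fst p), written point-free in the CCC vocabulary.
swapC :: Cartesian k => k (a, b) (b, a)
swapC = exr &&& exl
```

The point of the exponentials condition is visible in `curryC`/`apply`: higher-order functions get their own objects, so lambda terms can be eliminated in favor of these combinators.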

Independently, apart from hardware synthesis, I read that the same framework can give a translation to, let's call it, an automatic-differentiation backend, and I cannot refrain from speculating about neural-net applications (not that I have a concrete idea here, though).

4.

> I read that the same framework can give a translation to, let's call it, an automatic-differentiation backend, and I cannot refrain from speculating about neural-net applications (not that I have a concrete idea here, though).

Not sure if it resolves your speculation, but Conal Elliott also has a recent talk and paper on automatic differentiation (and backpropagation as an instance of it): [The simple essence of automatic differentiation](http://conal.net/papers/essence-of-ad/)
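For a taste of how little machinery forward-mode AD needs, here is a dual-number toy in plain Haskell (my own sketch, not the paper's category-structured construction): each value carries its tangent, and the `Num` instance propagates the chain rule.

```haskell
-- Forward-mode AD via dual numbers: a value paired with its derivative.
data Dual = Dual Double Double deriving (Eq, Show)

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)  -- product rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  fromInteger n         = Dual (fromInteger n) 0          -- constants: zero tangent
  abs (Dual x dx)       = Dual (abs x) (dx * signum x)
  signum (Dual x _)     = Dual (signum x) 0

-- Derivative of f at x: seed the tangent with 1 and read it back.
deriv :: (Dual -> Dual) -> Double -> Double
deriv f x = let Dual _ dx = f (Dual x 1) in dx
```

For example, `deriv (\x -> x*x + 3*x) 2` gives `7.0`. Backpropagation corresponds to the reverse-mode variant, which the paper obtains from the same categorical interface.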

5.

I would guess the reason many computer-science types are interested in category theory is the hope of help with programming-in-the-large, automation, and program transformation. Everyone who does software is looking for a tool to help them or their team get the job done more quickly -- thus the transition from macros, to OOP, to patterns, to generics/templates, and to aspect-oriented programming (which is in some sense category-related).

The connection to VHDL and Verilog is apparent in that regard because the designs keep getting bigger and bigger over time with more and more integrated functionality on a chip.

6.

Hi Eran, what a fantastic pointer, thanks a lot! I love it when a glimpse such as this turns out to have been studied earlier in detail; it's an excellent motivation to dive in.

7.

Paul. Yep, the relevant "hardware" packages nearly all compile to VHDL or Verilog (<http://hackage.haskell.org/packages/search?terms=hardware>), the exceptions being "itl", an iterated temporal logic checker, and Tom Hawkins' Atom DSL for embedded systems (like bus brakes and urban cycle network control): <https://leepike.wordpress.com/2009/05/05/an-atomic-fibonacci-server-exploring-the-atom-haskell-dsl/>.

8.

Jim, agree that anything with data flow that one may want to simulate, such as Petri nets, can be transformed to VHDL or Verilog.

9.
edited March 2018

Paul, I'm interested in direct Haskell-to-LLVM for engineering apps, as with Atom. Haskell has hmatrix as its interface to BLAS/LAPACK, but progress on producing any pure-Haskell version might be slow.

10.

Jim, I kind of understand what they are trying to do with Atom. My experience is with protected types in Ada instead of Atom, where the mutexes are built into the Ada type system.

11.

The way I understand Conal's paper, we can compile one program (expressed in Haskell code) to multiple categories. For example, I could write a mathematical expression in Haskell, compile it to a category that interprets it as human-readable documentation, another that synthesizes hardware, another suitable for simulation on a CPU, another that performs interval analysis, and so on.
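The paper's machinery reinterprets actual Haskell lambdas through a GHC plugin, but the flavor of "one term, many interpretations" can be tasted without the plugin in tagless-final style (a toy of my own, not ConCat's API):

```haskell
-- One expression signature, many interpretations.
class Expr repr where
  lit :: Double -> repr
  add :: repr -> repr -> repr
  mul :: repr -> repr -> repr

-- Interpretation 1: evaluate (the "simulate on a CPU" reading).
newtype Eval = Eval Double deriving (Eq, Show)
instance Expr Eval where
  lit = Eval
  add (Eval a) (Eval b) = Eval (a + b)
  mul (Eval a) (Eval b) = Eval (a * b)

-- Interpretation 2: render human-readable documentation.
newtype Doc = Doc String deriving (Eq, Show)
instance Expr Doc where
  lit x = Doc (show x)
  add (Doc a) (Doc b) = Doc ("(" ++ a ++ " + " ++ b ++ ")")
  mul (Doc a) (Doc b) = Doc ("(" ++ a ++ " * " ++ b ++ ")")

-- One term, written once; the interpretation is chosen by the type.
term :: Expr repr => repr
term = mul (add (lit 1) (lit 2)) (lit 4)
```

Here `term :: Eval` evaluates to `Eval 12.0`, while `term :: Doc` renders `((1.0 + 2.0) * 4.0)`; a hardware or interval-analysis interpretation would just be another instance.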
