From comment [#28](https://forum.azimuthproject.org/discussion/comment/20832/#Comment_20832):
>So after working that out, I think the inverse functor is just the dual functor for a linear map.

>If we have two different vector spaces \$$x \text{ and } y \text{ where } Ax =y\$$ Since A is invertible, \$$A^{-1}A = I\$$.

Since we are working with finite-dimensional vector spaces, we can assume they have bases. I will assume \$$x\$$ is an element of an \$$N\$$-dimensional vector space \$$V\$$, and \$$y\$$ is an element of an \$$M\$$-dimensional vector space \$$W\$$. Let \$$x=\sum\_{n=1}^N a^n e\_n\$$ and \$$y=\sum\_{m=1}^M b^m\tilde e\_m\$$, where \$$\\{e\_n\\}\_{n=1}^N\$$ is a basis for \$$V\$$ and \$$\\{\tilde e\_m\\}\_{m=1}^M\$$ is a basis for \$$W\$$.
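As a tiny concrete sketch of this setup (assuming \$$N=2\$$, the standard basis for \$$V\$$, and made-up coordinates \$$a^n\$$):

```python
import numpy as np

# Hypothetical illustration: N = 2, standard basis for V, made-up coordinates.
N = 2
e = np.eye(N)              # e[:, n] is the basis vector e_n
a = np.array([3.0, -1.0])  # hypothetical coordinates a^1, a^2
x = sum(a[n] * e[:, n] for n in range(N))
assert np.allclose(x, a)   # in the standard basis, the coordinate list is x itself
```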

Then \$$A:V\to W\$$ can be written as \$\sum\_{n=1}^N\sum\_{m=1}^M A^{nm}e^* \_n\otimes\tilde e\_m\$, where \$$A^{nm}\$$ is the entry in the \$$m\$$th row and \$$n\$$th column of the matrix that represents \$$A\$$ in the chosen bases, and \$$e^* \_n\$$ is the dual basis covector of \$$e\_n\$$ (the transpose of \$$e\_n\$$ when vectors are written as columns), defined by \$$e^* \_n(e\_k)=\delta\_{nk}\$$.
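The expansion above can be checked numerically. A minimal numpy sketch, assuming standard bases for \$$V\$$ (with \$$N=2\$$) and \$$W\$$ (with \$$M=3\$$), and made-up entries \$$A^{nm}\$$:

```python
import numpy as np

# Hypothetical example: standard bases for V (N = 2) and W (M = 3),
# with made-up entries playing the role of A^{nm}.
N, M = 2, 3
e = np.eye(N)        # e[:, n] is the basis vector e_n of V
e_tilde = np.eye(M)  # e_tilde[:, m] is the basis vector ẽ_m of W

rng = np.random.default_rng(0)
A_entries = rng.standard_normal((M, N))  # A_entries[m, n] stands for A^{nm}

# Each term A^{nm} e*_n ⊗ ẽ_m is the rank-one map x ↦ A^{nm} e*_n(x) ẽ_m,
# i.e. the outer product ẽ_m e_n^T scaled by A^{nm}.
A = sum(A_entries[m, n] * np.outer(e_tilde[:, m], e[:, n])
        for n in range(N) for m in range(M))

assert np.allclose(A, A_entries)  # the sum reassembles the matrix entry by entry
```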

Since linear transformations can be expressed as linear combinations of tensor products of basis vectors and dual basis covectors, the dual of a linear map is already determined once the dual is chosen for vectors: dualizing each term of \$\sum\_{n=1}^N\sum\_{m=1}^M A^{nm}e^* \_n\otimes\tilde e\_m\$ yields the transpose \$\sum\_{n=1}^N\sum\_{m=1}^M A^{nm}\tilde e^* \_m\otimes e\_n\$. So it is not consistent to choose the dual independently for vectors and for linear transformations; in particular, taking the inverse as the dual of a linear transformation does not act in a way consistent with taking the transpose as the dual of vectors, since \$$A^{-1}=A^T\$$ only when \$$A\$$ is orthogonal.
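A quick numerical check of this inconsistency, using a made-up generic invertible \$$A\$$: the transpose satisfies the compatibility \$$(A^T f)(x)=f(Ax)\$$ forced by the dual on vectors, while the inverse does not coincide with it.

```python
import numpy as np

# Made-up generic invertible A for illustration.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
assert abs(np.linalg.det(A)) > 1e-9  # invertible for this seed

# The transpose satisfies (A^T f) · x = f · (A x) for all covectors f and
# vectors x — exactly the compatibility with the dual chosen on vectors.
f = rng.standard_normal(3)
x = rng.standard_normal(3)
assert np.isclose((A.T @ f) @ x, f @ (A @ x))

# But the inverse is not the transpose for this generic (non-orthogonal) A.
assert not np.allclose(np.linalg.inv(A), A.T)
```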