>**Puzzle 283**
>Guess what the cap and cup
>$$ \cap_V \colon k \to V \otimes V^\ast, \qquad \cup_V \colon V^\ast \otimes V \to k $$
>are for a finite-dimensional vector space \\(V\\), and check your guess by proving the snake equations.
I wrote:
>\\(\cap_V \colon k \to V \otimes V^\ast \\) is the function that takes the inverse of a matrix of a vector space which sends it into its dual space. \\(\cup_V \colon V^\ast \otimes V \to k\\) is the function that takes the inverse of a dual matrix which sends it back into the vector space.
I realized that my answer above is incorrect while working out the solutions to the other puzzles. It cannot be right, since you can't take the inverse of a vector. I was approaching the problem from the wrong direction: here we are dealing with the vector spaces themselves, not with linear maps between them.
So I think the answer to this puzzle is:
First we need to define what the dual functor is. Vectors in \\(V\\) sit in the covariant position, while the dual space \\(V^\ast\\) consists of the linear functions on \\(V\\) and sits in the contravariant position. The \\((-)^\ast\\) functor swaps the two positions, and in a chosen basis dualizing a vector is just taking the transpose.
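To spell that out (standard definitions, in my own notation): on objects the dual functor takes linear functions into \\(k\\), and on linear maps it precomposes,
\[ V^\ast = \mathrm{Hom}(V, k), \qquad f \colon V \to W \quad \leadsto \quad f^\ast \colon W^\ast \to V^\ast, \quad f^\ast(w) = w \circ f. \]
In a chosen basis the matrix of \\(f^\ast\\) is the transpose of the matrix of \\(f\\), which is where the transpose picture comes from.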
\\(\cup_V \colon V^\ast \otimes V \to k\\) is the evaluation map: it pairs a covector with a vector, \\(w \otimes v \mapsto w(v) = w^iv_i\\), or in 2D components \\(w^1v_1 + w^2v_2\\), giving a scalar in \\(k\\).
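Concretely, writing the covector as a row vector and the vector as a column vector (my convention), evaluation is just matrix multiplication:
\[ \cup_V (w \otimes v) = \begin{bmatrix} w^1 & w^2 \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = w^1v_1 + w^2v_2. \]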
\\(\cap_V \colon k \to V \otimes V^\ast \\) is the coevaluation map. It goes the other way: it says every scalar \\(c \in k\\) can be written as \\(c\\) times a canonical sum of vectors tensored with covectors, so that pairing them back up with \\(\cup_V\\) returns \\(w^iv_i = c\\).
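Explicitly, picking a basis \\(e_1, e_2\\) of \\(V\\) with dual basis \\(e^1, e^2\\) (my notation; this is the standard formula for coevaluation), the map sends
\[ \cap_V(1) = e_1 \otimes e^1 + e_2 \otimes e^2, \]
which under the isomorphism \\(V \otimes V^\ast \cong \mathrm{End}(V)\\) is exactly the identity matrix. One can check this element does not depend on the choice of basis.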
The snake equations are straightforward, but there is one small detail to work out: we need to check the isomorphisms \\(V \stackrel{\sim}{\to} k \otimes V\\) and \\(V^\ast \stackrel{\sim}{\to} V^\ast \otimes k\\).
For this, we have to remember that the objects are vector spaces, and the field \\(k\\) itself, viewed as a 1-dimensional vector space, serves as the unit: an \\(n\\)-dimensional vector space tensored with a 1-dimensional vector space is still an \\(n\\)-dimensional vector space. In other words, you can always factor a scalar out of a vector, much like dividing through in homogeneous coordinates.
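In symbols, these unit isomorphisms are just scalar multiplication (a standard fact, written in my notation):
\[ k \otimes V \stackrel{\sim}{\to} V, \quad \lambda \otimes v \mapsto \lambda v, \qquad V^\ast \otimes k \stackrel{\sim}{\to} V^\ast, \quad w \otimes \lambda \mapsto \lambda w. \]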
So if we have scalar \\(k\\) and vector \\(v = \begin{bmatrix} a \\ b \end{bmatrix}\\), the coevaluation inserts the identity-matrix element \\(e_1 e_1^\mathsf{T} + e_2 e_2^\mathsf{T}\\) and the evaluation multiplies it back out, so the first snake composite is:
\[\begin{bmatrix} ka \\ kb \end{bmatrix} = k \begin{bmatrix} a \\ b \end{bmatrix} = \left( \begin{bmatrix} 1 \\ 0 \end{bmatrix} \begin{bmatrix} 1 & 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 1 \end{bmatrix} \begin{bmatrix} 0 & 1 \end{bmatrix} \right) \begin{bmatrix} a \\ b \end{bmatrix} k = \begin{bmatrix} a \\ b \end{bmatrix} k = \begin{bmatrix} ak \\ bk \end{bmatrix}\]
The other snake equation works the same way, with the tensor order and the duals reversed.
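Writing it out in the same style for a covector \\(w = \begin{bmatrix} c & d \end{bmatrix}\\) (my notation), the identity-matrix element now multiplies on the right:
\[ k \begin{bmatrix} c & d \end{bmatrix} = k \begin{bmatrix} c & d \end{bmatrix} \left( \begin{bmatrix} 1 \\ 0 \end{bmatrix} \begin{bmatrix} 1 & 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 1 \end{bmatrix} \begin{bmatrix} 0 & 1 \end{bmatrix} \right) = \begin{bmatrix} kc & kd \end{bmatrix}. \]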
... Why isn't the LaTeX working for column vectors?