John wrote:

> I _visualize_ them almost the same way.

I have no problem with picturing vectors that wrap around, but it's the _tensors_ that appear to lose the palpable interpretations that are possible in the special case of modules that are vector spaces.

Here, let me do a bit of linear algebra 101 thinking out loud, just for the sake of stating the case clearly. That will then help to explain why things look less clear in the more general case of modules.

Basic questions:

* What _is_ a tensor?

* What is the specific _tensor_ that results from taking the tensor product of two vectors/covectors?

From linear algebra 101:

In a vector space \$V\$ over the ground field \$F\$, we can give the following answers:

* A tensor is a multi-linear mapping, where the domain is a product of copies of \$V\$ and its dual \$V^*\$, and the range is the ground field \$F\$.

This is "meaty" and works for physics. Once we choose a basis for \$V\$, then a tensor becomes visualizable as a multi-dimensional array of coefficients (which transform in a certain way, when the basis changes).
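To make that concrete, here is a sketch in NumPy (not from the original post; the names \$B\$, \$P\$, \$T\$ are my own illustrative choices): a rank-2 tensor on \$\mathbb{R}^3\$ as a 3x3 array of coefficients, viewed as a bilinear machine \$V \times V \to F\$, together with its transformation law under a change of basis.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))     # coefficients of the tensor in some basis

def T(x, y):
    """The multilinear machine: feed in two vectors, get back a scalar."""
    return x @ B @ y

x, y, z = rng.standard_normal((3, 3))

# Multilinearity in the first slot: T(2x + z, y) = 2 T(x, y) + T(z, y)
assert np.isclose(T(2 * x + z, y), 2 * T(x, y) + T(z, y))

# Change of basis: if the new basis vectors are the columns of P, then
# coordinates transform as x = P @ x_new, and the coefficient array
# transforms as B_new = P.T @ B @ P -- the "certain way" above.
P = rng.standard_normal((3, 3))     # invertible with probability 1
B_new = P.T @ B @ P
x_new = np.linalg.solve(P, x)       # x_new = P^{-1} x
y_new = np.linalg.solve(P, y)
assert np.isclose(x_new @ B_new @ y_new, T(x, y))
```

The point of the last assertion is that the scalar the machine produces is basis-independent; only the array of coefficients changes.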

Further interpretations are available here. Consider a tensor of rank 2, represented by a matrix. Via matrix multiplication with a vector, it gives us a homomorphism from \$V\$ to \$V\$.
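A minimal sketch of that reading (my own illustrative example): the same kind of coefficient array, read as a matrix, acts as a homomorphism \$V \to V\$ by matrix-vector multiplication.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])      # a rank-2 tensor, pictured as a matrix

def hom(x):
    # matrix multiplication turns the array into a linear map V -> V
    return A @ x

x = np.array([1.0, -1.0])
y = np.array([4.0, 2.0])

# Linearity: hom(3x + y) = 3 hom(x) + hom(y)
assert np.allclose(hom(3 * x + y), 3 * hom(x) + hom(y))
```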

So these are the pictures available for tensors, in vector spaces: multi-linear mapping into the ground field, array of coefficients with a basis transformation law, homomorphisms with domains involving \$V\$ and \$V^*\$.

For the second question, how can we picture the tensor product of two vectors/covectors?

The tensor product of a covector (dual vector) with another covector is just the bilinear machine that results from multiplying the outputs of the two covectors.
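In NumPy terms (a sketch of my own, with covectors pictured as row vectors so that \$w(x) = w \cdot x\$):

```python
import numpy as np

# Two covectors on R^3, pictured as row vectors: w(x) = w @ x.
w1 = np.array([1.0, 0.0, 2.0])
w2 = np.array([0.0, 3.0, 1.0])

def tensor(x, y):
    # the bilinear machine: multiply the two scalar outputs
    return (w1 @ x) * (w2 @ y)

x = np.array([1.0, 1.0, 0.0])
y = np.array([2.0, 0.0, 1.0])

# Its coefficient array is the outer product, with entries w1_i * w2_j.
M = np.outer(w1, w2)
assert np.isclose(tensor(x, y), x @ M @ y)
```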

The product of a covector \$w\$ and a vector \$v\$ maps a vector \$x\$ to \$w(x) * v\$ -- this is a linear transformation.
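This, too, is easy to check numerically (an illustrative sketch; the names \$w\$, \$v\$, \$T\$ are mine):

```python
import numpy as np

w = np.array([1.0, 2.0, 0.0])   # covector, pictured as a row vector
v = np.array([3.0, 0.0, 1.0])   # vector, pictured as a column vector

def T(x):
    # x |-> w(x) * v : a rank-one linear transformation
    return (w @ x) * v

# Its matrix is the outer product v w^T, with (i,j) entry v_i * w_j.
M = np.outer(v, w)

x = np.array([1.0, 1.0, 1.0])
assert np.allclose(T(x), M @ x)
```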

In terms of matrices, we can picture the tensor product of a covector \$w\$ and a vector \$v\$ as the outer product -- obtained by matrix multiplication -- of the column vector \$v\$ and the row vector \$w\$.

Such products will only lead to a certain type of matrix, in which the \$i,j\$th entry is the product of \$v_i\$ and \$w_j\$. These are the _simple_ tensors. It is easy to see that the simple tensors span the entire space of tensors (matrices).
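The spanning claim can be checked directly: any matrix \$A\$ is the sum of the simple tensors (column \$j\$ of \$A\$) \$\otimes\$ \$e_j\$. A quick NumPy sketch (illustrative names):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))      # an arbitrary tensor (matrix)

# Write A as a sum of simple tensors: A = sum_j (column j of A) (x) e_j,
# where e_j is the j-th standard basis covector.
I = np.eye(3)
decomposition = sum(np.outer(A[:, j], I[j]) for j in range(3))
assert np.allclose(decomposition, A)
```

So while a generic tensor is not simple, every tensor is a (finite) sum of simple ones.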