@Allison+Burgers

@Allison\+Burgers

@Allison Burgers

@Allison%20Burgers

@"Allison Burgers"

@Allison\ Burgers

@Allison\%20Burgers

(maybe the spaces were a bad idea)

ANYway, coming in late to this discussion, I don't see any other answers to your puzzles. My linear algebra is very rusty, but here's my go:

AB1:

A linear subspace X of V can be identified by a basis B, a subset of V. T(X) can then be defined as the linear subspace of W spanned by T(B), that is, by the set of T(b) for each b in B.

If X ≤ Y, then X is a subspace of Y, and we can extend B to a basis C of Y, so that C is a superset of B. Then T(B) is a subset of T(C), and therefore T(X) is a subspace of T(Y); that is, T(X) ≤ T(Y) under the subspace ordering on W. Thus T is monotone.
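If it helps, here's a quick numerical sanity check of AB1 — just a NumPy sketch, where `spans_contain` is a hypothetical helper of mine that tests containment of column spans via least squares:

```python
import numpy as np

def spans_contain(big, small, tol=1e-8):
    """True if every column of `small` lies in the column span of `big`."""
    # Least-squares projection of `small` onto span(big); a (near-)zero
    # residual means containment.
    coeffs, *_ = np.linalg.lstsq(big, small, rcond=None)
    return np.linalg.norm(big @ coeffs - small) < tol

rng = np.random.default_rng(0)
T = rng.normal(size=(5, 4))                  # a linear map T : R^4 -> R^5
B = rng.normal(size=(4, 2))                  # basis of X (columns)
C = np.hstack([B, rng.normal(size=(4, 1))])  # basis of Y; C is a superset of B

assert spans_contain(C, B)                   # X <= Y
assert spans_contain(T @ C, T @ B)           # T(X) <= T(Y): monotone
```

(With a random T the containment T(X) ≤ T(Y) holds by the basis argument above, and the check passes.)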

AB2:

Let K be the kernel of T, i.e. the "largest" subspace of V with T(K) = {0}, and let P be the orthogonal complement in W of the image T(V). The right adjoint should basically map P to 0, apply a linear inverse of T from T(V) back to (a subspace of) V, and always "add" K — in other words, take the preimage.

AFAICT, a right adjoint should always exist, although for a highly degenerate transform it will be very simple. If T maps V (and thus any subspace) to zero, the right adjoint will map any subspace of W, including W itself and zero, to the whole space V.
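Here's a rough NumPy sketch of that right adjoint as a preimage computation — the helper names (`null_space`, `preimage`) are mine, and I'm assuming the map described above really is the preimage v ↦ {v : T v ∈ Y}:

```python
import numpy as np

def null_space(A, tol=1e-8):
    """Orthonormal basis (as columns) of the null space of A, via SVD."""
    u, s, vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

def preimage(T, Y):
    """Basis of T^{-1}(span Y): all v with T v inside span(Y)."""
    Q, _ = np.linalg.qr(Y)                 # orthonormal basis of Y
    P = Q @ Q.T                            # orthogonal projector onto Y
    # T v lies in Y  <=>  (I - P) T v = 0
    return null_space((np.eye(T.shape[0]) - P) @ T)

rng = np.random.default_rng(1)
T = rng.normal(size=(3, 4))                # T : R^4 -> R^3, 1-dim kernel
Y = rng.normal(size=(3, 1))                # a random line in W = R^3
X = preimage(T, Y)

# Sanity checks: T maps the preimage into Y, and the preimage
# contains the kernel K ("adds K").
Q, _ = np.linalg.qr(Y)
assert np.linalg.norm((np.eye(3) - Q @ Q.T) @ (T @ X)) < 1e-8
K = null_space(T)
c, *_ = np.linalg.lstsq(X, K, rcond=None)
assert np.linalg.norm(X @ c - K) < 1e-8
```

This also shows the degenerate case in passing: if T were the zero map, `(I - P) @ T` would be zero and `null_space` would return all of V, for any Y.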

A left adjoint should likewise apply a linear inverse of T from T(V) back to (a subspace of) V, but it should not "add" K. However, a left adjoint can exist only if P is the zero subspace, i.e. T is surjective; otherwise we run into trouble.
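A tiny sketch of the trouble when P is nonzero (the helper name is mine; T here is just a concrete non-surjective map): the largest value T(-) ever takes is Im(T) = T(V), so if some Y pokes outside the image, Y ≤ T(X) holds for no X at all, and no candidate L(Y) could satisfy the adjunction L(Y) ≤ X ⟺ Y ≤ T(X).

```python
import numpy as np

def contained_in(big, small, tol=1e-8):
    """True if every column of `small` lies in the column span of `big`."""
    coeffs, *_ = np.linalg.lstsq(big, small, rcond=None)
    return np.linalg.norm(big @ coeffs - small) < tol

T = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])           # Im(T) = the xy-plane in R^3
y = np.array([[0.0], [0.0], [1.0]])  # span(y) sticks out of Im(T)

# span(y) is not even inside T(V), so Y <= T(X) fails for every X.
assert not contained_in(T, y)
```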

AB3:

This is where my linear algebra knowledge runs out!

![vector space linear transform graph](https://getupstreettheater.files.wordpress.com/2018/06/vector_space_linear_transform_adjoints2.png)

On the left is an example in which T is surjective, so we get a left adjoint in addition to the right adjoint. On the right is an example in which Image(T) is a proper subspace of W (P is nonzero), so the right adjoint is richer, in a sense, but a left adjoint is not possible.
