This is a sequel.
Tensor Product
Tensor product is a very important, very commonplace concept in mathematics. We speak of tensor products of vector spaces and tensor products of modules, but the latter will not concern us in this post. Tensor product of vector spaces logically belongs to the realm of linear algebra, but is not covered in any linear algebra course I know. The reason is probably pedagogical: linear algebra is an introductory course, intended for people with little experience in mathematics. On the other hand, tensor product is a rather sophisticated concept in comparison. Sometimes people refer to tensors as multilinear algebra, but I don’t find this to be a natural category. For instance, the theory of bilinear forms is a part of linear algebra, not “bilinear algebra”.
Definition
Fix k a field. Consider V and W, finite-dimensional vector spaces over k. Define the tensor product of V and W, denoted V (x) W (pardon my ASCII), to be the vector space of bilinear mappings
V* x W* –> k.
basic properties
- V (x) W is naturally isomorphic to W (x) V
- V (x) k is naturally isomorphic to V. This is so because k* is naturally isomorphic to k and V** is naturally isomorphic to V.
- V (x) {0} is naturally isomorphic to {0}, the 0-dimensional vector space.
- Consider V, U, W vector spaces. Then (V (+) U) (x) W is naturally isomorphic to V (x) W (+) U (x) W. Here “(+)” denotes direct sum of vector spaces, i.e. V (+) U is the vector space of ordered pairs (v, u) where v is in V and u is in U.
- (V (x) W)* is naturally isomorphic to V* (x) W*.
- V* (x) W* is naturally isomorphic to the vector space of bilinear mappings V x W –> k.
- V* (x) W is naturally isomorphic to Hom(V, W), the vector space of linear operators (=homomorphisms) V –> W (see the sketch after this list).
- For dim V = 1, Hom(V, V) is also 1-dimensional and it has a special basis consisting of the identity operator. Hence Hom(V, V) is naturally isomorphic to k, and so is V* (x) V. Thus, for 1-dimensional spaces, the dual vector space is the inverse vector space with respect to tensor product.
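To make the identification of V* (x) W with Hom(V, W) concrete, here is a minimal numpy sketch (my own illustration, with real coordinates standing in for a general ground field k). A simple tensor a (x) w acts on v in V by v |-> a(v) w, which in coordinates is the rank-1 matrix w a^T; sums of such rank-1 operators give all of Hom(V, W).

```python
import numpy as np

# A simple tensor a (x) w, with a in V* and w in W, acts on v in V
# by v |-> a(v) w. In coordinates this is the rank-1 matrix w a^T.
a = np.array([1.0, 2.0, 3.0])   # a functional on V = R^3, in dual coordinates
w = np.array([4.0, 5.0])        # a vector in W = R^2
A = np.outer(w, a)              # the 2x3 matrix of the operator v |-> a(v) w

v = np.array([1.0, 0.0, -1.0])
assert np.allclose(A @ v, a.dot(v) * w)  # the operator acts as claimed
```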
tensor product of vectors
Consider v a vector in V, w a vector in W. Then we construct
the tensor product of v and w, denoted v (x) w, a vector in V (x) W. By definition, v (x) w is supposed to be a bilinear mapping
v (x) w: V* x W* –> k. Consider a in V* and b in W*. We define
(v (x) w)(a, b) = a(v) b(w) (take your time to parse this expression).
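As a quick sanity check (a numpy sketch of my own, identifying V** with V so that v (x) w becomes the matrix of its values on pairs of dual basis vectors), v (x) w is represented by the outer product:

```python
import numpy as np

# v (x) w as a bilinear map V* x W* -> k, represented by the matrix
# of its values on pairs of dual basis vectors: the outer product v w^T.
v = np.array([1.0, 2.0, 3.0])   # v in V = R^3
w = np.array([4.0, 5.0])        # w in W = R^2
t = np.outer(v, w)              # 3x2 array representing v (x) w

a = np.array([1.0, -1.0, 0.0])  # a in V*, in dual coordinates
b = np.array([2.0, 1.0])        # b in W*
assert np.isclose(a @ t @ b, a.dot(v) * b.dot(w))  # (v (x) w)(a, b) = a(v) b(w)
```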
basis
Suppose e1 … en is a basis of V and f1 … fm a basis of W.
Claim
{ei (x) fj} is a basis of V (x) W. In particular,
dim (V (x) W) = dim V dim W.
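Here is a hedged numerical check of the claim for n = 3, m = 2 (again identifying tensors with matrices over R): the outer products e_i (x) f_j are the matrix units, and they are linearly independent.

```python
import numpy as np

# The n*m tensors e_i (x) f_j, realized as matrix units, span all n x m
# arrays; their linear independence gives dim(V (x) W) = n * m.
n, m = 3, 2
units = [np.outer(np.eye(n)[i], np.eye(m)[j]) for i in range(n) for j in range(m)]
rank = np.linalg.matrix_rank(np.stack([u.ravel() for u in units]))
assert rank == n * m
```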
(Anti)Symmetric Tensors
Fix a vector space V. Consider the vector space V (x) V. Elements of V (x) V are called tensors of rank 2 over V. We construct a linear operator s: V (x) V –> V (x) V as follows. Consider t an element of
V (x) V. By definition, t is a bilinear mapping V* x V* –> k. We need to describe s(t), also an element of V (x) V hence also a bilinear mapping V* x V* –> k. Consider a, b in V*. We define
s(t)(a, b) = t(b, a).
It is easy to see V (x) V splits into a direct sum of two subspaces:
S^2(V) and L^2(V). S^2(V) consists of t in V (x) V such that s(t) = t, i.e. it is the eigenspace of s corresponding to eigenvalue +1. Elements of S^2(V) are called symmetric tensors of rank 2 over V. L^2(V) consists of t in V (x) V such that s(t) = –t, i.e. it is the eigenspace of s corresponding to eigenvalue –1. Elements of L^2(V) are called antisymmetric tensors of rank 2 over V. The direct sum structure of V (x) V follows from the observation that s^2 = 1: any t decomposes as t = (t + s(t))/2 + (t – s(t))/2 (assuming the characteristic of k is not 2, so that we may divide by 2).
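In coordinates (a numpy sketch over R, with a rank-2 tensor stored as a square matrix), s is transposition, and the direct sum decomposition is the familiar split of a matrix into its symmetric and antisymmetric parts:

```python
import numpy as np

# Rank-2 tensors over V = R^3 as 3x3 arrays; s transposes the arguments.
rng = np.random.default_rng(0)
t = rng.standard_normal((3, 3))

sym_part = (t + t.T) / 2     # the component in S^2(V): s(t) = t
asym_part = (t - t.T) / 2    # the component in L^2(V): s(t) = -t

assert np.allclose(sym_part + asym_part, t)   # t = (t + s(t))/2 + (t - s(t))/2
assert np.allclose(sym_part, sym_part.T)
assert np.allclose(asym_part, -asym_part.T)
```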
More generally, consider the vector space
T^k(V) := V (x) V (x) … (x) V for k copies of V. Elements of T^k(V) are called tensors of rank k over V. Consider
p a permutation of k elements (i.e. a bijection
{1, 2 … k} –> {1, 2 … k}). We define a linear operator
s_p: T^k(V) –> T^k(V) by the condition
s_p(t)(a_1, a_2 … a_k) = t(a_p(1), a_p(2) … a_p(k)). Here t is an element of T^k(V) and a_1, a_2 … a_k are elements of V*.
We define S^k(V) to be the subspace of T^k(V) consisting of t such that for any permutation of k elements p, s_p(t) = t. Elements of
S^k(V) are called symmetric tensors of rank k over V. Remember that permutations can be divided into odd and even. An odd permutation is the composition of an odd number of transpositions, i.e. permutations which swap two elements of {1, 2 … k}. An even permutation is the composition of an even number of such transpositions. We define L^k(V) to be the subspace of T^k(V) consisting of those t such that for any permutation of k elements p,
s_p(t) = sgn(p) t. Here sgn(p) is +1 for p even and –1 for p odd. Elements of L^k(V) are called antisymmetric tensors of rank k over V. For k > 2, the direct sum of S^k(V) and L^k(V) is not the entire space T^k(V).
There is a natural projection operator sym: T^k(V) –> S^k(V). Consider t in T^k(V). Then, by definition, sym(t) = (1/k!) S_p s_p(t), where the sum S_p ranges over all permutations p of k elements (this requires k! to be invertible in k, which holds e.g. in characteristic 0). There is also a natural projection operator asym: T^k(V) –> L^k(V). Given t in T^k(V), we define asym(t) = (1/k!) S_p sgn(p) s_p(t).
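Here is a small numpy sketch of sym and asym (my own illustration; a rank-k tensor over V = R^n is stored as a k-dimensional n x … x n array, and s_p becomes a permutation of the array axes; since we sum over all permutations, it does not matter whether an axis permutation realizes p or its inverse):

```python
import numpy as np
from itertools import permutations
from math import factorial

def sgn(p):
    # sign of the permutation p, computed by counting inversions
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def sym(t):
    # sym(t) = (1/k!) S_p s_p(t), with s_p acting by permuting axes
    return sum(t.transpose(p) for p in permutations(range(t.ndim))) / factorial(t.ndim)

def asym(t):
    # asym(t) = (1/k!) S_p sgn(p) s_p(t)
    return sum(sgn(p) * t.transpose(p) for p in permutations(range(t.ndim))) / factorial(t.ndim)

t = np.random.default_rng(1).standard_normal((3, 3, 3))   # a rank-3 tensor
st, at = sym(t), asym(t)
assert np.allclose(st, st.transpose(1, 0, 2))    # sym(t) lies in S^3(V)
assert np.allclose(at, -at.transpose(1, 0, 2))   # asym(t) lies in L^3(V)
assert not np.allclose(st + at, t)  # for k > 2, S^k(V) (+) L^k(V) is not all of T^k(V)
```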
Claim
Consider v1, v2 … vk elements of V. Then
asym(v1 (x) v2 (x) … (x) vk) is a non-vanishing element of L^k(V) if and only if v1, v2 … vk are linearly independent.
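A quick spot check of the claim for k = 2 (a numpy sketch over R): asym(v1 (x) v2) = (v1 v2^T – v2 v1^T)/2, which vanishes exactly when v1 and v2 are proportional.

```python
import numpy as np

def asym2(v, w):
    # asym(v (x) w) for rank 2, as a matrix
    return (np.outer(v, w) - np.outer(w, v)) / 2

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([0.0, 1.0, 1.0])
assert np.any(asym2(v1, v2))        # v1, v2 independent: asym does not vanish

v3 = 2 * v1                         # v3 depends on v1
assert not np.any(asym2(v1, v3))    # asym vanishes on dependent vectors
```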
Suppose dim V = n. Consider e1 … en a basis of V. Consider
N = {1, 2 … n}^k, the set of all ordered k-tuples made of elements of {1, 2 … n}. Clearly #N, the number of elements of N, is n^k. Consider I = (i_1, i_2 … i_k) in N. We define E_I in T^k(V) to be e_i_1 (x) e_i_2 (x) … (x) e_i_k.
Claim
The E_I form a basis of T^k(V). In particular, dim T^k(V) = n^k.
This claim follows from our previous claim about general tensor products.
Define S(n, k) to be the set of all subsets of {1, 2 … n} of size k. Evidently #S(n, k) is the binomial coefficient (n k). Consider
I = {i_1, i_2 … i_k} in S(n, k). We define F_I in L^k(V) to be
asym(e_i_1 (x) e_i_2 (x) … (x) e_i_k). I’m cheating a bit here, since this expression depends on the order of i_1 … i_k. However, the only ambiguity is the sign: even permutations don’t change the expression, whereas odd permutations change its sign. For our purposes, we can make an arbitrary choice of order/sign for each I in S(n, k).
Claim
The F_I form a basis of L^k(V). In particular, dim L^k(V) = (n k).
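A hedged numerical check of this dimension count (my own numpy sketch, for n = 4 and k = 2): antisymmetrize every basis tensor E_I and count how many linearly independent results survive.

```python
import numpy as np
from itertools import permutations
from math import comb, factorial

n, k = 4, 2

def sgn(p):
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def asym(t):
    return sum(sgn(p) * t.transpose(p) for p in permutations(range(t.ndim))) / factorial(t.ndim)

images = []
for I in np.ndindex(*([n] * k)):
    E = np.zeros([n] * k)   # the basis tensor E_I
    E[I] = 1.0
    images.append(asym(E).ravel())
# asym projects T^k(V) onto L^k(V); the rank of its image is dim L^k(V)
assert np.linalg.matrix_rank(np.array(images)) == comb(n, k)
```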
Define M(n, k) to be the set of all multisets of size k made of elements of {1, 2 … n}. Multisets are like sets except that each element can appear in a multiset several times. The size of a multiset is defined by counting the elements with multiplicity. We have
#M(n, k) = (n + k – 1 k). Consider
I = {i_1, i_2 … i_k} in M(n, k). Here, i_m may coincide with i_l for some m and l. We define G_I in S^k(V) to be
sym(e_i_1 (x) e_i_2 (x) … (x) e_i_k).
Claim
The G_I form a basis of S^k(V). In particular,
dim S^k(V) = (n + k – 1 k).
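And the same kind of check on the symmetric side (same assumptions: numpy, n = 4, k = 2): the image of sym should have dimension (n + k – 1 k) = (5 2) = 10.

```python
import numpy as np
from itertools import permutations
from math import comb, factorial

n, k = 4, 2

def sym(t):
    return sum(t.transpose(p) for p in permutations(range(t.ndim))) / factorial(t.ndim)

images = []
for I in np.ndindex(*([n] * k)):
    E = np.zeros([n] * k)   # the basis tensor E_I
    E[I] = 1.0
    images.append(sym(E).ravel())
# sym projects T^k(V) onto S^k(V); the rank of its image is dim S^k(V)
assert np.linalg.matrix_rank(np.array(images)) == comb(n + k - 1, k)
```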
Vector Space Determinant
The concept of a vector space determinant is standard and widely used, though not under this name and notation.
Definition
Consider a vector space V with dim V = n. Then the determinant of V, denoted Det V, is the vector space L^n(V).
basic properties
- dim Det V = 1
- Det (V (+) W) is naturally isomorphic to Det V (x) Det W
- Suppose W is a subspace of V. Then Det V is naturally isomorphic to Det W (x) Det (V/W).
- Det (V*) is naturally isomorphic to (Det V)*
- Consider A: V –> W a linear operator and m a natural number. Then there is a naturally induced operator
L^m(A): L^m(V) –> L^m(W). In the special case
dim V = dim W = m we get the operator
det A: Det V –> Det W. Let us specialize further to the case
V = W. Then det A: Det V –> Det V is simply multiplication by a constant c in the ground field k (since Det V is
1-dimensional). c is the conventional determinant of the operator A we all know and love.
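Here is a small numerical confirmation for n = 2 (a numpy sketch over R): applying A to each factor of asym(e1 (x) e2) rescales it by exactly the familiar determinant.

```python
import numpy as np

def asym2(v, w):
    # asym(v (x) w) for rank 2, as a matrix
    return (np.outer(v, w) - np.outer(w, v)) / 2

A = np.array([[1.0, 2.0], [3.0, 4.0]])
e1, e2 = np.eye(2)

# det A: Det V -> Det V multiplies the top antisymmetric tensor by det A
lhs = asym2(A @ e1, A @ e2)
rhs = np.linalg.det(A) * asym2(e1, e2)
assert np.allclose(lhs, rhs)
```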
Consider v1, v2 … vn elements of V. By a previous claim,
asym(v1 (x) v2 (x) … (x) vn) is a non-vanishing element of Det V if and only if v1, v2 … vn form a basis of V. Also, suppose e1 … en and f1 … fn are two bases of V related by the n x n matrix P, i.e.
(e1 … en) = (f1 … fn) P. Then
asym(e1 (x) … (x) en) = (det P) asym(f1 (x) … (x) fn).
Chirality
Consider V a complex quadratic vector space (that is, a complex vector space equipped with a non-degenerate quadratic form) of dimension n. Then Det V contains two special non-vanishing elements that differ by a sign, say w and –w. These elements are constructed as follows. Consider e1 … en an orthonormal basis of V. Then
asym(e1 (x) e2 (x) … (x) en) is a non-vanishing element of Det V. Any two orthonormal bases are related by an orthogonal n x n matrix O. Since O is orthogonal, we have either det O = +1 or det O = –1. Thus asym(e1 (x) e2 (x) … (x) en) can only differ by a sign for different orthonormal bases, yielding w and –w.
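A sketch of this sign ambiguity for n = 2 (numpy over R, where orthogonal matrices behave the same way as in the complex case): rotations (det O = +1) preserve asym(e1 (x) e2), while reflections (det O = –1) flip its sign.

```python
import numpy as np

def asym2(v, w):
    return (np.outer(v, w) - np.outer(w, v)) / 2

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation, det R = +1
F = np.array([[1.0, 0.0], [0.0, -1.0]])           # a reflection, det F = -1

e1, e2 = np.eye(2)
w_elt = asym2(e1, e2)
assert np.allclose(asym2(R @ e1, R @ e2), w_elt)    # same special element w
assert np.allclose(asym2(F @ e1, F @ e2), -w_elt)   # the opposite one, -w
```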
Suppose W is an isotropic subspace of V. Any element v of V defines a linear functional v* on V by v*(u) = Q(v, u). Here u is an arbitrary element of V and Q is the bilinear form associated with the quadratic form of V. If v and u both belong to W, we get v*(u) = Q(v, u) = 0 since W is isotropic. Thus, given v in W, v* is a linear functional vanishing on W. Since it vanishes on W, it determines a linear functional a on V/W. This can be seen as follows. Any u in V/W can be represented by some u’ in V. We can then set a(u) = v*(u’). However, u’ is only defined up to adding an arbitrary element w of W. But this is OK because v*(u’ + w) = v*(u’) + v*(w) = v*(u’), since the second term vanishes by our previous observation. Thus a is well defined.
For any v in W we constructed a linear functional on V/W i.e. an element of (V/W)*. This yields a linear operator W –> (V/W)*. Now suppose dim V = 2m and W is maximal isotropic, that is,
dim W = m. Then our linear operator is an isomorphism: it is injective because Q is non-degenerate (if v* vanishes on V/W then v*(u’) = 0 for all u’ in V, forcing v = 0), and it is surjective by the dimension count dim W = m = dim (V/W)*.
In the following, we use the symbol “=” to denote “naturally isomorphic”. Det V = Det W (x) Det (V/W). By the above,
W = (V/W)*. Hence
Det V = Det ((V/W)*) (x) Det (V/W) = (Det (V/W))* (x) Det(V/W) = C. In particular we get a special element in Det V: the element corresponding to 1 in C.
Claim
The special element is either w or –w. This invariant divides maximal isotropic subspaces into two classes (chiralities). Any two maximal isotropic subspaces W and W’ are related by an orthogonal operator O: V –> V. For det O = +1, W and W’ have the same chirality. For
det O = –1, W and W’ have opposite chirality.