PRACTICAL EXAMPLES AND REVIEW OF TENSORS:

From this question:

If we have two covectors in \(V^*\):

\[\beta=\begin{bmatrix}1 &2 &3 \end{bmatrix}\] and \[\gamma=\begin{bmatrix}2 &4 &6 \end{bmatrix}\]

the \((2,0)\)-tensor \(\beta\otimes \gamma\) is the outer product:

\[\beta\otimes_o \gamma=\begin{bmatrix}2\,e^1\otimes e^1&4\,e^1\otimes e^2&6\,e^1\otimes e^3\\4\,e^2\otimes e^1&8\,e^2\otimes e^2&12\,e^2\otimes e^3\\6\,e^3\otimes e^1&12\,e^3\otimes e^2&18\,e^3\otimes e^3\end{bmatrix}\]
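As a numerical check in R, the coefficients above are just the outer product of the two coefficient vectors:

beta <- c(1, 2, 3); gamma <- c(2, 4, 6)
outer(beta, gamma)   # entry (i, j) is beta_i * gamma_j
##      [,1] [,2] [,3]
## [1,]    2    4    6
## [2,]    4    8   12
## [3,]    6   12   18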

Now, if we apply this tensor to the vectors

\[v=\begin{bmatrix}1\\-1\\5\end{bmatrix}, \; w = \begin{bmatrix}2\\0\\3\end{bmatrix}\]

\[\begin{align} (\beta \otimes \gamma)[v,w] =\;& 2 \times 1 \times 2 \quad+\quad 4 \times 1 \times 0 \quad+\quad 6 \times 1 \times 3 \\ +\;& 4 \times (-1) \times 2 \quad+\quad 8 \times (-1) \times 0 \quad+\quad 12 \times (-1) \times 3 \\ +\;& 6 \times 5 \times 2 \quad+\quad 12 \times 5 \times 0 \quad+\quad 18 \times 5 \times 3 \\[2ex] =\;& 308 \end{align}\]

Is this correct?

Yes, this is correct. The answer is \(\left<\beta,v\right> \cdot \left<\gamma,w\right>\), where \(\langle\,\cdot,\cdot\,\rangle\) is the usual pairing of a covector with a vector:

\[(\vec \beta \cdot \vec v)\,(\vec \gamma \cdot \vec w) = 14 \times 22 = 308.\]

v <- c(1, -1, 5); w <- c(2, 0, 3)
beta <- 1:3; gamma <- c(2, 4, 6)
(beta %*% v) * (gamma %*% w)   # <beta, v> * <gamma, w>
##      [,1]
## [1,]  308
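Equivalently, reusing the objects just defined, we can contract the component matrix of \(\beta\otimes\gamma\) with \(v\) and \(w\) (`M` is just a name for that matrix):

M <- outer(beta, gamma)   # component matrix of beta ⊗ gamma, as above
t(v) %*% M %*% w          # sum over i, j of M[i,j] * v[i] * w[j]
##      [,1]
## [1,]  308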

From this source:



Applied to the case in this question, the change of basis matrix is \(\small\begin{bmatrix}3&4&-1\\0&3&7\\1&3&0.5\end{bmatrix}\), and its inverse is approximately \(\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}\) (entries rounded to one decimal place). The vectors \(v\) and \(w\) in the new coordinate system are

\(v =\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}\begin{bmatrix}1\\2\\3\end{bmatrix} =\begin{bmatrix}-2.3\\1.9\\-0.5\end{bmatrix}\) and \(w=\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}\begin{bmatrix}1\\0\\0\end{bmatrix}=\begin{bmatrix}0.7\\-0.3\\0.1\end{bmatrix}\), where the results are computed with the exact inverse and then rounded to one decimal place.
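A quick check in R (a sketch: `solve(A, b)` returns \(A^{-1}b\) without ever rounding the inverse):

A <- matrix(c(3, 0, 1,  4, 3, 3,  -1, 7, 0.5), nrow = 3)  # filled column-wise: each triple is a column
solve(A, c(1, 2, 3))   # v in the new basis: approximately -2.31  1.85 -0.51
solve(A, c(1, 0, 0))   # w in the new basis: approximately  0.71 -0.25  0.11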

Therefore,

\[\begin{align} v\otimes w&=\left(-2.3\,\tilde x + 1.9\,\tilde y -0.5\, \tilde z\right)\otimes \left(0.7\,\tilde x -0.3\,\tilde y + 0.1\,\tilde z\right)\\[2ex] &=-1.6\;\tilde x\otimes \tilde x + 0.6\;\tilde x\otimes \tilde y -0.3\;\tilde x\otimes \tilde z + 1.3\;\tilde y\otimes \tilde x -0.5\;\tilde y\otimes \tilde y + 0.2\;\tilde y\otimes \tilde z -0.4\;\tilde z\otimes \tilde x + 0.1\;\tilde z\otimes \tilde y -0.1\;\tilde z\otimes \tilde z\end{align}\]

(all coefficients computed from the exact components and rounded to one decimal place).

So what’s the point?

Having started by defining the tensor product of two vector spaces (\(V\otimes W\)) over the same basis, we end up simply calculating the outer product of two vectors:

\[v\otimes_o w=\small \begin{bmatrix}-2.3\\1.9\\-0.5\end{bmatrix}\begin{bmatrix}0.7&-0.3&0.1\end{bmatrix}=\begin{bmatrix}-1.6&0.6&-0.3\\1.3&-0.5&0.2\\-0.4&0.1&-0.1\end{bmatrix}\]
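This matrix can be reproduced in R from the exact (unrounded) components, reusing the change-of-basis matrix \(A\) from above:

A  <- matrix(c(3, 0, 1,  4, 3, 3,  -1, 7, 0.5), nrow = 3)  # change-of-basis matrix, filled column-wise
vn <- solve(A, c(1, 2, 3))   # exact components of v in the new basis
wn <- solve(A, c(1, 0, 0))   # exact components of w in the new basis
round(outer(vn, wn), 1)      # component matrix of v ⊗ w, rounded to one decimal
##      [,1] [,2] [,3]
## [1,] -1.6  0.6 -0.3
## [2,]  1.3 -0.5  0.2
## [3,] -0.4  0.1 -0.1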

This connects this post to this more general question.


From the answer to this question about Spivak's definition of the tensor product as:

\[(S \otimes T)(v_1,\ldots,v_n,v_{n+1},\ldots,v_{n+m})= S(v_1,\ldots,v_n) \cdot T(v_{n+1},\ldots,v_{n+m})\]

Let’s first set some terminology.

Let \(V\) be an \(n\)-dimensional real vector space, and let \(V^*\) denote its dual space. We let \(V^k = V \times \cdots \times V\) (\(k\) times).

Definition: A covariant \(k\)-tensor on \(V\) is a multilinear map \(T\colon V^k \to \mathbb{R}\). In other words, a covariant \(k\)-tensor is a tensor of type \((k,0)\). This is what Spivak refers to as simply a “\(k\)-tensor.”

Definition: A contravariant \(k\)-tensor on \(V\) is a multilinear map \(T\colon (V^*)^k \to \mathbb{R}\). In other words, a contravariant \(k\)-tensor is a tensor of type \((0,k)\).

\[\begin{align*} T^k(V) := T^k_0(V) & = \{\text{covariant $k$-tensors}\} \\ T_k(V) := T^0_k(V) & = \{\text{contravariant $k$-tensors}\}. \end{align*}\] Two important special cases are: \[\begin{align*} T^1(V) & = \{\text{covariant $1$-tensors}\} = V^* \\ T_1(V) & = \{\text{contravariant $1$-tensors}\} = V^{**} \cong V. \end{align*}\] This last line means that we can regard vectors \(v \in V\) as contravariant 1-tensors. That is, every vector \(v \in V\) can be regarded as a linear functional \(V^* \to \mathbb{R}\) via \[v(\omega) := \omega(v),\] where \(\omega \in V^*\).

The rank of an \((r,s)\)-tensor is defined to be \(r+s\). In particular, vectors (contravariant 1-tensors) and dual vectors (covariant 1-tensors) have rank 1.


If \(S \in T^{r_1}_{s_1}(V)\) is an \((r_1,s_1)\)-tensor, and \(T \in T^{r_2}_{s_2}(V)\) is an \((r_2,s_2)\)-tensor, we can define their tensor product \(S \otimes T \in T^{r_1 + r_2}_{s_1 + s_2}(V)\) by

\[(S\otimes T)(v_1, \ldots, v_{r_1 + r_2}, \omega_1, \ldots, \omega_{s_1 + s_2}) = \\ S(v_1, \ldots, v_{r_1}, \omega_1, \ldots,\omega_{s_1})\cdot T(v_{r_1 + 1}, \ldots, v_{r_1 + r_2}, \omega_{s_1 + 1}, \ldots, \omega_{s_1 + s_2}).\]

Taking \(s_1 = s_2 = 0\), we recover Spivak’s definition as a special case.
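In R, Spivak's formula is just a higher-order function; the following sketch (with made-up names `tensor_product`, `beta_fn`, `gamma_fn`) feeds the first \(n\) arguments to \(S\) and the rest to \(T\):

tensor_product <- function(S, T, n) {
  # S takes the first n arguments, T takes the remaining ones
  function(...) {
    args <- list(...)
    do.call(S, args[1:n]) * do.call(T, args[-(1:n)])
  }
}
beta_fn  <- function(v) sum(c(1, 2, 3) * v)   # the covector beta from the first example
gamma_fn <- function(w) sum(c(2, 4, 6) * w)   # the covector gamma from the first example
bg <- tensor_product(beta_fn, gamma_fn, n = 1)
bg(c(1, -1, 5), c(2, 0, 3))   # 308, matching (beta ⊗ gamma)[v, w] above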

Example: Let \(u, v \in V\). Again, since \(V \cong T_1(V)\), we can regard \(u, v \in T_1(V)\) as \((0,1)\)-tensors. Their tensor product \(u \otimes v \in T_2(V)\) is a \((0,2)\)-tensor defined by \[(u \otimes v)(\omega, \eta) = u(\omega)\cdot v(\eta)\]
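As a tiny R sketch of this example (the vectors `u`, `v` and the test covectors are made-up values):

u <- c(1, 2, 3); v <- c(4, 5, 6)   # hypothetical vectors, regarded as (0,1)-tensors
uv <- function(omega, eta) sum(omega * u) * sum(eta * v)   # (u ⊗ v)(omega, eta)
uv(c(1, 0, 0), c(0, 1, 0))   # the (1,2) component: u[1] * v[2] = 5
outer(u, v)[1, 2]            # the same component read off the matrix of u ⊗ v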


As I suggested in the comments, every bilinear map – i.e. every rank-2 tensor, be it of type \((0,2)\), \((1,1)\), or \((2,0)\) – can be regarded as a matrix, and vice versa.

Admittedly, sometimes the notation can be constraining. That is, we’re used to considering vectors as column vectors, and dual vectors as row vectors. So, when we write something like \[u^\top A v,\] our notation suggests that \(u^\top \in T^1(V)\) is a dual vector and that \(v \in T_1(V)\) is a vector. This means that the bilinear map \(V \times V^* \to \mathbb{R}\) given by \[(v, u^\top) \mapsto u^\top A v\] is a type \((1,1)\)-tensor.

Example: Let \(V = \mathbb{R}^3\). Write \(u = (1,2,3) \in V\) in the standard basis, and \(\eta = (4,5,6)^\top \in V^*\) in the dual basis. For the inputs, let’s also write \(\omega = (x,y,z)^\top \in V^*\) and \(v = (p,q,r) \in V\). Then \[\begin{align*} (u \otimes \eta)(\omega, v) & = u(\omega) \cdot \eta(v) \\ & = (x,y,z)\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} \cdot (4,5,6) \begin{pmatrix} p \\ q \\ r \end{pmatrix} \\ & = (x + 2y + 3z)(4p + 5q + 6r) \\ & = 4px + 5qx + 6rx \\ & \ \ \ \ \ + 8py + 10qy + 12ry \\ & \ \ \ \ \ + 12pz + 15qz + 18rz \\ & = (x,y,z)\begin{pmatrix} 4 & 5 & 6 \\ 8 & 10 & 12 \\ 12 & 15 & 18 \end{pmatrix}\begin{pmatrix} p \\ q \\ r \end{pmatrix} \\ & = \omega \begin{pmatrix} 4 & 5 & 6 \\ 8 & 10 & 12 \\ 12 & 15 & 18 \end{pmatrix} v. \end{align*}\]

Conclusion: The tensor \(u \otimes \eta \in T^1_1(V)\) is the bilinear map \((\omega, v)\mapsto \omega A v\), where \(A\) is the \(3 \times 3\) matrix above.
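A short R check of this conclusion (the inputs `omega` and `v` are arbitrary made-up values):

u <- c(1, 2, 3); eta <- c(4, 5, 6)
A <- outer(u, eta)                       # the 3 x 3 matrix from the example
omega <- c(1, -1, 2); v <- c(3, 0, 1)    # arbitrary test inputs
t(omega) %*% A %*% v                     # omega A v
sum(omega * u) * sum(eta * v)            # u(omega) * eta(v): the same number, 90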

The Wikipedia article you linked to would then regard the matrix \(A\) as being equal to the tensor product \(u \otimes \eta\).


Finally, I should point out two things that you might encounter in the literature.

First, some authors take the definition of an \((r,s)\)-tensor to mean a multilinear map \(V^s \times (V^*)^r \to \mathbb{R}\) (note that the \(r\) and \(s\) are reversed). This also means that some indices will be raised instead of lowered, and vice versa. You’ll just have to check each author’s conventions every time you read something.

Second, note that there is also a notion of tensor products of vector spaces. Many textbooks, particularly ones focused on abstract algebra, regard this as the central concept. I won’t go into this here, but note that there is an isomorphism \[T^r_s(V) \cong \underbrace{V^* \otimes \cdots \otimes V^*}_{r\text{ copies}} \otimes \underbrace{V \otimes \cdots \otimes V}_{s \text{ copies}}.\]

Confusingly, some books on differential geometry define the tensor product of vector spaces in this way, but I think this is becoming rarer.



NOTE: These are tentative notes on different topics for personal use - expect mistakes and misunderstandings.