Motivation to complexify real vector spaces:
One reason is to solve equations. If we want to prove theorems about real solutions to a system of real linear equations or a system of real linear differential equations, it can be convenient to examine the complex solution space, which gives us information about the real solution space.
Difference between direct sum and Cartesian product of two vector spaces
For the direct sum, the result is a new object of the same type: it carries an algebraic structure. The Cartesian product, by itself, is just a set. For example, $\mathbb{R}\oplus \mathbb{R}$ is the set $\mathbb{R}\times \mathbb{R}$ equipped with the operation $(a,b)+(c,d)=(a+c,b+d)$.
Complexifying with direct sums
Definition: The
complexification of a real vector space $W$ is defined to be the complex vector space $W_\mathbb{C}=W\oplus W$, the set of all ordered pairs $(w_1,w_2)$ with $w_1,w_2 \in W$, obtained by formally
extending scalar multiplication to include multiplication by complex numbers. Vector addition and multiplication by complex scalars are defined as:
$(w_1,v_1)+(w_2,v_2)=(w_1+w_2,v_1+v_2)$ for $w_1,v_1,w_2,v_2 \in W$
$(a+bi)(w_1,w_2)=(aw_1-bw_2,bw_1+aw_2)$ for $a,b \in \mathbb{R}, w_1,w_2 \in W$
This rule of multiplication is reasonable if you think of a pair $(w_1,w_2)\in W\oplus W$ as the formal sum $w_1+iw_2$: $(a+bi)(w_1+iw_2)=aw_1+aiw_2+biw_1-bw_2=(aw_1-bw_2)+i(bw_1+aw_2)$.
In particular,
$i(w_1,w_2)=(-w_2,w_1)$.
[Writing $(a+bi)(w_1,w_2)=a(w_1,w_2)+bi(w_1,w_2)$, for this expression to equal $(aw_1-bw_2,bw_1+aw_2)$, we must have $i(w_1,w_2)=(-w_2,w_1)$.]
Using this multiplication rule, we can check that $W_\mathbb{C}$ is a complex vector space. Since $i(w,0)=(0,w)$, we have $(w_1,w_2)=(w_1,0)+(0,w_2)=(w_1,0)+i(w_2,0)$. The $\mathbb{R}$-linear function
$w \mapsto (w,0)$ is called the
standard embedding of $W$ into $W_\mathbb{C}$. With this, we can regard $W_\mathbb{C}$ as $W+iW$.
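To make the pair model concrete, here is a minimal Python/numpy sketch of $W\oplus W$ with the scalar multiplication rule above, taking $W=\mathbb{R}^n$ (the helper names `cplx_add` and `cplx_scalar_mul` are only for illustration):

```python
import numpy as np

# Model an element of W_C = W ⊕ W as a pair (w1, w2) of vectors in W = R^n,
# thought of as the formal sum w1 + i*w2.

def cplx_add(p, q):
    """(w1, v1) + (w2, v2) = (w1 + w2, v1 + v2)."""
    return (p[0] + q[0], p[1] + q[1])

def cplx_scalar_mul(z, p):
    """(a + bi)(w1, w2) = (a*w1 - b*w2, b*w1 + a*w2)."""
    a, b = z.real, z.imag
    w1, w2 = p
    return (a * w1 - b * w2, b * w1 + a * w2)

w = (np.array([1.0, 2.0]), np.array([3.0, -1.0]))
print(cplx_scalar_mul(1j, w))  # i*(w1, w2) = (-w2, w1)
```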
Examples:
The complexifications of $\mathbb{R}^n,M_n(\mathbb{R}),\mathbb{R}[X]$ are isomorphic to $\mathbb{C}^n,M_n(\mathbb{C}),\mathbb{C}[X]$ by sending an ordered pair $(w_1,w_2)$ in $\mathbb{R}^n \oplus \mathbb{R}^n,M_n(\mathbb{R})\oplus M_n(\mathbb{R}),\mathbb{R}[X]\oplus \mathbb{R}[X]$ to $w_1+iw_2$ in $\mathbb{C}^n,M_n(\mathbb{C}),\mathbb{C}[X]$.
[$\mathbb{R}[X]$ is the set of all polynomials with real coefficients in the variable $X$.]
For example, we can identify the complexification $(\mathbb{R}^2)_\mathbb{C}$ with $\mathbb{C}^2$ by
$(w_1,w_2)\mapsto w_1+iw_2\in \mathbb{R}^2+i\mathbb{R}^2=\mathbb{C}^2$, and this sends the basis vectors $\tiny\begin{pmatrix} 1\\0 \end{pmatrix}$ and $\tiny\begin{pmatrix} 0\\1 \end{pmatrix}$ of $\mathbb{R}^2 \subset (\mathbb{R}^2)_\mathbb{C}$ to $\tiny\begin{pmatrix} 1\\0 \end{pmatrix}$ and $\tiny\begin{pmatrix} 0\\1 \end{pmatrix}$ in $\mathbb{C}^2$, which are the standard basis vectors of $\mathbb{C}^2$ as a complex vector space.
Similarly, the identifications of $(\mathbb{R}^n)_\mathbb{C},M_n(\mathbb{R})_\mathbb{C},\mathbb{R}[X]_\mathbb{C}$ with $\mathbb{C}^n,M_n(\mathbb{C}),\mathbb{C}[X]$ turn every real basis of $\mathbb{R}^n,M_n(\mathbb{R}),\mathbb{R}[X]$ into a complex basis of $\mathbb{C}^n,M_n(\mathbb{C}),\mathbb{C}[X]$.
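In coordinates, these identifications amount to forming a complex array from two real arrays; a small numpy sketch (the function name `to_Cn` is ours):

```python
import numpy as np

def to_Cn(w1, w2):
    """Identify the pair (w1, w2) in (R^n)_C with w1 + i*w2 in C^n."""
    return w1 + 1j * w2

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
zero = np.zeros(2)
# The standard embedding w -> (w, 0) sends e1, e2 to the standard basis of C^2:
print(to_Cn(e1, zero))  # [1.+0.j 0.+0.j]
print(to_Cn(e2, zero))  # [0.+0.j 1.+0.j]
```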
Denote the space of complex $n\times n$ matrices by $M_n(\mathbb{C})$, the space of anti-Hermitian matrices ($A^*=-A$) by $u(n)$, and the space of Hermitian matrices ($A^*=A$) by $iu(n)$. Then $M_n(\mathbb{C})=u(n)\oplus iu(n)$ as real vector spaces. Thus $u(n)$ is a real form of $M_n(\mathbb{C})$, and $M_n(\mathbb{C})$ is a complexification of $u(n)$.
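Concretely, any $A\in M_n(\mathbb{C})$ splits uniquely as $A=\tfrac{1}{2}(A-A^*)+\tfrac{1}{2}(A+A^*)$, an anti-Hermitian part plus a Hermitian part; a quick numpy check of this decomposition (illustrative sketch):

```python
import numpy as np

A = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 0j]])

anti_herm = (A - A.conj().T) / 2   # lies in u(n):  X* = -X
herm      = (A + A.conj().T) / 2   # lies in iu(n): X* =  X

print(np.allclose(A, anti_herm + herm))             # True
print(np.allclose(anti_herm.conj().T, -anti_herm))  # True
print(np.allclose(herm.conj().T, herm))             # True
```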
Definitions:
A map $T$ between complex vector spaces is said to be
$\mathbb{C}$-linear if it is additive and $T(\lambda w)=\lambda T(w)$ for all $w$ in the domain of $T$ and for all $\lambda \in \mathbb{C}$.
A map $T$ between complex vector spaces is said to be
$\mathbb{C}$-antilinear or conjugate linear if it is additive and $T(\lambda w)=\overline{\lambda}T(w)$ for all $w$ in the domain of $T$ and for all $\lambda \in \mathbb{C}$.
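For example, entrywise complex conjugation on $\mathbb{C}^n$ is additive and conjugate linear but not $\mathbb{C}$-linear; a quick numerical check (sketch):

```python
import numpy as np

T = np.conj                    # T(w) = entrywise conjugate of w
lam = 2 - 3j
w = np.array([1 + 1j, 4 - 2j])

print(np.allclose(T(lam * w), np.conj(lam) * T(w)))  # True:  T(λw) = conj(λ)·T(w)
print(np.allclose(T(lam * w), lam * T(w)))           # False: T is not C-linear
```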
Theorem: If $W=0$, then $W_\mathbb{C}=0$. If $W\neq 0$ and $\{e_1,\cdots,e_n\}$ is an $\mathbb{R}$-basis of $W$, then $\{(e_1,0),\cdots,(e_n,0)\}$ is a $\mathbb{C}$-basis of $W_\mathbb{C}$. In particular,
$\text{dim}_\mathbb{C}(W_\mathbb{C})=\text{dim}_\mathbb{R}(W)$ for all $W$.
Proof: We want to prove $\{(e_1,0),\cdots,(e_n,0)\}$ is a basis for $W_\mathbb{C}$: it is a spanning set and it is linearly independent.
Suppose $W\neq 0$ and $\{e_1,\cdots,e_n\}$ is an $\mathbb{R}$-basis of $W$.
Spanning:
If $\{e_1,\cdots,e_n\}$ is an $\mathbb{R}$-basis of $W$, then $\{(e_1,0),\cdots,(e_n,0),(0,e_1),\cdots,(0,e_n)\}$ is a basis for the real vector space $W\oplus W$.
Let $w\in W_\mathbb{C}$. Using the basis for $W\oplus W$, there exist $a_1,\cdots,a_n,b_1,\cdots,b_n \in \mathbb{R}$ such that
$\begin{align}w&=a_1(e_1,0)+\cdots+a_n(e_n,0)+b_1(0,e_1)+\cdots+b_n(0,e_n) & \qquad [*]\\
&=a_1(e_1,0)+\cdots+a_n(e_n,0)+ib_1(e_1,0)+\cdots+ib_n(e_n,0) &[ib(e_j,0)=b(0,e_j)]\\
&=(a_1+ib_1)(e_1,0)+\cdots+(a_n+ib_n)(e_n,0) & \qquad [**]\end{align}$
$[*]: \mathbb{R}$-linear combination of $(e_j,0)$ and $(0,e_j)$
$[**]: \mathbb{C}$-linear combination of $(e_j,0)$
Therefore, $\{(e_1,0),\cdots,(e_n,0)\}$ is a $\mathbb{C}$-linear spanning set of $W_\mathbb{C}$.
Linear independence:
Suppose we can write $(0,0)$ as a finite $\mathbb{C}$-linear combination of $(e_j,0)$.
$(a_1+ib_1)(e_1,0)+\cdots+(a_n+ib_n)(e_n,0)=(0,0)$ for some real $a_j,b_j$, $j=1,\cdots,n$.
Equivalently, $(a_1e_1+\cdots+a_ne_n,b_1e_1+\cdots+b_ne_n)=(0,0).\qquad [bi(e_j,0)=(0,be_j)]$
Therefore, $\sum\limits_{j=1}^n a_je_j=0$ and $\sum\limits_{j=1}^n b_je_j=0$ in $W$.
Since $e_j$'s are linearly independent over $\mathbb{R}$, all the coefficients $a_j$ and $b_j$ are 0, so $a_j+ib_j=0$ for all $j=1,\cdots,n\:\Box$.
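As a numerical sanity check of the theorem (illustrative sketch): any $\mathbb{R}$-basis of $\mathbb{R}^2$ is also a $\mathbb{C}$-basis of $(\mathbb{R}^2)_\mathbb{C}=\mathbb{C}^2$, so every complex vector has unique complex coordinates with respect to it.

```python
import numpy as np

# The columns of E form an R-basis of R^2; by the theorem they are also a C-basis of C^2.
E = np.array([[1.0, 1.0],
              [0.0, 2.0]], dtype=complex)

w = np.array([3 + 1j, -2 + 4j])    # an arbitrary vector in C^2 = (R^2)_C
coeffs = np.linalg.solve(E, w)     # unique complex coordinates w.r.t. the real basis
print(coeffs)
print(np.allclose(E @ coeffs, w))  # True
```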
Universal property of complexification
Theorem: For any $\mathbb{R}$-linear map $W\xrightarrow{f}V$ from $W$ into a complex vector space $V$, there is a unique $\mathbb{C}$-linear map $W_\mathbb{C}\xrightarrow{\tilde{f}}V$ making the diagram
$\;\; W \xrightarrow{\quad \quad} W_\mathbb{C}\\ \quad f \searrow \quad \swarrow \tilde{f}\\ \qquad\quad V$
commute, where the map $W\to W_\mathbb{C}$ is the standard embedding.
In categorical language, this theorem says the embedding $W\to W_\mathbb{C}$ is an initial object among $\mathbb{R}$-linear maps from $W$ to complex vector spaces: it admits a unique morphism to every other such object.
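Concretely, $\tilde{f}$ is forced to be $\tilde{f}(w_1+iw_2)=f(w_1)+if(w_2)$; a small numpy sketch with $W=\mathbb{R}^2$ and $V=\mathbb{C}$ (the names `f` and `f_tilde` are ours):

```python
import numpy as np

c = np.array([2 + 1j, -1 + 3j])

def f(w):
    """An R-linear map f : R^2 -> V = C (pairing against a fixed complex vector)."""
    return c @ w

def f_tilde(z):
    """The C-linear extension to (R^2)_C ≅ C^2: f~(w1 + i*w2) = f(w1) + i*f(w2)."""
    return f(z.real) + 1j * f(z.imag)

z = np.array([1 - 2j, 3 + 1j])
w0 = np.array([1.0, 2.0])
print(np.isclose(f_tilde(1j * z), 1j * f_tilde(z)))  # True: f~ is C-linear
print(np.isclose(f_tilde(w0 + 0j), f(w0)))           # True: f~ extends f through the embedding
```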
Complexification of linear transformation
Theorem: Every $\mathbb{R}$-linear transformation $\varphi:W\to W'$ of real vector spaces extends in a unique way to a $\mathbb{C}$-linear transformation of the complexifications: there is a unique $\mathbb{C}$-linear map $\varphi_\mathbb{C}:W_\mathbb{C}\to W'_\mathbb{C}$ making the diagram
$\qquad \qquad \;W\xrightarrow{\quad\;\varphi\;\quad}W'\\ w\mapsto (w,0) \Bigg\downarrow \qquad\qquad \Bigg\downarrow w'\mapsto (w',0) \\ \qquad \qquad W_\mathbb{C} \xrightarrow{\quad \varphi_\mathbb{C} \quad} W'_\mathbb{C}$
commute, where the vertical maps are the standard embeddings of real vector spaces into their complexifications.
Proof: If such a $\mathbb{C}$-linear map $\varphi_\mathbb{C}$ exists, then the commutativity of the diagram says $\varphi_\mathbb{C}(w,0)=(\varphi(w),0)$ for all $w\in W$. Therefore, for $(w_1,w_2)\in W_\mathbb{C}$, we have
$\begin{align} \varphi_\mathbb{C}(w_1,w_2)&=\varphi_\mathbb{C}(w_1,0)+\varphi_\mathbb{C}(0,w_2)\\
&=\varphi_\mathbb{C}(w_1,0)+i\varphi_\mathbb{C}(w_2,0)\\
&=(\varphi(w_1),0)+i(\varphi(w_2),0)\\
&=(\varphi(w_1),0)+(0,\varphi(w_2))\\
&=(\varphi(w_1),\varphi(w_2)) \end{align}$
This tells us what $\varphi_\mathbb{C}$ must be. So define $\varphi_\mathbb{C}:W_\mathbb{C}\to W'_\mathbb{C}$ by $\varphi_\mathbb{C}(w_1,w_2)=(\varphi(w_1),\varphi(w_2))$. We need to check that $\varphi_\mathbb{C}$ is $\mathbb{C}$-linear. It is $\mathbb{R}$-linear because $\varphi$ is, and $\varphi_\mathbb{C}(i(w_1,w_2))=\varphi_\mathbb{C}(-w_2,w_1)=(-\varphi(w_2),\varphi(w_1))=i(\varphi(w_1),\varphi(w_2))=i\varphi_\mathbb{C}(w_1,w_2)$, so $\varphi_\mathbb{C}$ is $\mathbb{C}$-linear. Under the standard embeddings this reads $\varphi_\mathbb{C}(w_1+iw_2)=\varphi(w_1)+i\varphi(w_2).\:\Box$
In other words, to an operator $\varphi:W\to W'$, there corresponds an operator $\varphi_\mathbb{C}:W_\mathbb{C}\to W'_\mathbb{C}$ given by the formula $\varphi_\mathbb{C}(a+ib)=\varphi(a)+i\varphi(b)$. The operator $\varphi_\mathbb{C}$ is called the complexification of $\varphi$.
If $\{w_1,\cdots,w_n\}$ is a basis for $W$, $\{w'_1,\cdots,w'_m\}$ is a basis for $W'$, and $A$ is the matrix representation of $\varphi$ with respect to these bases, then $A$, regarded as a complex matrix, is also the matrix representation of $\varphi_\mathbb{C}$ with respect to the corresponding bases in $W_\mathbb{C}$ and $W'_\mathbb{C}$.
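A quick numerical illustration of this (sketch with an arbitrary real matrix): applying the real matrix $A$ to a complex vector is the same as applying $\varphi$ to the real and imaginary parts separately.

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [1.0,  3.0]])   # matrix of φ with respect to real bases

w1 = np.array([1.0, 4.0])
w2 = np.array([-2.0, 0.5])

# φ_C(w1 + i*w2) = φ(w1) + i*φ(w2) is just A applied to the complex vector w1 + i*w2:
print(np.allclose(A @ (w1 + 1j * w2), A @ w1 + 1j * (A @ w2)))  # True
```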
So the complexification process is a formal,
coordinate-free way of saying: take the matrix $A$ of $\varphi$, with its real entries, but operate on it as a complex matrix. The advantage of making this abstract definition is that we are not required to fix a choice of coordinates and use matrix representations. Say we want to make arguments about the complex eigenvalues and eigenvectors of a transformation $\varphi:W\to W$, even though non-real eigenvalues and eigenvectors, by definition, cannot exist for a transformation of a real vector space. What we really mean are the eigenvalues and eigenvectors of $\varphi_\mathbb{C}$. Also, the complexification process generalizes without change to infinite-dimensional spaces.
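For example, a rotation of $\mathbb{R}^2$ has no real eigenvalues, but its complexification does; `np.linalg.eig`, which works over $\mathbb{C}$, returns exactly those (sketch):

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation of R^2: no real eigenvalues

# np.linalg.eig works over C, i.e. it sees the complexification R_C:
eigvals, eigvecs = np.linalg.eig(R)
print(eigvals)   # approximately cos θ ± i·sin θ = exp(±iθ)
```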
Complexifying with tensor products [pending]
Conjugations
In the complexification $W_\mathbb{C}$ of a real vector space $W$, we can define complex conjugation by $\overline{w}=\overline{u+iv}=u-iv$ for $w=u+iv$ with $u,v\in W$.
In an arbitrary complex vector space, however, there is no natural, basis-independent way of defining complex conjugation of vectors. For example, if we set the complex conjugate of a vector $u=(a+bi)e_j$ to be $\overline{u}=(a-bi)e_j$, this definition will give a different answer in the basis $\{ie_j\}$ since $u=[-i(a+bi)](ie_j) \Rightarrow \overline{u}=[i(a-bi)](ie_j)=-(a-bi)e_j$.
Thus the concept of complex conjugation of vectors requires prior knowledge of the 'real part' of a complex vector space. The complexification of a real space has precisely the required extra structure needed to define complex conjugation of vectors.
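This basis dependence can be checked numerically (sketch): conjugating the coefficient with respect to $\{e_j\}$ and with respect to $\{ie_j\}$ gives different vectors.

```python
import numpy as np

e = np.array([1.0 + 0j, 2.0 + 0j])   # a fixed basis vector (real entries)
u = (3 + 4j) * e                     # u = (a + bi)e with a = 3, b = 4

conj_in_e  = (3 - 4j) * e            # conjugate the coefficient in the basis {e}
coeff      = -1j * (3 + 4j)          # u = [-i(a + bi)](ie): coefficient in the basis {ie}
conj_in_ie = np.conj(coeff) * (1j * e)

print(np.allclose(conj_in_e, conj_in_ie))  # False: the two "conjugates" disagree
print(conj_in_ie)                          # equals -(3 - 4j)e, as computed above
```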
Later, we will use conjugation to describe the real subspaces which have the given complex vector space as their complexification.
Realification/decomplexification
One way of creating a real vector space $W_\mathbb{R}$ from a complex vector space $W$ is to forget altogether about the possibility of multiplying vectors by complex numbers and only allow scalar multiplication with real numbers. In abstract algebra, we call this the
restriction of scalars from complex numbers to real numbers. In this process a pair of vectors $u$ and $iu$ must be regarded as linearly independent vectors in $W_\mathbb{R}$ for any non-zero vector $u\in W$. Thus if $W$ is finite-dimensional and $\text{dim}_\mathbb{C}W=n$, then $W_\mathbb{R}$ is $2n$-dimensional, namely $\text{dim}_\mathbb{R} W_\mathbb{R}=2\text{dim}_\mathbb{C}W$, for if $\{e_1,\cdots,e_n\}$ is a basis of $W$ then $\{e_1,\cdots,e_n,ie_1,\cdots,ie_n\}$ is a linearly independent set of vectors spanning $W_\mathbb{R}$.
We can verify that if $A=B+iC$ is the matrix of a $\mathbb{C}$-linear map $A:V\to W$ with respect to bases $\{e_1,\cdots,e_n\}$ of $V$ and $\{\epsilon_1,\cdots,\epsilon_m\}$ of $W$, where the matrices $B$ and $C$ are real, then the matrix of the realified map $A_\mathbb{R}$ with respect to the bases $\{e_1,\cdots,e_n,ie_1,\cdots,ie_n\}$ and $\{\epsilon_1,\cdots,\epsilon_m,i\epsilon_1,\cdots,i\epsilon_m\}$ is of the form $\begin{pmatrix} B & -C \\ C & B \end{pmatrix}$.
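A quick numpy check of this block form (sketch): compare the complex computation $A(x+iy)$ with the real block matrix acting on the stacked coordinates $(x,y)$.

```python
import numpy as np

B = np.array([[1.0, 2.0], [0.0, 1.0]])
C = np.array([[0.0, -1.0], [3.0, 0.5]])
A = B + 1j * C                    # a C-linear map C^2 -> C^2

A_R = np.block([[B, -C],
                [C,  B]])         # matrix of the realified map on R^4

x = np.array([1.0, -2.0])
y = np.array([0.5, 3.0])

z = A @ (x + 1j * y)              # complex computation
r = A_R @ np.concatenate([x, y])  # real computation on stacked (x, y)
print(np.allclose(r, np.concatenate([z.real, z.imag])))  # True
```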
Example:
Say $n=2$: if $\{(a+ib,0),(0,c+id)\}$ is a basis for the complex vector space $W=\mathbb{C}^2$, then $\{(a,b,0,0),(-b,a,0,0),(0,0,c,d),(0,0,-d,c)\}$ is the associated basis $\{e_1,ie_1,e_2,ie_2\}$ for $W_\mathbb{R}=\mathbb{R}^{2\cdot 2}=\mathbb{R}^4$.
More to explore:
Complexification of Lie algebras – a tool useful for representation theory
Complex spectral theorem – deducing a real classification for orthogonal matrices
Splitting characteristic polynomials
Extension of scalars & restriction of scalars