Vector Spaces
The essential feature of vectors is that we can form linear combinations of vectors, which are again vectors. This idea can be generalized to other kinds of quantities, not just those with a magnitude and a direction.
Definition. A vector space $V$ over the field $\mathbb{F}$ is a set $V$ of vectors, a field $\mathbb{F}$ of scalars, and two binary operations: an addition $a + b$ of vectors $a$ and $b$, and a scalar multiplication $\lambda v$ of a scalar $\lambda$ and a vector $v$, which together satisfy the following properties/axioms:
- $V$ is closed under addition, i.e. for all $a, b \in V$
\[a + b \in V\]
- addition is commutative, i.e. for all $a, b \in V$
\[a + b = b + a\]
- addition is associative, i.e. for all $a, b, c \in V$
\[a + (b + c) = (a + b) + c\]
- addition has an identity element, i.e. there exists an element $0 \in V$ such that for all $a \in V$
\[a + 0 = a\]
- for each $a \in V$, there exists an additive inverse $a' \in V$ such that
\[a + a' = 0\]
- $V$ is closed under scalar multiplication, i.e. for all $\lambda \in \mathbb{F}$ and $a \in V$
\[\lambda a \in V\]
- scalar multiplication is distributive over scalar addition, i.e. for all $\lambda, \mu \in \mathbb{F}$ and $a \in V$
\[(\lambda + \mu)a = \lambda a + \mu a\]
- scalar multiplication is distributive over vector addition, i.e. for all $\lambda \in \mathbb{F}$ and $a, b \in V$
\[\lambda (a + b) = \lambda a + \lambda b\]
- scalar multiplication is "associative", i.e. for all $\lambda, \mu \in \mathbb{F}$ and $a \in V$
\[\lambda (\mu a) = (\lambda \mu) a\]
- scalar multiplication has an identity element, i.e. the element $1 \in \mathbb{F}$ satisfies, for all $a \in V$,
\[1 a = a\]
Based on our study of vectors in $\mathbb{R}^2$/$\mathbb{R}^3$, we can see that they form a vector space over $\mathbb{R}$.
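As a quick sanity check, the axioms can be verified numerically on concrete vectors. Below is a minimal sketch, assuming $\mathbb{R}^2$ with componentwise operations; the function names `add` and `scale` and the sample values are illustrative, not from the references.

```python
def add(a, b):
    """Vector addition in R^2, componentwise."""
    return (a[0] + b[0], a[1] + b[1])

def scale(lam, a):
    """Scalar multiplication in R^2, componentwise."""
    return (lam * a[0], lam * a[1])

a, b, c = (1.0, 2.0), (-3.0, 0.5), (4.0, -1.0)
lam, mu = 2.0, -0.5
zero = (0.0, 0.0)

# Spot-check several of the axioms on these sample vectors.
assert add(a, b) == add(b, a)                                      # commutativity
assert add(a, add(b, c)) == add(add(a, b), c)                      # associativity
assert add(a, zero) == a                                           # additive identity
assert add(a, scale(-1.0, a)) == zero                              # additive inverse
assert scale(lam + mu, a) == add(scale(lam, a), scale(mu, a))      # distributivity over scalar addition
assert scale(lam, add(a, b)) == add(scale(lam, a), scale(lam, b))  # distributivity over vector addition
assert scale(lam, scale(mu, a)) == scale(lam * mu, a)              # "associativity"
assert scale(1.0, a) == a                                          # identity for scalar multiplication
```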
Properties
We generalize properties of vectors in $\mathbb{R}^2$/$\mathbb{R}^3$ and prove them algebraically.
Prop. A vector space is an abelian group under vector addition, hence
- the additive identity $0 \in V$ is unique
- the additive inverse $-a \in V$ of each $a \in V$ is unique
Proof.
This follows from the vector space axioms: the addition axioms are exactly the abelian group axioms for $(V, +)$. Uniqueness is then standard: if $0$ and $0'$ are both identities, then $0 = 0 + 0' = 0'$; if $a'$ and $a''$ are both inverses of $a$, then $a' = a' + (a + a'') = (a' + a) + a'' = a''$.
Definition. Vector subtraction is defined by addition of additive inverse, i.e.
\[a - b = a + (-b)\]
Prop. Scalar multiplication by $0 \in \mathbb{F}$ yields the additive identity $0_V \in V$.
Proof.
Let $a \in V$; we have
\[\begin{align*} 0_{\mathbb{F}} a &= 0_{\mathbb{F}} a + 0_{V} \\ &= 0_{\mathbb{F}} a + a + (-a) \\ &= (0_{\mathbb{F}} + 1) a + (-a) \\ &= a + (-a) \\ &= 0_{V} \end{align*}\]
Prop. Scalar multiplication by $-1 \in \mathbb{F}$ yields the additive inverse $-a \in V$.
Proof.
Let $a \in V$; we have
\[\begin{align*} (-1)a &= (-1)a + 0_{V} \\ &= (-1)a + a + (-a) \\ &= (-1 + 1) a + (-a) \\ &= 0_{\mathbb{F}}a + (-a) \\ &= 0_{V} + (-a) \\ &= -a \end{align*}\]
Prop. Scalar multiplication of the zero vector $0_V$ yields $0_V$, i.e. $\lambda 0_V = 0_V$.
Proof.
Let $a \in V$; we have
\[\lambda 0_V + \lambda a = \lambda (0_V + a) = \lambda a\]Hence, adding the inverse $-(\lambda a)$ to both sides gives $\lambda 0_V = 0_V$.
Prop. If $\lambda a = 0_V$, either $\lambda = 0_{\mathbb{F}}$ or $a = 0_V$.
Proof.
If $\lambda \not = 0_{\mathbb{F}}$, there exists $\lambda^{-1}$ such that
\[a = (\lambda^{-1}\lambda)a = \lambda^{-1}(\lambda a) = \lambda^{-1}0_V = 0_V\]Hence, if $\lambda \neq 0_{\mathbb{F}}$ then $a = 0_V$; equivalently, either $\lambda = 0_{\mathbb{F}}$ or $a = 0_V$.
Prop. Negation commutes freely, i.e.
\[(-\lambda)a = \lambda(-a) = -(\lambda a)\]
Proof.
Let $a \in V$ and $\lambda \in \mathbb{F}$; we have
\[\begin{align*} (-\lambda)a &= (\lambda(-1))a \\ &= \lambda((-1)a) \\ &= \lambda(-a) \end{align*}\]and
\[\begin{align*} (-\lambda)a &= ((-1)\lambda)a \\ &= (-1)(\lambda a) \\ &= -(\lambda a) \end{align*}\]
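The propositions above can be spot-checked in the same way. A minimal numerical sketch, again assuming componentwise operations on $\mathbb{R}^2$ (the names `add` and `scale` are illustrative):

```python
def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def scale(lam, a):
    return (lam * a[0], lam * a[1])

a = (3.0, -2.0)
zero = (0.0, 0.0)
lam = 2.5

assert scale(0.0, a) == zero                          # 0_F a = 0_V
assert add(a, scale(-1.0, a)) == zero                 # (-1) a is the additive inverse of a
assert scale(lam, zero) == zero                       # lam 0_V = 0_V
assert scale(-lam, a) == scale(lam, scale(-1.0, a))   # (-lam) a = lam (-a)
```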
Scalar Product
Definition. For a vector space $V$ over $\mathbb{R}$, the scalar/inner product of vectors $a, b \in V$, denoted by $a \cdot b$ or $\langle a \mid b \rangle$, is a map $V \times V \to \mathbb{R}$ that satisfies the following properties:
- Symmetry, i.e.
\[a \cdot b = b \cdot a\]
- Linearity in the second argument, i.e.
\[a \cdot (\lambda b + \mu c) = \lambda (a \cdot b) + \mu (a \cdot c)\]
- Non-negativity, with equality iff $a = 0_V$, i.e.
\[a \cdot a \ge 0_\mathbb{R}\]
- Non-degeneracy: the only vector of zero norm is the zero vector, i.e.
\[\vert a \vert = 0 \implies a = 0_V\]
This definition is only for real vector spaces. For complex vector spaces, we have a different set of axioms.
Definition. The norm of a vector, denoted by $\vert a \vert$ or $\Vert a \Vert$, is defined by
\[\vert a \vert \equiv \Vert a \Vert = \sqrt{a \cdot a}\]
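As a concrete instance, the standard dot product on $\mathbb{R}^3$ satisfies these axioms, with the norm defined from it as above. A minimal sketch (the helper names `dot` and `norm` are assumptions for illustration):

```python
import math

def dot(a, b):
    """Standard dot product on R^3."""
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    """Norm induced by the dot product: |a| = sqrt(a . a)."""
    return math.sqrt(dot(a, a))

a, b, c = (1.0, 2.0, 2.0), (0.0, -1.0, 3.0), (2.0, 0.0, 1.0)
lam, mu = 2.0, -1.0

assert dot(a, b) == dot(b, a)                              # symmetry
combo = tuple(lam * x + mu * y for x, y in zip(b, c))      # lam b + mu c
assert dot(a, combo) == lam * dot(a, b) + mu * dot(a, c)   # linearity in the second argument
assert dot(a, a) >= 0.0                                    # non-negativity
print(norm(a))                                             # sqrt(1 + 4 + 4) = 3.0
```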
Cauchy-Schwarz Inequality
Theorem. [Cauchy-Schwarz Inequality] For all $a, b \in V$,
\[\vert a \cdot b \vert \le \vert a \vert \vert b \vert\]with equality only when $a = 0_V$, $b = 0_V$, or $a = \lambda b$ for some $\lambda \in \mathbb{R}$.
Proof.
Consider the expression $\vert a + \lambda b \vert^2$ for $\lambda \in \mathbb{R}$; from the above axioms for the scalar product, we have
\[\begin{align*} \vert a + \lambda b \vert^2 &\ge 0 \\ (a + \lambda b) \cdot (a + \lambda b) &\ge 0 \\ \vert a \vert^2 + 2 (a \cdot b) \lambda + \vert b \vert^2 \lambda^2 &\ge 0 \end{align*}\]If $b = 0_V$, then $\vert b \vert = 0$ and $a \cdot b = 0$ by linearity, so $\vert a \cdot b \vert = \vert a \vert \vert b \vert = 0$.
If $b \not = 0_V$, by viewing the above as a quadratic equation in $\lambda$, as it is always non-negative, it has at most one root. Hence,
\[\Delta = (2 a \cdot b)^2 - 4 \vert a \vert^2 \vert b \vert^2 \le 0\]which rearranges to
\[\vert a \cdot b \vert \le \vert a \vert \vert b \vert\]Equality holds only when the quadratic has a (repeated) root $\lambda_0$, i.e. $\vert a + \lambda_0 b \vert = 0$, so $a = -\lambda_0 b$.
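A quick numerical check of the inequality on random vectors in $\mathbb{R}^3$ (illustrative only; the argument above is the actual proof):

```python
import math
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

random.seed(0)
for _ in range(1000):
    a = [random.uniform(-1, 1) for _ in range(3)]
    b = [random.uniform(-1, 1) for _ in range(3)]
    # Small tolerance to absorb floating-point rounding.
    assert abs(dot(a, b)) <= norm(a) * norm(b) + 1e-12

# Equality (up to rounding) when a is a scalar multiple of b.
b = [1.0, -2.0, 0.5]
a = [3.0 * x for x in b]
assert math.isclose(abs(dot(a, b)), norm(a) * norm(b))
```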
Subspaces
Definition. For a vector space $V$, a subspace of $V$ is a non-empty subset $U \subseteq V$ that is itself a vector space under the same operations (i.e. vector addition and scalar multiplication) as are used to define $V$.
Theorem. A subset $U$ of a vector space $V$ is a subspace of $V$ iff, under the operations defined on $V$,
- $\forall a, b \in U, a + b \in U$
- $\forall a \in U, \lambda \in \mathbb{F}, \lambda a \in U$
or, in short,
\[\forall a, b \in U, \forall \lambda, \mu \in \mathbb{F}, \lambda a + \mu b \in U\]
Proof.
($\Rightarrow$) If $U$ is a subspace then it is a vector space, by definition the above conditions hold.
($\Leftarrow$) If the above conditions hold, most of the vector space axioms hold automatically, since elements of $U$ are also elements of $V$. The only non-trivial axioms are the existence of the identity and inverses in $U$; since $U$ is non-empty, we may pick some $a \in U$, and then:
- As $0_{\mathbb{F}} a \in U$ and $0_{\mathbb{F}} a = 0_V$, we have $0_V \in U$
- As $(-1)a \in U$ and $(-1)a = -a$, we have $-a \in U$ for every $a \in U$
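As an illustration of the criterion, here is a minimal sketch checking closure of the candidate subspace $U = \{(x, y, z) \in \mathbb{R}^3 : x + y + z = 0\}$ under linear combinations; the predicate name `in_U` and the tolerance are assumptions for this example.

```python
import random

def in_U(v, tol=1e-12):
    """Membership test for the candidate subspace x + y + z = 0."""
    return abs(sum(v)) < tol

random.seed(1)
for _ in range(1000):
    # Pick a, b in U by choosing x, y freely and setting z = -(x + y).
    a = [random.uniform(-1, 1), random.uniform(-1, 1)]
    a.append(-sum(a))
    b = [random.uniform(-1, 1), random.uniform(-1, 1)]
    b.append(-sum(b))
    lam, mu = random.uniform(-5, 5), random.uniform(-5, 5)
    combo = [lam * x + mu * y for x, y in zip(a, b)]
    assert in_U(a) and in_U(b) and in_U(combo)   # lam a + mu b stays in U
```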
Definition. A proper subspace is a subspace of $V$ that is not $V$ or $\{0_V\}$.
References
- Alan F. Beardon, Algebra and Geometry (2005), Chapter 7
- Stephen J. Cowley, Algebra and Geometry Lecture Notes (2006), Chapter 2