# 4.1. Vector Spaces

## 4.1.1. Algebraic Structures
In mathematics, the term algebraic structure refers to an arbitrary set with one or more operations defined on it.
Simpler algebraic structures include groups, rings, and fields.
More complex algebraic structures like vector spaces are built on top of the simpler structures.
We will develop the notion of vector spaces as a progression of these algebraic structures.
### 4.1.1.1. Groups
A group is a set with a single binary operation. It is one of the simplest algebraic structures.
Definition 4.1 (Group)

Let $G$ be a set and let $* : G \times G \to G$ be a binary operation defined on $G$. If the binary operation $*$ satisfies the following properties:

1. [Closure] The set $G$ is closed under the binary operation $*$; i.e., $a * b \in G$ for every $a, b \in G$.
2. [Associativity] For every $a, b, c \in G$, $(a * b) * c = a * (b * c)$.
3. [Identity element] There exists an element $e \in G$ such that $a * e = e * a = a$ for every $a \in G$.
4. [Inverse element] For every $a \in G$, there exists an element $b \in G$ such that $a * b = b * a = e$.

then the set $G$ together with the operation $*$, denoted $(G, *)$, is known as a group.
The above properties are known as the group axioms. Note that commutativity is not a requirement of a group.
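Since each group axiom is finitely checkable on a finite set, the axioms can be verified by brute force. The following Python sketch is our own illustration (not part of the original text); it checks all four axioms for the integers $\{0, 1, \ldots, n-1\}$ under addition modulo $n$.

```python
# A minimal sketch (our illustration): brute-force check of the group
# axioms for Z_n, the integers {0, 1, ..., n-1} under addition mod n.
from itertools import product

def is_group(elements, op):
    """Check closure, associativity, identity, and inverses exhaustively."""
    elements = list(elements)
    # Closure: a * b must stay inside the set.
    if any(op(a, b) not in elements for a, b in product(elements, repeat=2)):
        return False
    # Associativity: (a * b) * c == a * (b * c).
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a, b, c in product(elements, repeat=3)):
        return False
    # Identity: some e with e * a == a * e == a for all a.
    identities = [e for e in elements
                  if all(op(e, a) == a == op(a, e) for a in elements)]
    if not identities:
        return False
    e = identities[0]
    # Inverses: every a has some b with a * b == b * a == e.
    return all(any(op(a, b) == e == op(b, a) for b in elements)
               for a in elements)

n = 6
print(is_group(range(n), lambda a, b: (a + b) % n))  # True
```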
Remark 4.1

Frequently, the group operation is the regular mathematical addition. In those cases, we write $a * b$ as $a + b$. Often, we may simply write a group $(G, *)$ as $G$ when the operation is clear from the context.
### 4.1.1.2. Commutative groups
A commutative group is a richer structure than a group. Its elements also satisfy the commutativity property.
Definition 4.2 (Commutative group)

Let $(G, *)$ be a group which satisfies the additional property:

- [Commutativity] For every $a, b \in G$, $a * b = b * a$.

Then $(G, *)$ is known as a commutative group (or an abelian group).
### 4.1.1.3. Rings
A ring is a set with two binary operations defined over it with some properties as described below.
Definition 4.3 (Associative ring)

Let $R$ be a set with two binary operations $+$ (addition) and $\cdot$ (multiplication) such that:

1. $(R, +)$ is a commutative group.
2. $R$ is closed under multiplication: $a \cdot b \in R$ for every $a, b \in R$.
3. Multiplication is associative: $(a \cdot b) \cdot c = a \cdot (b \cdot c)$.
4. Multiplication distributes over addition: $a \cdot (b + c) = a \cdot b + a \cdot c$ and $(b + c) \cdot a = b \cdot a + c \cdot a$.

Then, $(R, +, \cdot)$ is known as an associative ring.
Remark 4.2

In the sequel we will write $a \cdot b$ simply as $ab$.
There is a hierarchy of ring-like structures. In particular, we mention:

- Associative ring with identity
- Field
Definition 4.4 (Associative ring with identity)

Let $(R, +, \cdot)$ be an associative ring in which there exists an element $1 \in R$ such that $1 \cdot a = a \cdot 1 = a$ for every $a \in R$.

Then $(R, +, \cdot)$ is known as an associative ring with identity.
### 4.1.1.4. Fields

A field is the richest algebraic structure on one set with two operations.
Definition 4.5 (Field)

Let $F$ be a set with two binary operations $+$ (addition) and $\cdot$ (multiplication) such that:

1. $(F, +)$ is a commutative group (with additive identity denoted by $0$).
2. $(F \setminus \{0\}, \cdot)$ is a commutative group (with multiplicative identity denoted by $1$).
3. Multiplication distributes over addition: $a \cdot (b + c) = a \cdot b + a \cdot c$.

Then $(F, +, \cdot)$ is known as a field.
The definition above implies a number of properties. For any $a, b, c \in F$:

- Closure: $a + b \in F$ and $a \cdot b \in F$.
- Associativity of addition and multiplication: $a + (b + c) = (a + b) + c$ and $a \cdot (b \cdot c) = (a \cdot b) \cdot c$.
- Commutativity of addition and multiplication: $a + b = b + a$ and $a \cdot b = b \cdot a$.
- Additive identity: $a + 0 = 0 + a = a$.
- Multiplicative identity: $a \cdot 1 = 1 \cdot a = a$.
- Additive inverses: $a + (-a) = 0$.
- Multiplicative inverses for every $a \neq 0$: $a \cdot a^{-1} = 1$.
- Distributivity: $a \cdot (b + c) = a \cdot b + a \cdot c$.
Example 4.1 (Examples of fields)

- The set of real numbers $\mathbb{R}$ is a field.
- The set of complex numbers $\mathbb{C}$ is a field.
- The Galois field GF-2 is the set $\{0, 1\}$ with modulo-2 addition and multiplication.
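As a quick sanity check, the field axioms of GF-2 can be verified exhaustively. The snippet below is a minimal sketch of ours (not part of the text), using `%` to implement modulo-2 arithmetic.

```python
# A small sketch (our illustration): exhaustive checks of key field
# properties for GF-2 = {0, 1} under modulo-2 addition/multiplication.
from itertools import product

F = [0, 1]

def add(a, b): return (a + b) % 2   # modulo-2 addition (XOR)
def mul(a, b): return (a * b) % 2   # modulo-2 multiplication (AND)

# Distributivity: a*(b + c) == a*b + a*c for all a, b, c in GF-2.
assert all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
           for a, b, c in product(F, repeat=3))
# Additive inverses: each a satisfies a + a == 0 (a is its own inverse).
assert all(add(a, a) == 0 for a in F)
# Multiplicative inverse of the only nonzero element: 1 * 1 == 1.
assert mul(1, 1) == 1
print("GF-2 checks passed")
```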
## 4.1.2. Vector Spaces

We are now ready to define a vector space.

A vector space involves two sets.

One set $V$ contains the vectors, while the other set $F$ is a field whose elements act as scalars.
Definition 4.6 (Vector space)

A set $V$ is called a vector space over the field $F$ (or an $F$-vector space) if there exist two mappings:

- Vector addition: $+ : V \times V \to V$, mapping $(v_1, v_2)$ for any $v_1, v_2 \in V$ to another vector in $V$ denoted by $v_1 + v_2$;
- Scalar multiplication: $\cdot : F \times V \to V$, mapping $(\alpha, v)$ with $\alpha \in F$ and $v \in V$ to another vector in $V$ denoted by $\alpha \cdot v$ or just $\alpha v$;

which satisfy the following requirements:

1. $(V, +)$ is a commutative group.
2. Scalar multiplication $\cdot$ distributes over vector addition $+$: $\alpha (v_1 + v_2) = \alpha v_1 + \alpha v_2$.
3. Addition in $F$ distributes over scalar multiplication: $(\alpha + \beta) v = \alpha v + \beta v$.
4. Multiplication in $F$ commutes over scalar multiplication: $(\alpha \beta) v = \alpha (\beta v) = \beta (\alpha v)$.
5. Scalar multiplication from the multiplicative identity $1 \in F$ satisfies the following: $1 v = v$.
Some remarks are in order:

- $V$ as defined above is also known as an $F$-vector space.
- Elements of $V$ are known as vectors.
- Elements of $F$ are known as scalars.
- There are two $0$s involved: $0 \in F$ and $0 \in V$. It should be clear from the context which one is being referred to. $0 \in V$ is known as the zero vector.
- All vectors in $V \setminus \{0\}$ are non-zero vectors.
- We will typically denote elements of $V$ by $u, v, w, x, y$.
- We will typically denote elements of $F$ by $\alpha, \beta, \gamma$.
We quickly look at some vector spaces which will appear again and again in our discussions.
### 4.1.2.1. N-Tuples

Example 4.2 ($F^n$)

Let $F$ be some field and let $n$ be a natural number. The set of all $n$-tuples $(u_1, u_2, \ldots, u_n)$ with $u_1, \ldots, u_n \in F$ is denoted by $F^n$. It is a vector space over $F$ with the operations defined below.

Let $u, v \in F^n$ with

$$u = (u_1, \ldots, u_n)$$

and

$$v = (v_1, \ldots, v_n).$$

Vector addition is defined as

$$u + v \triangleq (u_1 + v_1, \ldots, u_n + v_n).$$

The zero vector is defined as:

$$0 \triangleq (0, \ldots, 0).$$

Let $c \in F$. Scalar multiplication is defined as

$$c u \triangleq (c u_1, \ldots, c u_n).$$

In matrix notation, vectors in $F^n$ can also be written as row vectors

$$u = \begin{bmatrix} u_1 & \dots & u_n \end{bmatrix}$$

or column vectors:

$$u = \begin{bmatrix} u_1 \\ \vdots \\ u_n \end{bmatrix}.$$
We next verify that $F^n$ satisfies all the requirements of a vector space:

- $F^n$ is closed under vector addition. Since $u_i + v_i \in F$ for each $i$, hence $u + v \in F^n$.
- Vector addition is associative: $(u + v) + w = u + (v + w)$, since addition in $F$ is associative in each coordinate.
- Vector addition is commutative: $u + v = v + u$, since addition in $F$ is commutative in each coordinate.
- Additive identity: $u + 0 = (u_1 + 0, \ldots, u_n + 0) = u$. Since commutativity is already established, hence $0 + u = u + 0 = u$. Thus, $0$ is the additive identity element.
- Additive inverse: Let $v = (-u_1, \ldots, -u_n)$. Then, $u + v = (u_1 - u_1, \ldots, u_n - u_n) = 0$. By commutativity, $v + u = 0$ too. Thus, every vector has an additive inverse. We can write $v = -u$.
- Scalar multiplication distributes over vector addition: $c (u + v) = (c (u_1 + v_1), \ldots, c (u_n + v_n)) = (c u_1 + c v_1, \ldots, c u_n + c v_n) = c u + c v$.
- Addition in $F$ distributes over scalar multiplication: $(b + c) u = ((b + c) u_1, \ldots, (b + c) u_n) = b u + c u$.
- Multiplication commutes over scalar multiplication: $(b c) u = ((b c) u_1, \ldots, (b c) u_n) = b (c u)$. Similarly, $(b c) u = c (b u)$.
- Scalar multiplication from the multiplicative identity of $F$: $1 u = (1 u_1, \ldots, 1 u_n) = u$.

Thus, $F^n$ is a vector space over $F$.
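To make the componentwise definitions concrete, here is a small Python sketch (our own illustration, not part of the original text) that encodes the $F^n$ operations for integer tuples and spot-checks the axioms with exact equality.

```python
# A hedged illustration: the F^n operations written for integer n-tuples,
# so the vector space axioms can be checked with exact equality.
def vec_add(u, v):
    """Componentwise vector addition in F^n."""
    return tuple(ui + vi for ui, vi in zip(u, v))

def scalar_mul(c, u):
    """Componentwise scalar multiplication in F^n."""
    return tuple(c * ui for ui in u)

u, v, w = (1, 2, 3), (4, 5, 6), (7, 8, 9)
zero = (0, 0, 0)
b, c = 2, 5

assert vec_add(vec_add(u, v), w) == vec_add(u, vec_add(v, w))  # associativity
assert vec_add(u, v) == vec_add(v, u)                          # commutativity
assert vec_add(u, zero) == u                                   # additive identity
assert vec_add(u, scalar_mul(-1, u)) == zero                   # additive inverse
assert scalar_mul(c, vec_add(u, v)) == vec_add(scalar_mul(c, u), scalar_mul(c, v))
assert scalar_mul(b + c, u) == vec_add(scalar_mul(b, u), scalar_mul(c, u))
assert scalar_mul(b * c, u) == scalar_mul(b, scalar_mul(c, u))
assert scalar_mul(1, u) == u                                   # identity scalar
print("all F^n axiom checks passed")
```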
Example 4.3 (A field is a vector space)

A field of scalars $F$ is a vector space over $F$ itself (this is the case $n = 1$ of the previous example).

The real line $\mathbb{R}$ is thus a vector space over the field $\mathbb{R}$ in its own right.

More formally, let $V = F$.

- For every $a \in F$, let $v_a$ be the corresponding vector in $V$ with $v_a = a$.
- Define vector addition on $V$ as the corresponding addition in the field $F$: $v_a + v_b \triangleq v_{a + b}$.
- Define scalar multiplication on $V$ as the corresponding multiplication in the field $F$: $c v_a \triangleq v_{c a}$.
- Define the zero vector as $0$: $v_0 = 0$.

We can now verify all the properties of the vector space:

- $V$ is closed under vector addition. Let $v_a, v_b \in V$. Then, $v_a + v_b = v_{a + b}$. Since $a + b \in F$, hence $v_{a + b} \in V$.
- Vector addition is associative. For every $v_a, v_b, v_c \in V$:

  $$(v_a + v_b) + v_c = v_{(a + b) + c} = v_{a + (b + c)} = v_a + (v_b + v_c).$$

- Vector addition is commutative. For every $v_a, v_b \in V$,

  $$v_a + v_b = v_{a + b} = v_{b + a} = v_b + v_a.$$

- Additive identity: $v_a + v_0 = v_{a + 0} = v_a$. Thus, $v_0$ is the additive identity element.
- Let $v_a \in V$. Define $v_b = v_{-a}$. Since $-a \in F$, hence $v_b \in V$. Now, $v_a + v_b = v_{a - a} = v_0$. Thus, every vector has an additive inverse.
- Scalar multiplication distributes over vector addition. Let $c \in F$ and $v_a, v_b \in V$:

  $$c (v_a + v_b) = c v_{a + b} = v_{c (a + b)} = v_{c a + c b} = v_{c a} + v_{c b} = c v_a + c v_b.$$

- Addition in $F$ distributes over scalar multiplication. Let $b, c \in F$ and $v_a \in V$:

  $$(b + c) v_a = v_{(b + c) a} = v_{b a + c a} = v_{b a} + v_{c a} = b v_a + c v_a.$$

- Multiplication commutes over scalar multiplication:

  $$(b c) v_a = v_{(b c) a} = b v_{c a} = b (c v_a).$$

- Scalar multiplication from the multiplicative identity of $F$: $1 v_a = v_{1 a} = v_a$.

Thus, $V = F$ is a vector space over $F$.
### 4.1.2.2. Matrices

Example 4.4 (Matrices)

Let $F$ be a field and let $m, n$ be natural numbers. An $m \times n$ matrix over $F$ is an array of the form

$$A = \begin{bmatrix}
a_{11} & a_{12} & \dots & a_{1 n} \\
a_{21} & a_{22} & \dots & a_{2 n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m 1} & a_{m 2} & \dots & a_{m n}
\end{bmatrix}$$

with $a_{i j} \in F$ for $1 \leq i \leq m$ and $1 \leq j \leq n$.

The set of these matrices is denoted as $F^{m \times n}$, which is a vector space over $F$ with the operations of matrix addition and scalar multiplication defined entrywise.

Let $A, B \in F^{m \times n}$. Matrix addition is defined as $(A + B)_{i j} \triangleq A_{i j} + B_{i j}$.

Let $c \in F$. Scalar multiplication is defined as $(c A)_{i j} \triangleq c A_{i j}$.
### 4.1.2.3. Polynomials

Example 4.5 (Polynomials)

Let $F[x]$ denote the set of all polynomials with coefficients drawn from the field $F$; i.e., expressions of the form

$$p(x) = a_0 + a_1 x + a_2 x^2 + \dots + a_n x^n$$

where $n \in \mathbb{N}$ and $a_0, a_1, \ldots, a_n \in F$.

The set $F[x]$ is a vector space over $F$ with the usual addition of polynomials and multiplication of a polynomial by a scalar.
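Assuming a polynomial is represented by its list of coefficients $[a_0, a_1, \ldots, a_n]$ (an encoding we choose purely for illustration), the $F[x]$ operations can be sketched as follows.

```python
# A sketch under the assumption that a polynomial is stored as its
# coefficient list [a0, a1, ..., an] (lowest degree first); addition and
# scalar multiplication then match the F[x] operations described above.
from itertools import zip_longest

def poly_add(p, q):
    """Add two polynomials given as coefficient lists."""
    return [a + b for a, b in zip_longest(p, q, fillvalue=0)]

def poly_scale(c, p):
    """Multiply a polynomial by the scalar c."""
    return [c * a for a in p]

p = [1, 0, 2]            # 1 + 2x^2
q = [0, 3]               # 3x
print(poly_add(p, q))    # [1, 3, 2]  ->  1 + 3x + 2x^2
print(poly_scale(5, p))  # [5, 0, 10] ->  5 + 10x^2
```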
### 4.1.2.4. Vector Space Identities

Some useful identities are presented here.
Theorem 4.1 (Uniqueness of additive identity)

The zero vector $0$ in a vector space $V$ is unique.

Proof. Assume that there is another vector $0' \in V$ which is also an additive identity.

Then, in particular, it satisfies:

$$0 + 0' = 0 \quad \text{(treating } 0' \text{ as an additive identity)}$$

as well as

$$0 + 0' = 0' \quad \text{(treating } 0 \text{ as an additive identity)}.$$

Thus, $0 = 0 + 0' = 0'$.
Theorem 4.2 (Cancellation law)

Let $x, y, z$ be vectors in a vector space $V$. If $x + z = y + z$, then $x = y$.

Proof. Since $z \in V$, it has an additive inverse $-z$ such that $z + (-z) = 0$.

Now,

$$x = x + 0 = x + (z + (-z)) = (x + z) + (-z) = (y + z) + (-z) = y + (z + (-z)) = y + 0 = y.$$
Corollary 4.1 (Uniqueness of additive inverse)

The additive inverse of a vector $x$ in a vector space $V$ is unique.

Proof. Let $y$ and $z$ be two additive inverses of $x$. Then, $x + y = 0 = x + z$.

By the cancellation law (using commutativity to cancel $x$), $y = z$.

Thus, the additive inverse of a vector is unique.
Theorem 4.3

In a vector space $V$, the following statements are true for every $x \in V$ and $\alpha \in F$:

1. $0 x = 0$.
2. $\alpha 0 = 0$.
3. $(-\alpha) x = -(\alpha x) = \alpha (-x)$.

Proof. (1)

By the distributive law: $0 x + 0 x = (0 + 0) x = 0 x$.

By the existence of the zero vector: $0 x = 0 x + 0$.

Thus, $0 x + 0 x = 0 x + 0$.

Now, by the cancellation law: $0 x = 0$.

(2)

Due to the additive identity and the distributive law: $\alpha 0 = \alpha (0 + 0) = \alpha 0 + \alpha 0$.

Due to the additive identity: $\alpha 0 = \alpha 0 + 0$.

Combining, $\alpha 0 + \alpha 0 = \alpha 0 + 0$.

Now applying the cancellation law, we get: $\alpha 0 = 0$.

(3)

We start with $0 = 0 x = (\alpha + (-\alpha)) x = \alpha x + (-\alpha) x$.

Thus, $(-\alpha) x$ is the additive inverse of $\alpha x$; i.e., $(-\alpha) x = -(\alpha x)$.

Similarly, $0 = \alpha 0 = \alpha (x + (-x)) = \alpha x + \alpha (-x)$.

Thus, $\alpha (-x) = -(\alpha x)$ too.
## 4.1.3. Linear Independence

Definition 4.7 (Linear combination)

A linear combination of two vectors $v_1, v_2 \in V$ is of the form

$$\alpha v_1 + \beta v_2$$

where $\alpha, \beta \in F$.

A linear combination of $p$ vectors $v_1, \ldots, v_p \in V$ is of the form

$$\alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_p v_p$$

where $\alpha_1, \ldots, \alpha_p \in F$.

Note that a linear combination always consists of a finite number of vectors.
Definition 4.8 (Linear combination vector)

Let $S$ be a nonempty subset of $V$. A vector $v \in V$ is called a linear combination of vectors of $S$ if there exist a finite number of vectors $s_1, s_2, \ldots, s_p \in S$ and scalars $\alpha_1, \ldots, \alpha_p \in F$ such that

$$v = \alpha_1 s_1 + \alpha_2 s_2 + \dots + \alpha_p s_p.$$

We also say that $v$ is a linear combination of $s_1, s_2, \ldots, s_p$ with coefficients $\alpha_1, \alpha_2, \ldots, \alpha_p$.
Definition 4.9 (Trivial linear combination)

A linear combination $\alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_p v_p$ is called trivial if $\alpha_1 = \alpha_2 = \dots = \alpha_p = 0$.

A linear combination may refer to the expression itself or its value; e.g., two different linear combinations may have the same value.
Definition 4.10 (Linear dependence)

A finite set of non-zero vectors $\{v_1, \ldots, v_p\}$ is called linearly dependent if there exist scalars $\alpha_1, \ldots, \alpha_p \in F$, not all $0$, such that

$$\alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_p v_p = 0.$$

Since the trivial linear combination always evaluates to $0$, it is the existence of a non-trivial linear combination equal to $0$ that signifies dependence.
Definition 4.11 (Linearly dependent set)

A set $S \subseteq V$ is called linearly dependent if there exist a finite number of distinct vectors $s_1, \ldots, s_p \in S$ and scalars $\alpha_1, \ldots, \alpha_p \in F$, not all $0$, such that

$$\alpha_1 s_1 + \alpha_2 s_2 + \dots + \alpha_p s_p = 0.$$

In other words, there exists a non-trivial linear combination of vectors of $S$ which equals $0$.
Definition 4.12 (Linearly independent set)

A set $S \subseteq V$ is called linearly independent if it is not linearly dependent.
Definition 4.13 (Linearly independent vectors)

More specifically, a finite set of non-zero vectors $\{v_1, \ldots, v_p\}$ is called linearly independent if

$$\alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_p v_p = 0 \implies \alpha_1 = \alpha_2 = \dots = \alpha_p = 0.$$

In other words, the only linear combination giving us $0$ is the trivial linear combination.
Example 4.6 (Examples of linearly dependent and independent sets)

- The empty set is linearly independent.
- A set of a single non-zero vector $\{v\}$ is always linearly independent. Prove!
- If two vectors are linearly dependent, we say that they are collinear. Alternatively, if two vectors are linearly independent, we say that they are not collinear.
- If a set $S$ is linearly independent, then any subset of it will be linearly independent. Prove!
- Adding another vector $v$ to a linearly independent set may make it linearly dependent. When?
- It is possible for an infinite set to be linearly independent. Consider the set of polynomials $\{1, x, x^2, x^3, \ldots\}$. This set is infinite, yet linearly independent.
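For real vectors, linear independence of a finite set can be tested numerically: the vectors are linearly independent if and only if the matrix having them as columns has full column rank. Here is a brief sketch of ours using NumPy (not part of the original text).

```python
# A numerical sketch (assuming real vectors): a finite list of vectors in
# R^n is linearly independent iff the matrix with those vectors as columns
# has rank equal to the number of vectors.
import numpy as np

def is_linearly_independent(vectors):
    """vectors: a list of 1-D arrays of equal length."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(is_linearly_independent([np.array([1., 0.]), np.array([0., 1.])]))  # True
print(is_linearly_independent([np.array([1., 2.]), np.array([2., 4.])]))  # False (collinear)
```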
Theorem 4.4

Let $S_1 \subseteq S_2 \subseteq V$. If $S_1$ is linearly dependent, then $S_2$ is linearly dependent too.
Corollary 4.2

Let $S_1 \subseteq S_2 \subseteq V$. If $S_2$ is linearly independent, then $S_1$ is linearly independent too.
## 4.1.4. Span

Vectors can be combined to form other vectors. It makes sense to consider the set of all vectors which can be created by combining a given set of vectors.
Definition 4.14 (Span)

Let $S \subseteq V$ be a nonempty set. The span of $S$, denoted $\operatorname{span}(S)$, is the set of all possible linear combinations of vectors of $S$:

$$\operatorname{span}(S) \triangleq \{ \alpha_1 s_1 + \dots + \alpha_p s_p \mid s_1, \ldots, s_p \in S, \; \alpha_1, \ldots, \alpha_p \in F, \; p \in \mathbb{N} \}.$$

For convenience we define $\operatorname{span}(\emptyset) \triangleq \{0\}$.

The span of a finite set of vectors $\{v_1, \ldots, v_p\}$ is written as

$$\operatorname{span}\{v_1, \ldots, v_p\} = \{ \alpha_1 v_1 + \dots + \alpha_p v_p \mid \alpha_1, \ldots, \alpha_p \in F \}.$$

We say that a set of vectors $S$ spans a vector $v$ if $v \in \operatorname{span}(S)$.
Proposition 4.1

Let $S \subseteq V$. Then $S \subseteq \operatorname{span}(S)$.
Definition 4.15 (Spanning a vector space)

Let $S \subseteq V$. If $\operatorname{span}(S) = V$, we say that $S$ spans (or generates) $V$.

In this case we also say that the vectors of $S$ span (or generate) $V$.
Theorem 4.5

Let $S$ be a linearly independent subset of a vector space $V$ and let $v \in V \setminus S$. Then the set $S \cup \{v\}$ is linearly dependent if and only if $v \in \operatorname{span}(S)$.
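Numerically, membership of $v$ in $\operatorname{span}\{s_1, \ldots, s_p\}$ over $\mathbb{R}$ can be tested by checking whether the least-squares solution of $[s_1 \; \cdots \; s_p] x = v$ leaves a (numerically) zero residual. The following is a hedged sketch of ours, not part of the original text.

```python
# A numerical sketch: v lies in span{s_1, ..., s_p} in R^n iff the
# least-squares residual of [s_1 ... s_p] x = v is (numerically) zero.
import numpy as np

def in_span(v, vectors, tol=1e-10):
    A = np.column_stack(vectors)
    x, *_ = np.linalg.lstsq(A, v, rcond=None)
    return np.linalg.norm(A @ x - v) < tol

s1, s2 = np.array([1., 0., 0.]), np.array([0., 1., 0.])
print(in_span(np.array([2., 3., 0.]), [s1, s2]))  # True
print(in_span(np.array([0., 0., 1.]), [s1, s2]))  # False
```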
## 4.1.5. Basis

Definition 4.16

A set of linearly independent vectors $\mathcal{B}$ is called a basis of $V$ if $\operatorname{span}(\mathcal{B}) = V$; i.e., $\mathcal{B}$ is linearly independent and it spans $V$.
Example 4.7 (Basis examples)

- Since $\operatorname{span}(\emptyset) = \{0\}$ and $\emptyset$ is linearly independent, $\emptyset$ is a basis for the zero vector space $\{0\}$.
- The basis $\{e_1, e_2, \ldots, e_n\}$, with $e_1 = (1, 0, \ldots, 0)$, $e_2 = (0, 1, \ldots, 0)$, $\ldots$, $e_n = (0, 0, \ldots, 1)$, is called the standard basis for $F^n$.
- The set $\{1, x, x^2, x^3, \ldots\}$ is the standard basis for $F[x]$. It is, indeed, an infinite basis. Note that though the basis itself is infinite, yet every polynomial is a linear combination of a finite number of elements from the basis.
We review some properties of bases.
Theorem 4.6 (Unique representation)

Let $\mathcal{B} = \{v_1, v_2, \ldots, v_n\}$ be a subset of a vector space $V$. Then $\mathcal{B}$ is a basis for $V$ if and only if each $v \in V$ can be uniquely expressed as a linear combination of vectors of $\mathcal{B}$:

$$v = \alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_n v_n$$

for unique scalars $\alpha_1, \ldots, \alpha_n$.

This theorem states that a basis $\mathcal{B}$ provides a unique representation of each vector $v \in V$ as the coordinate tuple $(\alpha_1, \alpha_2, \ldots, \alpha_n)$.
Proof. Assume that each $v \in V$ can be expressed as a unique linear combination of vectors of $\mathcal{B}$.

- Then, in particular, $0$ is a unique linear combination of $v_1, \ldots, v_n$.
- But $0 = 0 v_1 + 0 v_2 + \dots + 0 v_n$.
- Thus, the trivial linear combination is the only one equal to $0$; hence $\mathcal{B}$ is linearly independent.
- Also, $\mathcal{B}$ spans $V$ since every vector can be expressed as a linear combination.
- Thus, $\mathcal{B}$ is a basis.

Assume that $\mathcal{B}$ is a basis.

- Then, $\mathcal{B}$ is linearly independent and spans $V$.
- Let $v \in V$.
- Let $v = \sum_{i=1}^n \alpha_i v_i$ and $v = \sum_{i=1}^n \beta_i v_i$ be two different representations of $v$ in $\mathcal{B}$.
- Then, $\sum_{i=1}^n (\alpha_i - \beta_i) v_i = 0$.
- But $\mathcal{B}$ is linearly independent.
- Hence, $\alpha_i - \beta_i = 0$ for $i = 1, \ldots, n$.
- Thus, $\alpha_i = \beta_i$ for $i = 1, \ldots, n$.
- Hence, every $v \in V$ has a unique representation as a linear combination of vectors of $\mathcal{B}$.
If the basis is infinite, then the above theorem needs to be modified as follows:
Theorem 4.7 (Unique representation for infinite basis)

Let $\mathcal{B}$ be an infinite basis for a vector space $V$. Then each non-zero $v \in V$ can be uniquely expressed as

$$v = \alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_p v_p$$

for unique non-zero scalars $\alpha_1, \ldots, \alpha_p$ and a unique finite set of distinct vectors $\{v_1, \ldots, v_p\} \subseteq \mathcal{B}$.
Theorem 4.8

If a vector space $V$ is spanned by a finite set $S$, then some subset of $S$ is a basis for $V$. Hence, $V$ has a finite basis.
Theorem 4.9 (Replacement theorem)

Let $V$ be a vector space that is spanned by a set $G$ containing exactly $n$ vectors. Let $L$ be a linearly independent subset of $V$ containing exactly $m$ vectors.

Then $m \leq n$, and there exists a subset $H$ of $G$ containing exactly $n - m$ vectors such that $L \cup H$ spans $V$.
Corollary 4.3

Let $V$ be a vector space having a finite basis. Then every basis for $V$ contains the same number of vectors.
## 4.1.6. Dimension

Definition 4.17 (Dimension of vector space)

A vector space $V$ is called finite-dimensional if it has a basis consisting of a finite number of vectors. The unique number of vectors in any basis of a finite-dimensional vector space $V$ is called its dimension, denoted by $\dim V$.

If $V$ is not finite-dimensional, we call it infinite-dimensional.
Example 4.8 (Vector space dimensions)

- The dimension of $F^n$ is $n$.
- The dimension of $F^{m \times n}$ is $m n$.
- The vector space of polynomials $F[x]$ is infinite-dimensional.
Proposition 4.2

Let $V$ be a vector space with dimension $n$. Then:

1. Any finite spanning set for $V$ contains at least $n$ vectors, and a spanning set for $V$ that contains exactly $n$ vectors is a basis for $V$.
2. Any linearly independent subset of $V$ that contains exactly $n$ vectors is a basis for $V$.
3. Every linearly independent subset of $V$ can be extended to a basis for $V$.
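Over $\mathbb{R}$, the dimension of $\operatorname{span}\{v_1, \ldots, v_p\}$ equals the rank of the matrix with the $v_i$ as columns. The short sketch below is our own illustration of this fact.

```python
# A small sketch: dim span{v_1, ..., v_p} in R^n equals the rank of the
# matrix whose columns are v_1, ..., v_p.
import numpy as np

vectors = [np.array([1., 0., 1.]),
           np.array([0., 1., 1.]),
           np.array([1., 1., 2.])]   # third = first + second
A = np.column_stack(vectors)
print(np.linalg.matrix_rank(A))      # 2: the span is a plane in R^3
```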
## 4.1.7. Ordered Basis

Definition 4.18 (Ordered basis)

For a finite-dimensional vector space $V$, an ordered basis is a basis for $V$ with a specific order imposed on its vectors; i.e., it is a finite sequence of linearly independent vectors that spans $V$.

Typically, we will write an ordered basis as $\mathcal{B} = \{v_1, v_2, \ldots, v_n\}$ and assume that the basis vectors are ordered as listed.
With the help of an ordered basis, we can define a coordinate vector.
Definition 4.19 (Coordinate vector)

Let $\mathcal{B} = \{v_1, \ldots, v_n\}$ be an ordered basis for $V$ and let $x \in V$. Let $\alpha_1, \ldots, \alpha_n$ be the unique scalars such that

$$x = \alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_n v_n.$$

The coordinate vector of $x$ relative to $\mathcal{B}$ is defined as

$$[x]_{\mathcal{B}} \triangleq \begin{bmatrix} \alpha_1 \\ \vdots \\ \alpha_n \end{bmatrix}.$$
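For $V = \mathbb{R}^n$ with the ordered basis collected as the columns of a matrix $B$, the coordinate vector $[x]_{\mathcal{B}}$ is the solution of the linear system $B \alpha = x$. A small sketch of ours (not part of the original text):

```python
# A minimal sketch (assuming an ordered basis of R^n given as columns):
# the coordinate vector [x]_B solves the linear system B @ alpha = x.
import numpy as np

B = np.array([[1., 1.],
              [0., 1.]])        # ordered basis {(1,0), (1,1)} as columns
x = np.array([3., 2.])
alpha = np.linalg.solve(B, x)   # unique since the basis columns are independent
print(alpha)                    # [1. 2.]  ->  x = 1*(1,0) + 2*(1,1)
```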
## 4.1.8. Subspaces

Definition 4.20 (Subspace)

Let $W$ be a subset of a vector space $V$. Then $W$ is called a subspace of $V$ if $W$ is itself a vector space over $F$, where vector addition and scalar multiplication on $W$ are defined by restricting the corresponding operations of $V$ to $W$.
Example 4.9 (Trivial subspaces)

- $V$ is a subspace of $V$.
- $\{0\}$ is a subspace of any vector space $V$.
Theorem 4.10 (Subspace characterization)

A subset $W \subseteq V$ is a subspace of $V$ if and only if:

1. $0 \in W$.
2. $x + y \in W$ whenever $x, y \in W$.
3. $\alpha x \in W$ whenever $\alpha \in F$ and $x \in W$.

In other words, a subset of $V$ is a subspace if and only if it contains the zero vector and is closed under vector addition and scalar multiplication.
Example 4.10 (Symmetric matrices)

A matrix $A \in F^{n \times n}$ is symmetric if $A^T = A$; i.e., $A_{i j} = A_{j i}$ for all $i, j$.

The set of symmetric matrices forms a subspace of the set of all $n \times n$ matrices $F^{n \times n}$.
Example 4.11 (Diagonal matrices)

A matrix $A \in F^{n \times n}$ is diagonal if $A_{i j} = 0$ whenever $i \neq j$.

The set of diagonal matrices is a subspace of $F^{n \times n}$.
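The characterization in Theorem 4.10 can be spot-checked numerically for Example 4.10. The sketch below is our own illustration; it verifies that symmetry survives the vector space operations.

```python
# A numerical sketch of Theorem 4.10 applied to symmetric matrices:
# the zero matrix is symmetric, and symmetry is preserved by addition
# and scalar multiplication.
import numpy as np

def is_symmetric(A):
    return np.array_equal(A, A.T)

A = np.array([[1., 2.], [2., 3.]])
B = np.array([[0., 5.], [5., 1.]])
assert is_symmetric(np.zeros((2, 2)))   # contains the zero vector
assert is_symmetric(A + B)              # closed under addition
assert is_symmetric(4.0 * A)            # closed under scalar multiplication
print("symmetric-matrix subspace checks passed")
```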
Definition 4.21 (Proper subspace)

A subspace $W$ of a vector space $V$ is called a proper subspace if $W \neq V$.

In other words, there are vectors in $V$ which do not belong to $W$.
Theorem 4.11 (Intersection of subspaces)

Any intersection of subspaces of a vector space $V$ is a subspace of $V$.
Proof. Let $\{ W_i \}_{i \in I}$ be an arbitrary family of subspaces of $V$.

Let the intersection of the subspaces be:

$$W = \bigcap_{i \in I} W_i.$$

[Zero vector]

- Since $0 \in W_i$ for every $i \in I$, hence $0 \in W$.

[Vector addition]

- Let $x, y \in W$. Then, $x, y \in W_i$ for every $i \in I$.
- Then, $x + y \in W_i$ for every $i \in I$ since the $W_i$ are closed under vector addition.
- Then, $x + y \in W$.
- Thus, $W$ is closed under vector addition.

[Scalar multiplication]

- Let $x \in W$. Then, $x \in W_i$ for every $i \in I$. Let $\alpha \in F$.
- Then, $\alpha x \in W_i$ for every $i \in I$ since the $W_i$ are closed under scalar multiplication.
- Thus, $\alpha x \in W$.
- Thus, $W$ is closed under scalar multiplication.

Thus, by Theorem 4.10, $W$ is a subspace of $V$.
We note that a union of subspaces is not necessarily a subspace, since it is not closed under addition. For example, in $\mathbb{R}^2$ the union of the $x$-axis and the $y$-axis contains $(1, 0)$ and $(0, 1)$ but not their sum $(1, 1)$.
Theorem 4.12 (Span is a subspace)

Let $S$ be a subset of a vector space $V$. Then $\operatorname{span}(S)$ is a subspace of $V$. Moreover, any subspace $W \subseteq V$ that contains $S$ also contains $\operatorname{span}(S)$.
Proof. If $S = \emptyset$, then $\operatorname{span}(S) = \{0\}$, which is a trivial subspace of $V$.

We now consider $S$ to be nonempty.

[Zero vector]

- Since $S$ is nonempty, hence there exists some $s \in S$.
- Then, $0 = 0 s \in \operatorname{span}(S)$ as it is a linear combination of elements of $S$.

[Scalar multiplication]

- Let $v \in \operatorname{span}(S)$ and $\alpha \in F$.
- Then, $v = \alpha_1 s_1 + \dots + \alpha_p s_p$ where $\alpha_i \in F$, $s_i \in S$, $i = 1, \ldots, p$.
- But then for any $\alpha \in F$:

  $$\alpha v = (\alpha \alpha_1) s_1 + \dots + (\alpha \alpha_p) s_p.$$

- Thus, $\alpha v$ is also a linear combination of elements of $S$.
- Thus, $\operatorname{span}(S)$ is closed under scalar multiplication.

[Vector addition]

- Let $u, v \in \operatorname{span}(S)$.
- Let $u = \sum_{i=1}^p \alpha_i s_i$ and $v = \sum_{j=1}^q \beta_j t_j$ be linear combinations of elements of $S$.
- Then,

  $$u + v = \sum_{i=1}^p \alpha_i s_i + \sum_{j=1}^q \beta_j t_j$$

  is also a linear combination of (up to $p + q$) elements of $S$.
- Thus, $u + v \in \operatorname{span}(S)$.
- Thus, $\operatorname{span}(S)$ is closed under vector addition.

Combining, $\operatorname{span}(S)$ is a subspace of $V$ by Theorem 4.10.
For the second claim, let $W$ be a subspace of $V$ such that $S \subseteq W$.

- Let $v \in \operatorname{span}(S)$.
- Then, $v$ is a linear combination of elements of $S$.
- But $W$ is closed under linear combinations and $S \subseteq W$.
- Thus, $v \in W$.
- Thus, $\operatorname{span}(S) \subseteq W$.
This theorem is quite useful. It allows us to construct subspaces from a given basis.

Let $\mathcal{B}$ be a basis of $V$ and let $S \subseteq \mathcal{B}$. Then $\operatorname{span}(S)$ is a subspace of $V$. Choosing some other basis lets us construct another family of subspaces.

An $n$-dimensional vector space thus yields $2^n$ such subspaces, one for each subset of a basis of $n$ vectors.
Definition 4.22 (Partial order on subspaces)

Let $W_1$ and $W_2$ be subspaces of $V$. We say that $W_1 \leq W_2$ if $W_1 \subseteq W_2$; the set-inclusion relation defines a partial order on the set of subspaces of $V$.

If $W_1 \leq W_2$ and $W_2 \leq W_1$, then $W_1 = W_2$.
Theorem 4.13 (Span as the smallest subspace)

Let $S$ be a subset of a vector space $V$. Let $W$ be any subspace of $V$ such that $S \subseteq W$. Then

$$\operatorname{span}(S) \subseteq W;$$

i.e., $\operatorname{span}(S)$ is the smallest subspace of $V$ containing $S$ with respect to the partial order defined above.

Proof. Since $W$ is a subspace of $V$ containing $S$, hence, by Theorem 4.12, $\operatorname{span}(S) \subseteq W$. Since $\operatorname{span}(S)$ is itself a subspace of $V$ containing $S$, it is the smallest such subspace.
If the vector space $V$ is infinite-dimensional, its subspaces may be finite-dimensional or infinite-dimensional.

However, if $V$ is finite-dimensional, then its subspaces must be finite-dimensional, as the following result shows.
Theorem 4.14 (Subspaces of a finite dimensional vector space)

Let $W$ be a subspace of a finite-dimensional vector space $V$. Then $W$ is finite-dimensional and

$$\dim W \leq \dim V.$$

Moreover, if

$$\dim W = \dim V,$$

then $W = V$.
Proof. If $W = \{0\}$, then it is finite-dimensional with $\dim W = 0 \leq \dim V$. Otherwise, any linearly independent subset of $W$ is also a linearly independent subset of $V$ and hence contains at most $\dim V$ vectors; a maximal such subset is a basis for $W$. Thus, $W$ is finite-dimensional with $\dim W \leq \dim V$.

Now, assume that $\dim W = \dim V = n$.

- Let $\mathcal{B}$ be a basis for $W$.
- Then $\mathcal{B}$ spans $W$ and contains $n$ linearly independent vectors.
- Since $\mathcal{B} \subseteq W \subseteq V$, hence $\mathcal{B}$ is a linearly independent subset of $V$.
- But $\dim V = n$.
- Hence, any set of $n$ linearly independent vectors must be a basis for $V$ too (Proposition 4.2).
- Thus, $\mathcal{B}$ is a basis for $V$.
- Thus, $W = \operatorname{span}(\mathcal{B}) = V$.
Corollary 4.4

If $W$ is a subspace of a finite-dimensional vector space $V$, then any basis for $W$ can be extended to a basis for $V$.
Definition 4.23 (Subspace codimension)

Let $V$ be a finite-dimensional vector space and let $W$ be a subspace of $V$. The codimension of $W$ is defined as

$$\operatorname{codim} W \triangleq \dim V - \dim W.$$
## 4.1.9. Direct Sum of Vector Spaces

Consider two vector spaces $V$ and $W$ over the same field $F$. Their Cartesian product $V \times W$ can be turned into a vector space as follows.
Definition 4.24 (Direct sum of vector spaces)

Let $V$ and $W$ be vector spaces over a field $F$. The direct sum of $V$ and $W$, denoted $V \oplus W$, is the Cartesian product $V \times W$ with the operations defined below.

Let $0_V$ and $0_W$ be the additive identities of $V$ and $W$ respectively. The additive identity of $V \oplus W$ is given by

$$0 = (0_V, 0_W).$$

- [Vector addition] Let $(v_1, w_1)$ and $(v_2, w_2)$ be in $V \oplus W$. Then, their sum is defined as:

  $$(v_1, w_1) + (v_2, w_2) \triangleq (v_1 + v_2, w_1 + w_2).$$

- [Scalar multiplication] Let $(v, w) \in V \oplus W$ and $\alpha \in F$. Then, the scalar multiplication is defined as:

  $$\alpha (v, w) \triangleq (\alpha v, \alpha w).$$
Note

Some authors prefer to call the structure $V \oplus W$ defined above the direct product $V \times W$, and reserve the term direct sum for the internal direct sum of subsets of the same vector space (Definition 4.29).

For them, an external direct sum is a direct product and an internal direct sum is a direct sum.
Example 4.12

With the definition of direct sum given above, we have:

$$\mathbb{R} \oplus \mathbb{R} = \mathbb{R}^2,$$

or more generally,

$$\mathbb{R}^m \oplus \mathbb{R}^n = \mathbb{R}^{m + n}.$$

Specifically, a pair $(v, w) \in \mathbb{R}^m \oplus \mathbb{R}^n$ is identified with the $(m + n)$-tuple obtained by concatenating $v$ and $w$.

Also,

$$\mathbb{R}^3 = \mathbb{R}^2 \oplus \mathbb{R} = \mathbb{R} \oplus \mathbb{R} \oplus \mathbb{R}.$$
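The direct-sum operations on pairs can be transcribed directly; the snippet below is a sketch of ours (not from the text) with $V = \mathbb{R}^2$ and $W = \mathbb{R}^3$ represented as NumPy arrays.

```python
# A small sketch of the direct-sum operations on pairs (v, w) in V (+) W,
# here with V = R^2 and W = R^3 represented by numpy arrays.
import numpy as np

def ds_add(p, q):
    """(v1, w1) + (v2, w2) = (v1 + v2, w1 + w2)."""
    return (p[0] + q[0], p[1] + q[1])

def ds_scale(alpha, p):
    """alpha (v, w) = (alpha v, alpha w)."""
    return (alpha * p[0], alpha * p[1])

p = (np.array([1., 2.]), np.array([3., 4., 5.]))
q = (np.array([0., 1.]), np.array([1., 0., 0.]))
print(ds_add(p, q))      # (array([1., 3.]), array([4., 4., 5.]))
print(ds_scale(2.0, p))  # (array([2., 4.]), array([ 6.,  8., 10.]))
```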
## 4.1.10. Sets in Vector Spaces

### 4.1.10.1. Set Arithmetic

Definition 4.25 (Arithmetic on sets)

Let $A, B \subseteq V$, let $x \in V$, let $\Lambda \subseteq F$, and let $\lambda \in F$.

- The addition of sets is defined as: $A + B \triangleq \{ a + b \mid a \in A, b \in B \}$.
- The subtraction of sets is defined as: $A - B \triangleq \{ a - b \mid a \in A, b \in B \}$.
- Addition of a set with a vector is defined as: $x + A \triangleq \{ x + a \mid a \in A \}$.
- Subtraction of a set with a vector is defined as: $A - x \triangleq \{ a - x \mid a \in A \}$.
- Scalar multiplication of a set with a scalar is defined as: $\lambda A \triangleq \{ \lambda a \mid a \in A \}$.
- Multiplication of a set of scalars with a set of vectors is defined as: $\Lambda A \triangleq \{ \lambda a \mid \lambda \in \Lambda, a \in A \}$.
- Multiplication of a set of scalars with a vector is defined as: $\Lambda x \triangleq \{ \lambda x \mid \lambda \in \Lambda \}$.
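For finite sets, these definitions transcribe directly into Python set comprehensions; the following is a sketch of ours, not part of the original text.

```python
# A direct transcription of the set-arithmetic definitions above for
# finite sets of (tuple-valued) vectors.
def set_add(A, B):
    """Minkowski sum A + B = {a + b : a in A, b in B}."""
    return {tuple(ai + bi for ai, bi in zip(a, b)) for a in A for b in B}

def set_scale(alpha, A):
    """alpha A = {alpha a : a in A}."""
    return {tuple(alpha * ai for ai in a) for a in A}

A = {(0, 0), (1, 0)}
B = {(0, 0), (0, 1)}
print(set_add(A, B))     # {(0,0), (1,0), (0,1), (1,1)}
print(set_scale(2, A))   # {(0,0), (2,0)}
```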
### 4.1.10.2. Symmetry

Definition 4.26 (Symmetric reflection)

The symmetric reflection of a set $A \subseteq V$ is defined as

$$-A \triangleq (-1) A = \{ -a \mid a \in A \}.$$

Definition 4.27 (Symmetric set)

A set $A \subseteq V$ is said to be symmetric if $A = -A$.

If a nonempty set is symmetric, then for every $a \in A$ it must contain the additive inverse $-a$ as well.
### 4.1.10.3. Set Arithmetic Properties

Theorem 4.15 (Set vector arithmetic identities)

Let $A, B \subseteq V$ and let $x \in V$. Then:

1. $(A + x) - x = A$.
2. $A + x \subseteq B$ if and only if $A \subseteq B - x$.

Proof. We show that $(A + x) - x = A$.

- Let $u \in (A + x) - x$.
- Then, $u = v - x$ for some $v \in A + x$.
- Then, $v = a + x$ for some $a \in A$.
- Then, $u = (a + x) - x = a$.
- Thus, $u \in A$, and hence $(A + x) - x \subseteq A$.
- Conversely, let $a \in A$.
- Then, $a + x \in A + x$.
- But then, $a = (a + x) - x \in (A + x) - x$.
- Thus, $A \subseteq (A + x) - x$.
- Combining, $(A + x) - x = A$.
We show that $A + x \subseteq B$ if and only if $A \subseteq B - x$.

Assume $A + x \subseteq B$.

- Let $a \in A$.
- Then, $a + x \in A + x$.
- Then, $a + x \in B$ since $A + x \subseteq B$.
- Then, $a = (a + x) - x \in B - x$.
- Thus, $A \subseteq B - x$.

Now, assume $A \subseteq B - x$.

- Let $a \in A$.
- Then, $a \in B - x$.
- Then, $a = b - x$ for some $b \in B$, since $a \in B - x$.
- Then, $a + x = b \in B$.
- Thus, $A + x \subseteq B$.
Theorem 4.16 (Properties of set arithmetic)

Let $A, B, C \subseteq V$ be arbitrary sets.

Let $\alpha, \beta \in F$. Then:

1. Set addition is commutative: $A + B = B + A$.
2. Set addition is associative: $(A + B) + C = A + (B + C)$.
3. Scalar multiplication with a set commutes with multiplication in $F$: $\alpha (\beta A) = (\alpha \beta) A$.
4. Scalar multiplication distributes over set addition: $\alpha (A + B) = \alpha A + \alpha B$.
5. The set $\{0\}$ is the identity element for set addition: $A + \{0\} = \{0\} + A = A$.
6. The set $A - A$ is symmetric, and $0 \in A - A$ for nonempty $A$.
7. $(\alpha + \beta) A \subseteq \alpha A + \beta A$.
Proof. We shall prove some of the properties.
[Commutativity]

- Let $x \in A + B$.
- Then, there exist $a \in A$ and $b \in B$ such that $x = a + b$.
- But then, $x = b + a$ since vector addition is commutative.
- Thus, $x \in B + A$.
- Thus, $A + B \subseteq B + A$.
- Similarly, $B + A \subseteq A + B$.
[Associativity]

- Let $x \in (A + B) + C$.
- Then, there exist $u \in A + B$ and $c \in C$ such that $x = u + c$.
- Then, there exist $a \in A$ and $b \in B$ such that $u = a + b$.
- Thus, $x = (a + b) + c$.
- But vector addition is associative.
- Thus, $x = a + (b + c)$.
- Then, $x \in A + (B + C)$ since $b + c \in B + C$.
- Thus, $(A + B) + C \subseteq A + (B + C)$.
- Similar reasoning shows that $A + (B + C) \subseteq (A + B) + C$.
- Thus, $(A + B) + C = A + (B + C)$.
[Inclusion for $(\alpha + \beta) A$]

- Let $x \in (\alpha + \beta) A$.
- Then, there exists $a \in A$ such that $x = (\alpha + \beta) a$.
- Then, $x = \alpha a + \beta a$.
- Then, $\alpha a \in \alpha A$ and $\beta a \in \beta A$.
- Thus, $x \in \alpha A + \beta A$.
- Thus, $(\alpha + \beta) A \subseteq \alpha A + \beta A$.
Additive inverses don’t exist for sets containing more than one element.
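The inclusion $(\alpha + \beta) A \subseteq \alpha A + \beta A$ from Theorem 4.16 can be strict; the tiny sketch below (ours, not from the text) exhibits this with $A = \{0, 1\} \subseteq \mathbb{R}$, $\alpha = 1$, $\beta = -1$.

```python
# A tiny sketch showing that (alpha + beta) A can be a proper subset of
# alpha A + beta A: here the left side is {0}, the right side {-1, 0, 1}.
A = {0, 1}
alpha, beta = 1, -1
lhs = {(alpha + beta) * a for a in A}
rhs = {alpha * a + beta * b for a in A for b in A}
print(lhs, rhs, lhs <= rhs, lhs == rhs)  # {0} {-1, 0, 1} True False
```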
Theorem 4.17 (Distributive law for set sum over intersection)

Let $A, B, C \subseteq V$. Then:

$$(A \cap B) + C \subseteq (A + C) \cap (B + C).$$
Proof. We show that $(A \cap B) + C \subseteq (A + C) \cap (B + C)$.

- Let $x \in (A \cap B) + C$.
- Then, $x = u + c$ such that $u \in A \cap B$ and $c \in C$.
- Then, $u \in A$, $u \in B$, and $c \in C$.
- Thus, $x = u + c \in A + C$ and $x = u + c \in B + C$.
- Thus, $x \in (A + C) \cap (B + C)$.
- Thus, $(A \cap B) + C \subseteq (A + C) \cap (B + C)$.
We mention that the reverse inclusion does not hold in general. Here is a counter-example.

- Consider $V = \mathbb{R}^2$.
- Let $A$ be the $x$-axis.
- Let $B$ be the $y$-axis.
- Let $C$ be the line $\{ (t, t) \mid t \in \mathbb{R} \}$; i.e., the line $y = x$.
- Both $A + C$ and $B + C$ are the whole plane $\mathbb{R}^2$.
- Thus, $(A + C) \cap (B + C) = \mathbb{R}^2$.
- But $A \cap B = \{0\}$.
- Thus, $(A \cap B) + C = C$.
- Clearly, $(A \cap B) + C \neq (A + C) \cap (B + C)$.
### 4.1.10.4. Direct Sums

Definition 4.28 ((External) direct sum over two different vector spaces)

Let $V$ and $W$ be two different vector spaces over the same field $F$. Then the direct sum $V \oplus W$, as constructed in Definition 4.24, is known as the external direct sum of $V$ and $W$.
Definition 4.29 ((Internal) direct sum of sets in same vector space)

Let $A, B \subseteq V$. If each element $x \in A + B$ can be expressed uniquely in the form $x = a + b$ with $a \in A$ and $b \in B$,

then the sum $A + B$ is called an (internal) direct sum of the sets $A$ and $B$, denoted $A \oplus B$.
The key idea behind a direct sum is that it enables a unique decomposition of a vector into its components belonging to individual sets.
Theorem 4.18

Two subsets $A, B \subseteq V$ form a direct sum $A \oplus B$ if and only if

$$(A - A) \cap (B - B) = \{0\}.$$

If two subspaces of $V$ have only $0$ in common, their sum is a direct sum.
## 4.1.11. Sums of Subspaces

Definition 4.30 ((Internal) direct sum of subspaces)

If $W_1$ and $W_2$ are subspaces of $V$ such that $W_1 \cap W_2 = \{0\}$, then their sum $W_1 + W_2$ is a direct sum, denoted $W_1 \oplus W_2$.

If, in particular, $V = W_1 \oplus W_2$, we say that $V$ is the (internal) direct sum of its subspaces $W_1$ and $W_2$.
Example 4.13 (Vector space as direct sum of spans of basis vectors)

The spans of distinct basis vectors have only $0$ in common. Thus, for a basis $\{v_1, v_2, \ldots, v_n\}$ of $V$:

$$V = \operatorname{span}\{v_1\} \oplus \operatorname{span}\{v_2\} \oplus \dots \oplus \operatorname{span}\{v_n\}.$$
Theorem 4.19 (Dimension of sum of subspaces)

Let $W_1$ and $W_2$ be finite-dimensional subspaces of a vector space $V$. Then

$$\dim (W_1 + W_2) = \dim W_1 + \dim W_2 - \dim (W_1 \cap W_2).$$
Theorem 4.20 (Dimension of direct sum of subspaces)

Let $W_1$ and $W_2$ be finite-dimensional subspaces of a vector space $V$ whose sum is a direct sum. Then

$$\dim (W_1 \oplus W_2) = \dim W_1 + \dim W_2.$$

Proof. We note that $W_1 \cap W_2 = \{0\}$, hence $\dim (W_1 \cap W_2) = 0$. The result then follows from Theorem 4.19.
## 4.1.12. Real Vector Spaces

Definition 4.31 (Real vector space)

A real vector space $V$ is a vector space over the field of real numbers $\mathbb{R}$.

In other words, the scalars come from the field of real numbers.

There are several features associated with the real field:

- $\mathbb{R}$ is totally ordered; i.e., any two real numbers can be compared.
- $\mathbb{R}$ is complete; i.e., every Cauchy sequence of real numbers converges in $\mathbb{R}$.