Definition. A vector space (or linear space) consists of the following:
- a field $F$ of scalars;
- a set $V$ of objects, called vectors;
- a rule (or operation), called vector addition, which associates with each pair of vectors $\alpha, \beta$ in $V$ a vector $\alpha + \beta$ in $V$, called the sum of $\alpha$ and $\beta$, in such a way that
( a ) addition is commutative, $\alpha + \beta = \beta + \alpha$;
( b ) addition is associative, $\alpha + (\beta + \gamma) = (\alpha + \beta) + \gamma$;
( c ) there is a unique vector $0$ in $V$, called the zero vector, such that $\alpha + 0 = \alpha$ for all $\alpha$ in $V$;
( d ) for each vector $\alpha$ in $V$ there is a unique vector $-\alpha$ in $V$ such that $\alpha + (-\alpha) = 0$;
- a rule (or operation), called scalar multiplication, which associates with each scalar $c$ in $F$ and vector $\alpha$ in $V$ a vector $c\alpha$ in $V$, called the product of $c$ and $\alpha$, in such a way that
( a ) $1\alpha = \alpha$ for every $\alpha$ in $V$;
( b ) $(c_1 c_2)\alpha = c_1(c_2 \alpha)$;
( c ) $c(\alpha + \beta) = c\alpha + c\beta$;
( d ) $(c_1 + c_2)\alpha = c_1\alpha + c_2\alpha$.
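As a concrete illustration (the choice of $V = \mathbb{Q}^2$ over $F = \mathbb{Q}$ and the sample vectors are assumed here, not taken from the text), the axioms above can be spot-checked with exact rational arithmetic:

```python
# Spot-checking the vector-space axioms for V = Q^2 over F = Q.
from fractions import Fraction as Fr

def add(a, b):          # vector addition in Q^2, componentwise
    return (a[0] + b[0], a[1] + b[1])

def smul(c, a):         # scalar multiplication by c in Q
    return (c * a[0], c * a[1])

zero = (Fr(0), Fr(0))
alpha, beta, gamma = (Fr(1), Fr(2)), (Fr(-3), Fr(1, 2)), (Fr(5), Fr(0))
c1, c2 = Fr(2, 3), Fr(-7)

assert add(alpha, beta) == add(beta, alpha)                          # (a) commutative
assert add(alpha, add(beta, gamma)) == add(add(alpha, beta), gamma)  # (b) associative
assert add(alpha, zero) == alpha                                     # (c) zero vector
assert add(alpha, smul(Fr(-1), alpha)) == zero                       # (d) additive inverse
assert smul(Fr(1), alpha) == alpha                                   # ( a ) 1*alpha = alpha
assert smul(c1 * c2, alpha) == smul(c1, smul(c2, alpha))             # ( b )
assert smul(c1, add(alpha, beta)) == add(smul(c1, alpha), smul(c1, beta))   # ( c )
assert smul(c1 + c2, alpha) == add(smul(c1, alpha), smul(c2, alpha))        # ( d )
```

Checking finitely many instances does not prove the axioms, of course; for $\mathbb{Q}^2$ they follow from the field axioms of $\mathbb{Q}$ applied coordinatewise.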
Definition. A vector $\beta$ in $V$ is said to be a linear combination of the vectors $\alpha_1, \ldots, \alpha_n$ in $V$ provided there exist scalars $c_1, \ldots, c_n$ in $F$ such that $\beta = c_1\alpha_1 + \cdots + c_n\alpha_n = \sum_{i=1}^{n} c_i\alpha_i$.
Definition. Let $V$ be a vector space over the field $F$. A subspace of $V$ is a subset $W$ of $V$ which is itself a vector space over $F$ with the operations of vector addition and scalar multiplication on $V$.
Theorem 1. A non-empty subset $W$ of $V$ is a subspace of $V$ if and only if for each pair of vectors $\alpha, \beta$ in $W$ and each scalar $c$ in $F$ the vector $c\alpha + \beta$ is in $W$.
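A brief sketch of Theorem 1 in action (the subspace $W = \{(x, y, z) \in \mathbb{Q}^3 : x + 2y - z = 0\}$ and the sample vectors are assumed for illustration): closure under $c\alpha + \beta$ is exactly the condition being tested.

```python
# Theorem 1 applied to W = {(x, y, z) in Q^3 : x + 2y - z = 0}.
from fractions import Fraction as Fr

def in_W(v):
    # Membership test for W: a single homogeneous linear condition
    return v[0] + 2 * v[1] - v[2] == 0

alpha = (Fr(1), Fr(0), Fr(1))     # 1 + 0 - 1 = 0, so alpha is in W
beta  = (Fr(0), Fr(1), Fr(2))     # 0 + 2 - 2 = 0, so beta is in W
c = Fr(3, 4)

# c*alpha + beta stays in W, as the subspace criterion requires
combo = tuple(c * a + b for a, b in zip(alpha, beta))
assert in_W(alpha) and in_W(beta) and in_W(combo)
```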
Lemma. If $A$ is an $m \times n$ matrix over $F$ and $B, C$ are $n \times p$ matrices over $F$, then $A(dB + C) = d(AB) + AC$ for each scalar $d$ in $F$.
Theorem 2. Let $V$ be a vector space over the field $F$. The intersection of any collection of subspaces of $V$ is a subspace of $V$.
Definition. Let $S$ be a set of vectors in a vector space $V$. The subspace spanned by $S$ is defined to be the intersection $W$ of all subspaces of $V$ which contain $S$. When $S$ is a finite set of vectors, $S = \{\alpha_1, \ldots, \alpha_n\}$, we shall simply call $W$ the subspace spanned by the vectors $\alpha_1, \ldots, \alpha_n$.
Theorem 3. The subspace spanned by a non-empty subset $S$ of a vector space $V$ is the set of all linear combinations of vectors in $S$.
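A small hedged example of Theorem 3 (the spanning set in $\mathbb{Q}^3$ is assumed, not from the text): every vector in the subspace spanned by $S = \{\alpha_1, \alpha_2\}$ arises as a linear combination $c_1\alpha_1 + c_2\alpha_2$.

```python
# The subspace of Q^3 spanned by {alpha1, alpha2} is the set of all
# linear combinations c1*alpha1 + c2*alpha2.
from fractions import Fraction as Fr

alpha1 = (Fr(1), Fr(0), Fr(2))
alpha2 = (Fr(0), Fr(1), Fr(-1))

def comb(c1, c2):
    # Form the linear combination c1*alpha1 + c2*alpha2 componentwise
    return tuple(c1 * a + c2 * b for a, b in zip(alpha1, alpha2))

beta = comb(Fr(3), Fr(-2))        # beta = 3*alpha1 - 2*alpha2
assert beta == (Fr(3), Fr(-2), Fr(8))
```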
Definition. If $S_1, \ldots, S_k$ are subsets of a vector space $V$, the set of all sums $\alpha_1 + \cdots + \alpha_k$ of vectors $\alpha_i$ in $S_i$ is called the sum of the subsets $S_1, \ldots, S_k$ and is denoted by $S_1 + \cdots + S_k$ or by $\sum_{i=1}^{k} S_i$.
Definition. Let $V$ be a vector space over $F$. A subset $S$ of $V$ is said to be linearly dependent (or simply, dependent) if there exist distinct vectors $\alpha_1, \ldots, \alpha_n$ in $S$ and scalars $c_1, \ldots, c_n$ in $F$, not all of which are $0$, such that $c_1\alpha_1 + \cdots + c_n\alpha_n = 0$. A set which is not linearly dependent is called linearly independent. If the set $S$ contains only finitely many vectors $\alpha_1, \ldots, \alpha_n$, we sometimes say that $\alpha_1, \ldots, \alpha_n$ are dependent (or independent) instead of saying $S$ is dependent (or independent).
Definition. Let $V$ be a vector space. A basis for $V$ is a linearly independent set of vectors in $V$ which spans the space $V$. The space $V$ is finite-dimensional if it has a finite basis.
Theorem 4. Let $V$ be a vector space which is spanned by a finite set of vectors $\beta_1, \ldots, \beta_m$. Then any independent set of vectors in $V$ is finite and contains no more than $m$ elements.
Corollary 1. If $V$ is a finite-dimensional vector space, then any two bases of $V$ have the same (finite) number of elements.
Corollary 2. Let $V$ be a finite-dimensional vector space and let $n = \dim V$. Then
( a ) any subset of $V$ which contains more than $n$ vectors is linearly dependent;
( b ) no subset of $V$ which contains fewer than $n$ vectors can span $V$.
Theorem 5. If $W$ is a subspace of a finite-dimensional vector space $V$, every linearly independent subset of $W$ is finite and is part of a (finite) basis for $W$.
Corollary 1. If $W$ is a proper subspace of a finite-dimensional vector space $V$, then $W$ is finite-dimensional and $\dim W < \dim V$.
Corollary 2. In a finite-dimensional vector space $V$ every non-empty linearly independent set of vectors is part of a basis.
Corollary 3. Let $A$ be an $n \times n$ matrix over a field $F$, and suppose the row vectors of $A$ form a linearly independent set of vectors in $F^n$. Then $A$ is invertible.
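Corollary 3 can be checked numerically on an assumed example matrix: Gaussian elimination over $\mathbb{Q}$ shows the rows of $A$ below are independent (rank $3$), hence $A$ is invertible.

```python
# Checking row independence of a 3x3 matrix over Q via Gaussian elimination.
from fractions import Fraction as Fr

A = [[Fr(1), Fr(2), Fr(0)],
     [Fr(0), Fr(1), Fr(3)],
     [Fr(1), Fr(0), Fr(1)]]

def rank(M):
    # Row-reduce a copy of M and count the pivot rows
    M = [row[:] for row in M]
    m, n = len(M), len(M[0])
    r = 0
    for c in range(n):
        piv = next((i for i in range(r, m) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(m):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

assert rank(A) == 3   # full rank: independent rows, so A is invertible
```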
Theorem 6. If $W_1$ and $W_2$ are finite-dimensional subspaces of a vector space $V$, then $W_1 + W_2$ is finite-dimensional and
$$\dim W_1 + \dim W_2 = \dim(W_1 \cap W_2) + \dim(W_1 + W_2).$$
Definition. If $V$ is a finite-dimensional vector space, an ordered basis for $V$ is a finite sequence of vectors which is linearly independent and spans $V$.
Theorem 7. Let $V$ be an $n$-dimensional vector space over the field $F$, and let $\mathcal{B}$ and $\mathcal{B}'$ be two ordered bases of $V$. Then there is a unique, necessarily invertible, $n \times n$ matrix $P$ with entries in $F$ such that
$$[\alpha]_{\mathcal{B}} = P[\alpha]_{\mathcal{B}'}, \qquad [\alpha]_{\mathcal{B}'} = P^{-1}[\alpha]_{\mathcal{B}}$$
for every vector $\alpha$ in $V$. The columns of $P$ are given by $P_j = [\alpha_j']_{\mathcal{B}}$, $j = 1, \ldots, n$.
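A worked sketch of the change-of-basis relation (the bases of $V = \mathbb{Q}^2$ are assumed for illustration): with $\mathcal{B}$ the standard basis and $\mathcal{B}' = \{(1,1), (1,-1)\}$, the columns of $P$ are the $\mathcal{B}$-coordinates of the $\mathcal{B}'$ vectors.

```python
# Change of basis in Q^2: [alpha]_B = P [alpha]_B'.
from fractions import Fraction as Fr

# Columns of P are the standard coordinates of the primed basis vectors
P = [[Fr(1), Fr(1)],
     [Fr(1), Fr(-1)]]

def matvec(M, v):
    # Matrix-vector product over Q
    return tuple(sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M)))

alpha_B  = (Fr(3), Fr(1))   # alpha = (3, 1) in the standard basis B
alpha_Bp = (Fr(2), Fr(1))   # 2*(1,1) + 1*(1,-1) = (3,1), so [alpha]_B' = (2, 1)

assert matvec(P, alpha_Bp) == alpha_B
```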
Theorem 8. Suppose $P$ is an $n \times n$ invertible matrix over $F$. Let $V$ be an $n$-dimensional vector space over $F$, and let $\mathcal{B}$ be an ordered basis of $V$. Then there is a unique ordered basis $\mathcal{B}'$ of $V$ such that
$$[\alpha]_{\mathcal{B}} = P[\alpha]_{\mathcal{B}'}, \qquad [\alpha]_{\mathcal{B}'} = P^{-1}[\alpha]_{\mathcal{B}}$$
for every vector $\alpha$ in $V$.
Theorem 9. Row-equivalent matrices have the same row space.
Theorem 10. Let $R$ be a non-zero row-reduced echelon matrix. Then the non-zero row vectors of $R$ form a basis for the row space of $R$.
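The following sketch (example matrix assumed) row-reduces $A$ to a row-reduced echelon matrix $R$; by Theorems 9 and 10, the non-zero rows of $R$ form a basis for the row space of $A$.

```python
# Row reduction to row-reduced echelon form over Q; the non-zero rows of
# the result are a basis for the row space.
from fractions import Fraction as Fr

def rref(M):
    M = [row[:] for row in M]
    m, n = len(M), len(M[0])
    r = 0
    for c in range(n):
        piv = next((i for i in range(r, m) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]        # make the pivot 1
        for i in range(m):
            if i != r and M[i][c] != 0:
                f = M[i][c]
                M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return M

A = [[Fr(1), Fr(2), Fr(3)],
     [Fr(2), Fr(4), Fr(6)],    # a multiple of row 1: contributes nothing new
     [Fr(0), Fr(1), Fr(1)]]

R = rref(A)
nonzero = [row for row in R if any(x != 0 for x in row)]
assert len(nonzero) == 2       # the row space of A is 2-dimensional
```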
Theorem 11. Let $m$ and $n$ be positive integers and let $F$ be a field. Suppose $W$ is a subspace of $F^n$ and $\dim W \le m$. Then there is precisely one $m \times n$ row-reduced echelon matrix over $F$ which has $W$ as its row space.
Corollary. Each $m \times n$ matrix $A$ is row-equivalent to one and only one row-reduced echelon matrix.
Corollary. Let $A$ and $B$ be $m \times n$ matrices over the field $F$. Then $A$ and $B$ are row-equivalent if and only if they have the same row space.
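By the uniqueness in Theorem 11, the corollary gives a practical test: two matrices have the same row space exactly when they reduce to the same row-reduced echelon matrix. A hedged illustration (the matrices $A$ and $B$ are assumed examples):

```python
# Two matrices over Q are row-equivalent iff they have the same
# row-reduced echelon form (unique by Theorem 11).
from fractions import Fraction as Fr

def rref(M):
    # Standard row reduction over Q to row-reduced echelon form
    M = [row[:] for row in M]
    m, n = len(M), len(M[0])
    r = 0
    for c in range(n):
        piv = next((i for i in range(r, m) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(m):
            if i != r and M[i][c] != 0:
                f = M[i][c]
                M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return M

A = [[Fr(1), Fr(1)], [Fr(0), Fr(1)]]
B = [[Fr(1), Fr(0)], [Fr(1), Fr(2)]]

# Both row spaces are all of Q^2, so A and B are row-equivalent
assert rref(A) == rref(B)
```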