The fundamental starting point for linear algebra is the concept of the vector space, and this in turn requires familiarity with fields.
By a field (F, +, ·) we mean a commutative ring with unity which satisfies the additional axiom: (A1): For every non-zero element a ∈ F there exists an element a^(-1) ∈ F such that a · a^(-1) = 1. The element a^(-1) = 1/a is called the reciprocal or multiplicative inverse of a. Thus, the key point about a field is that it is a set of elements which can be added and multiplied in such a way that addition and multiplication satisfy the ordinary rules of arithmetic, and in which one can divide by any non-zero element.
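As a quick illustrative sketch (my own, not part of the original text), the rational numbers form a familiar field, and Python's fractions module lets us check axiom (A1) for a concrete non-zero element:

    from fractions import Fraction

    # The rationals Q form a field: every non-zero element has a multiplicative inverse.
    a = Fraction(3, 7)          # a non-zero rational
    a_inv = 1 / a               # its reciprocal, 7/3
    print(a * a_inv == 1)       # True: a * a^(-1) = 1, as axiom (A1) requires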
Of course, if one has a field, one can also have a sub-field. Thus, if P and L are fields and L is contained in P (i.e. L ⊆ P), then L is a sub-field of P.
A vector space V over a field K is a set of objects that can be added together and multiplied by elements of K, in such a way that the sum of any two elements of V is again an element of V, and the product of an element of V by an element of K is again an element of V.
One can also have a subset W ⊂ V which satisfies the following conditions: i) the sum of any two elements of W is also an element of W, ii) any scalar multiple c·w of an element w of W is also an element of W, and iii) the zero element 0 of V is also an element of W. If all three hold, we call W a subspace of V.
As an illustration, let V = R^n and let W be the set of vectors in V whose last coordinate is 0. Then W is a subspace of V, which we can identify with R^(n-1).
As another illustration, let V be an arbitrary vector space and let v_1, v_2, ..., v_n be elements of V. Also let x_1, x_2, ..., x_n be numbers. Then it is possible to form an expression of the type:

x_1 v_1 + x_2 v_2 + ... + x_n v_n

which is called a linear combination of v_1, v_2, ..., v_n. The set of all linear combinations of v_1, v_2, ..., v_n is a subspace of V.
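As a small numerical sketch of this (using numpy, my own choice of tool and vectors, not part of the original), a linear combination is just a weighted sum of vectors, and the result lands back in the same space:

    import numpy as np

    # Two vectors in R^3 and some coefficients x_1, x_2
    v1 = np.array([1.0, 0.0, 2.0])
    v2 = np.array([0.0, 1.0, -1.0])
    x1, x2 = 3.0, -2.0

    # The linear combination x_1 v_1 + x_2 v_2 is again a vector in R^3;
    # the set of all such combinations is the subspace spanned by v1 and v2.
    combo = x1 * v1 + x2 * v2
    print(combo)   # [ 3. -2.  8.]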
Yet another example: let A be a vector in R^3. Let W be the set of all elements B in R^3 such that B · A = 0, i.e. such that B is perpendicular to A. Then W is a subspace of R^3.
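To make the subspace conditions concrete, here is a small numerical check (my own sketch with numpy; the particular vectors below are arbitrary choices) that W = {B in R^3 : B · A = 0} contains 0 and is closed under addition and scalar multiplication:

    import numpy as np

    A = np.array([1.0, 2.0, 3.0])        # an arbitrary fixed vector in R^3

    # Two vectors perpendicular to A (their dot products with A are 0)
    B1 = np.array([2.0, -1.0, 0.0])      # 2 - 2 + 0 = 0
    B2 = np.array([3.0, 0.0, -1.0])      # 3 + 0 - 3 = 0

    # i) the sum stays in W, ii) scalar multiples stay in W, iii) 0 is in W
    print(np.dot(B1 + B2, A) == 0)       # True
    print(np.dot(5.0 * B1, A) == 0)      # True
    print(np.dot(np.zeros(3), A) == 0)   # True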
An additional important consideration is whether elements of a vector space are linearly dependent or linearly independent. We say the elements v_1, v_2, ..., v_n are linearly dependent over a field F if there exist elements a_1, a_2, ..., a_n in F, not all equal to zero, such that:

a_1 v_1 + a_2 v_2 + ... + a_n v_n = 0

If, on the other hand, no such numbers a_1, a_2, ..., a_n exist, we say that the elements v_1, v_2, ..., v_n are linearly independent.
Now, if elements v_1, v_2, ..., v_n of the vector space V generate V and are also linearly independent, then (v_1, v_2, ..., v_n) is called a basis of V. One can also say that the elements v_1, v_2, ..., v_n form a basis of V.
Example: Let W_f be the vector space of functions generated by the two functions exp(t) and exp(2t); then {exp(t), exp(2t)} is a basis of W_f.
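One informal way to convince yourself that exp(t) and exp(2t) are linearly independent (this numerical check is my own sketch, not part of the original argument) is to evaluate them at two points: if c1·exp(t) + c2·exp(2t) were zero for all t, the resulting 2x2 system would force c1 = c2 = 0 whenever its determinant is non-zero.

    import numpy as np

    # Evaluate exp(t) and exp(2t) at the sample points t = 0 and t = 1.
    # If c1*exp(t) + c2*exp(2t) were identically zero, then in particular
    # M @ [c1, c2] = 0, and a non-zero determinant forces c1 = c2 = 0.
    t_points = np.array([0.0, 1.0])
    M = np.column_stack((np.exp(t_points), np.exp(2 * t_points)))
    print(np.linalg.det(M))   # about 4.67, non-zero, so only c1 = c2 = 0 works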
As a further illustration, let V be a vector space and let (v_1, v_2, ..., v_n) be a basis of V. The elements of V can be represented by n-tuples relative to this basis: if an element v of V is written as a linear combination

v = x_1 v_1 + x_2 v_2 + ... + x_n v_n

then we call (x_1, x_2, ..., x_n) the coordinates of v with respect to our basis.

Let V be the vector space of functions generated by the two functions exp(t) and exp(2t); then what are the coordinates of the function f(t) = 3 exp(t) + 5 exp(2t)?
Ans. The coordinates are (3, 5) with respect to the basis {exp(t), exp(2t)}.
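Numerically, one can recover those coordinates by sampling f at two points and solving the resulting 2x2 linear system (again a sketch of my own with numpy, under the assumption that sampling at t = 0 and t = 1 gives an invertible system):

    import numpy as np

    # f(t) = 3*exp(t) + 5*exp(2t); recover (3, 5) from two samples of f.
    f = lambda t: 3 * np.exp(t) + 5 * np.exp(2 * t)

    t_points = np.array([0.0, 1.0])
    M = np.column_stack((np.exp(t_points), np.exp(2 * t_points)))  # basis values
    coords = np.linalg.solve(M, f(t_points))
    print(np.round(coords, 6))   # [3. 5.] -- the coordinates w.r.t. {exp(t), exp(2t)}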
Example Problem:
Show that the vectors (1, 1) and (-3, 2) are linearly independent.
Solution: Let a and b be scalars such that:
a(1,1) + b(-3,2) = 0
Writing out the components gives:

a - 3b = 0 and a + 2b = 0

Then solve simultaneously by subtracting the second equation from the first:

a - 3b = 0
a + 2b = 0
----------
   -5b = 0

so b = 0, and hence a = 0.
Both a and b are equal to zero so the vectors are linearly independent.
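The same conclusion can be checked mechanically (my own numpy sketch, not part of the original solution): put the vectors in the columns of a matrix and confirm the determinant is non-zero, so a = b = 0 is the only solution.

    import numpy as np

    # Columns are the vectors (1, 1) and (-3, 2); a non-zero determinant means
    # a*(1,1) + b*(-3,2) = 0 has only the trivial solution a = b = 0.
    M = np.array([[1.0, -3.0],
                  [1.0,  2.0]])
    print(np.linalg.det(M))             # 5.0, non-zero
    print(np.linalg.matrix_rank(M))     # 2: the columns are linearly independent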
Suggested Problem:
Show that the vectors (1, 1) and (-1, 2) form a basis of R^2.
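As a hint for checking your answer (a sketch of my own, not a full solution): two vectors form a basis of R^2 exactly when they are linearly independent, which for a 2x2 matrix comes down to a non-zero determinant.

    import numpy as np

    def looks_like_basis_of_R2(u, v):
        """Rough numerical check (hypothetical helper): two vectors form a basis
        of R^2 iff the matrix with u and v as columns has non-zero determinant."""
        return abs(np.linalg.det(np.column_stack((u, v)))) > 1e-12

    print(looks_like_basis_of_R2(np.array([1.0, 1.0]), np.array([-1.0, 2.0])))  # True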