Thursday, December 15, 2011

More Linear Algebra: Vector spaces, Subspaces, Dimension

We'll now take a brief break from the theory of the political elite and look at more linear algebra, in preparation for ultimately finding the dimension of a space of solutions and a basis for it, as well as finding orthogonal bases for assorted subspaces. The fundamental starting point for all of these is the concept of the vector space, and this in turn requires familiarity with fields.

As blogged about earlier (with groups), a field is defined:

By a field (F, +, ·) we mean a commutative ring with unity which satisfies the additional axiom (A1): For every non-zero element a ∈ F there exists an element a^-1 ∈ F such that a·a^-1 = 1. The element a^-1 = 1/a is called the reciprocal or multiplicative inverse of a. Thus, the key thing about a field is that it comprises a set of elements which can be added or multiplied in such a way that addition and multiplication satisfy the ordinary rules of arithmetic, and in such a way that one can divide by non-zero elements.
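As a quick concrete aside of my own (not part of the definition above): the integers mod 5, with addition and multiplication taken mod 5, form a field, since every non-zero element has a multiplicative inverse. A few lines of Python confirm this by brute force:

# Brute-force check that every non-zero element of the integers mod 5
# has a multiplicative inverse, as the field axiom (A1) requires.
p = 5
for a in range(1, p):
    inv = [b for b in range(1, p) if (a * b) % p == 1][0]
    print(a, "x", inv, "= 1 (mod", p, ")")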

Of course, if one has a field, one can also have a sub-field. Thus, if P and L are fields and L is contained in P (i.e. L ⊂ P), then L is a sub-field of P.

Now, a vector space V over the field K, say, is a set of objects that can be added and multiplied by elements of K, in such a way that the sum of any 2 elements of V is again an element of V, the product of any element of V by an element of K is again an element of V, and the familiar rules of arithmetic hold for these operations.

One can also have some set W ⊂ V, i.e. W is a subset of V, which satisfies specific conditions: i) the sum of any 2 elements of W is also an element of W, ii) any multiple cw of an element w of W (by an element c of K) is also an element of W, and iii) the element 0 of V is also an element of W. If these hold, then we call W a subspace of V.

As an illustration, let V = R^n and let W be the set of vectors in V whose last coordinate = 0. Then W is a subspace of V, which we could identify with R^(n-1).
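A quick numerical spot-check of this in Python (a sketch only, with sample vectors of my own choosing; it illustrates the closure conditions rather than proving them):

import numpy as np

# Two vectors in R^3 whose last coordinate is 0, i.e. elements of W
w1 = np.array([1.0, 2.0, 0.0])
w2 = np.array([-3.0, 4.0, 0.0])

# Their sum and a scalar multiple also have last coordinate 0, and so does
# the zero vector -- the three subspace conditions, for these samples.
print((w1 + w2)[-1] == 0.0)     # True
print((7 * w1)[-1] == 0.0)      # True
print(np.zeros(3)[-1] == 0.0)   # True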

As another illustration, let V be an arbitrary vector space and let v1, v2, v3, ..., vn be elements of V. Also let x1, x2, x3, ..., xn be numbers. Then it is possible to form an expression of the type:

x1 v1 + x2 v2 + x3 v3 + ... + xn vn

which is called a linear combination of v1, v2, v3, ..., vn. The set of all linear combinations of v1, v2, v3, ..., vn is a subspace of V.
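For instance (a small numpy aside of my own), forming one such linear combination in R^3:

import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
v3 = np.array([3.0, 1.0, 0.0])
x1, x2, x3 = 2.0, -1.0, 0.5

# The linear combination x1 v1 + x2 v2 + x3 v3 is again an element of R^3
combo = x1 * v1 + x2 * v2 + x3 * v3
print(combo)   # [ 3.5 -0.5  5. ]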

Yet another example: let A be a vector in R^3. Let W be the set of all elements B in R^3 such that B · A = 0, i.e. such that B is perpendicular to A. Then W is a subspace of R^3.
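A brief sketch (with a vector A chosen purely for illustration): vectors perpendicular to A have zero dot product with A, and so does their sum, which is the closure property behind the claim.

import numpy as np

A = np.array([1.0, 2.0, 3.0])

# Two vectors B with B . A = 0, i.e. perpendicular to A
B1 = np.array([2.0, -1.0, 0.0])
B2 = np.array([3.0, 0.0, -1.0])

print(np.dot(B1, A), np.dot(B2, A))   # 0.0 0.0
print(np.dot(B1 + B2, A))             # 0.0 -- the sum is also perpendicular to A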

An additional important consideration is whether elements of a vector space are linearly dependent or linearly independent. We say the elements v1, v2, v3, ..., vn are linearly dependent over a field F if there exist elements a1, a2, ..., an in F, not all equal to zero, such that:

a1 v1 + a2 v2 + ... + an vn = 0

If, on the other hand, there do not exist such elements a1, a2, ..., an, we say that the elements v1, v2, v3, ..., vn are linearly independent.
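In practice, a finite set of vectors in R^n can be tested for linear independence by computing the rank of the matrix whose rows are those vectors: if the rank equals the number of vectors, they are independent. A short numpy sketch, with vectors chosen for illustration:

import numpy as np

rows = np.array([[1.0, 1.0, 0.0],
                 [0.0, 1.0, 1.0],
                 [1.0, 0.0, 1.0]])

# Rank 3 for 3 vectors => they are linearly independent
print(np.linalg.matrix_rank(rows) == len(rows))   # True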

Now, if elements v1, v2, v3, ..., vn of the vector space V generate V and also are linearly independent, then (v1, v2, v3, ..., vn) is called a basis of V. One can also say that those elements v1, v2, v3, ..., vn form a basis of V.

Example: Let W_f be the vector space of functions generated by the two functions exp(t) and exp(2t); then {exp(t), exp(2t)} is a basis of W_f.

As a further illustration, let V be a vector space and let (v1, v2, v3, ..., vn) be a basis of V. The elements of V can be represented by n-tuples relative to this basis, e.g. if an element v of V is written as a linear combination: v = x1 v1 + x2 v2 + ... + xn vn, then we call (x1, x2, ..., xn) the coordinates of v with respect to our basis.

Example: Let V be the vector space of functions generated by the two functions exp(t) and exp(2t); then what are the coordinates of f(t) = 3 exp(t) + 5 exp(2t)?

Ans. The coordinates are (3,5) with respect to the basis {exp(t), exp(2t)} .
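One way to confirm this numerically (a sketch, assuming that sampling f at two distinct points is enough to pin down the two coordinates, which holds here since exp(t) and exp(2t) are independent): evaluate f at t = 0 and t = 1 and solve the resulting 2 x 2 system.

import numpy as np

f = lambda t: 3 * np.exp(t) + 5 * np.exp(2 * t)

t_points = [0.0, 1.0]
# Each row: the basis functions exp(t), exp(2t) evaluated at one sample point
M = np.array([[np.exp(t), np.exp(2 * t)] for t in t_points])
rhs = np.array([f(t) for t in t_points])

print(np.linalg.solve(M, rhs))   # approximately [3. 5.]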

Sample Problem (1):

Show that the vectors (1, 1) and (-3, 2) are linearly independent.

Solution: Let a, b be two numbers (scalars) such that:

a(1,1) + b(-3,2) = 0

Writing out the equation in components:

a - 3b = 0 and a + 2b = 0

Then solve simultaneously:

a - 3b = 0
a + 2b = 0
----------
0 - 5b = 0

or b = 0, so a = 0

Both a and b are equal to zero so the vectors are linearly independent.
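The same conclusion can be checked with the determinant of the 2 x 2 matrix having these vectors as rows: a non-zero determinant means the vectors are linearly independent. A quick numpy check:

import numpy as np

M = np.array([[ 1.0, 1.0],
              [-3.0, 2.0]])

# det = (1)(2) - (1)(-3) = 5, non-zero => linearly independent
print(np.linalg.det(M))   # 5.0 (up to floating-point rounding)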

Sample Problem (2):

Find the coordinates of (1, 0) with respect to the two vectors (1, 1) and (-1, 2).

Solution:

We must find numbers a and b which meet the condition:

a(1, 1) + b(-1, 2) = (1, 0)

This can be rewritten:

a - b = 1 and a + 2b = 0

Solve simultaneously, by subtracting the 2nd from the 1st:

a - b = 1
a + 2b = 0
-----------
0 - 3b = 1

and 3b = -1, so b = - 1/3, then a = 1 + b = 1 - 1/3 = 2/3

Then the coordinates of (1, 0) with respect to (1, 1) and (-1,2) are: (2/3, -1/3)
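Equivalently, this is a 2 x 2 linear system whose columns are the two given vectors, and numpy's solver recovers the same coordinates:

import numpy as np

# Columns are the vectors (1,1) and (-1,2)
M = np.array([[1.0, -1.0],
              [1.0,  2.0]])
target = np.array([1.0, 0.0])

print(np.linalg.solve(M, target))   # approximately [ 0.6667 -0.3333], i.e. (2/3, -1/3)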


Sample Problem (3):

Show that the vectors (1, 1) and (-1, 2) form a basis of R^2.

Solution: This requires showing: a) the vectors are linearly independent, and b) they generate R^2.

As before (earlier problems), we write out the condition for linear independence:

a(1, 1) + b(-1, 2) = (0, 0)

-> a - b = 0 and a + 2b = 0

Solve simultaneously by subtracting the 2nd from the 1st:

a - b = 0
a + 2b = 0
----------
0 - 3b = 0 so that b = 0 and a = 0

Thus the vectors are linearly independent.

(b) To show generation of R^2, let (a, b) be an arbitrary element of R^2 and write out:

x (1, 1) + y(-1, 2) = (a, b)

which leads to the pair of simultaneous equations:

x - y = a and x + 2y = b

As before, subtracting the 2nd from the 1st eqn.

x - y = a
x + 2y = b
----------
0 - 3y = a - b, or y = (b - a)/3, and then x = a + y = (2a + b)/3

Since x and y can be found for any (a, b), the two vectors generate R^2. Therefore, (x, y) = ((2a + b)/3, (b - a)/3) are the coordinates of (a, b) with respect to the basis {(1,1), (-1,2)}.
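Numerically, the basis claim comes down to the matrix with columns (1,1) and (-1,2) being invertible (non-zero determinant), so that any (a, b) can be reached. A short sketch, with an arbitrary (a, b) of my own choosing:

import numpy as np

M = np.array([[1.0, -1.0],
              [1.0,  2.0]])
print(np.linalg.det(M))   # 3.0, non-zero => the columns form a basis of R^2

a, b = 4.0, -2.0                       # an arbitrary target vector
x, y = np.linalg.solve(M, [a, b])
print(x, y)                            # x = (2a + b)/3 = 2.0, y = (b - a)/3 = -2.0
print(x * np.array([1.0, 1.0]) + y * np.array([-1.0, 2.0]))   # recovers [ 4. -2.]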

Problems:

1) Show the following vectors are linearly independent:

a) (π, 0) and (0, 1)

b) (1, 1, 0), (1, 1, 1) and (0, 1, -1)

2) Express X as a linear combination of the given vectors A, B and find the coordinates of X with respect to A, B:

a) X = (1, 0), A = (1, 1), B = (0, 1)

b) X = (1,1), A = (2, 1), B = (-1, 0)

Solutions to these will be given along with those from the previous problem set in the next instalment of linear algebra!
