The fundamental starting point for linear algebra is the concept of the **vector space**, and this in turn requires familiarity with **fields**.

By a field (**F**, +, ·) we mean a commutative ring with unity which satisfies the additional axiom (A1): for every non-zero element a ∈ F there exists an element a^{-1} ∈ F such that a · a^{-1} = 1. The element a^{-1} = 1/a is called the *reciprocal* or *multiplicative inverse* of a. Thus, the key thing about a field is that it comprises a set of elements which can be added and multiplied in such a way that addition and multiplication satisfy the ordinary rules of arithmetic, and in such a way that one can divide by non-zero elements.
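
As a concrete illustration (a Python sketch added here, not part of the original definition): the integers mod a prime p form a field, and axiom (A1) can be checked directly by finding each non-zero element's inverse by brute force. The helper name `inverse` is an illustrative choice.

```python
# Sketch: Z_p (integers mod a prime p) is a field; every non-zero
# element a has a multiplicative inverse a^(-1) with a * a^(-1) = 1.
def inverse(a, p):
    """Find a^(-1) in Z_p by brute-force search (fine for small p)."""
    for b in range(1, p):
        if (a * b) % p == 1:
            return b
    raise ValueError(f"{a} has no inverse mod {p}")

p = 5  # modulus; must be prime for Z_p to be a field
for a in range(1, p):
    inv = inverse(a, p)
    assert (a * inv) % p == 1          # axiom (A1): a * a^(-1) = 1
    print(f"{a}^(-1) = {inv} (mod {p})")
```

For p = 5 this prints the inverses 1, 3, 2, 4 of the elements 1, 2, 3, 4; for a non-prime modulus the search would fail for some elements, which is exactly why Z_n is not a field when n is composite.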

Of course, if one has a field, one can also have a *sub-field*. Thus, if P and L are fields and L is contained in P (i.e. L ⊆ P), then L is a sub-field of P.
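
For instance, the rationals **Q** form a sub-field of the reals **R**. A minimal Python sketch (not from the original post), using the standard `fractions` module to model exact arithmetic in **Q**: sums, products, and reciprocals of rationals are again rationals, so **Q** is closed under the field operations.

```python
# Sketch: Q as a sub-field of R, modeled with exact rational arithmetic.
from fractions import Fraction

a = Fraction(3, 4)
b = Fraction(-2, 5)

assert isinstance(a + b, Fraction)   # closed under addition
assert isinstance(a * b, Fraction)   # closed under multiplication
assert a * (1 / a) == 1              # non-zero elements have reciprocals in Q
print(a + b, a * b, 1 / a)           # 7/20 -3/10 4/3
```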

A vector space over the field K, say, is a set **V** of objects that can be added and multiplied by elements of K in such a way that the sum of any 2 elements of **V** is again an element of **V**, and the product of any element of **V** by an element of K is again an element of **V**.
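
A minimal sketch of these two operations for V = **R**^{n}, with vectors modeled as Python tuples (the helper names `vec_add` and `scalar_mul` are illustrative assumptions, not standard library functions):

```python
# Sketch: the two vector-space operations on R^n — both return
# elements of R^n again, illustrating closure.
def vec_add(u, v):
    """Componentwise sum of two vectors in R^n."""
    return tuple(x + y for x, y in zip(u, v))

def scalar_mul(c, v):
    """Multiply a vector in R^n by a scalar c from the field K = R."""
    return tuple(c * x for x in v)

u, v = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)
print(vec_add(u, v))        # (5.0, 7.0, 9.0)
print(scalar_mul(2.0, u))   # (2.0, 4.0, 6.0)
```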


One can also have some set **W** ⊂ **V**, i.e. **W** is a subset of **V**, which satisfies specific conditions: i) the sum of any 2 elements of **W** is also an element of **W**, ii) the multiple m·w_{1} of an element w_{1} of **W** is also an element of **W**, iii) the element 0 of **V** is also an element of **W**. Then we call **W** a *subspace* of **V**.

As an illustration, let V =
**R**^{n} and let **W** be the set of vectors in **V** whose last coordinate = 0. Then **W** is a subspace of **V**, which we could identify with **R**^{n-1}.

As another illustration, let
**V** be an arbitrary vector space and let **v**_{1}, **v**_{2}, **v**_{3}, ... , **v**_{n} be elements of **V**. Also let x_{1}, x_{2}, x_{3}, ... , x_{n} be numbers. Then it is possible to form an expression of the type:

x_{1}**v**_{1} + x_{2}**v**_{2} + x_{3}**v**_{3} + ... + x_{n}**v**_{n}

which is called a *linear combination* of **v**_{1}, **v**_{2}, ... , **v**_{n}. The set of all linear combinations of **v**_{1}, **v**_{2}, ... , **v**_{n} is a subspace of **V**.

Yet another example: let
**A** be a vector in **R**^{3}. Let **W** be the set of all elements **B** in **R**^{3} such that **B** · **A** = 0, i.e. such that **B** is perpendicular to **A**. Then **W** is a subspace of **R**^{3}.

An additional important consideration is whether elements of a vector space are
**linearly dependent** or **linearly independent**. We say the elements **v**_{1}, **v**_{2}, ... , **v**_{n} are linearly *dependent* over a field **F** if there exist elements a_{1}, a_{2}, ... , a_{n} in **F**, not all equal to zero, such that:

a_{1}**v**_{1} + a_{2}**v**_{2} + ... + a_{n}**v**_{n} = 0

If, on the other hand, there *do not exist* such numbers a_{1}, a_{2}, etc., we say that the elements **v**_{1}, **v**_{2}, ... , **v**_{n} are **linearly independent**.

Now, if elements
**v**_{1}, **v**_{2}, ... , **v**_{n} of the vector space **V** generate **V** and also are linearly independent, then (**v**_{1}, **v**_{2}, ... , **v**_{n}) is called a **basis** of **V**. One can also say that those elements **v**_{1}, **v**_{2}, ... , **v**_{n} form a basis of **V**.

*Example*: Let **W**_{f} be a vector space of functions generated by the two functions exp(t) and exp(2t); then {exp(t), exp(2t)} is a basis of **W**_{f}.

As a further illustration, let
**V** be a vector space and let (**v**_{1}, **v**_{2}, ... , **v**_{n}) be a basis of **V**. The elements of **V** can be represented by n-tuples relative to this basis, e.g. if an element **v** of **V** is written as a linear combination:

**v** = x_{1}**v**_{1} + x_{2}**v**_{2} + x_{3}**v**_{3} + ... + x_{n}**v**_{n}

then we call (x_{1}, x_{2}, ... , x_{n}) the coordinates of **v** with respect to our basis.

Let
**V** be the vector space of functions generated by the two functions exp(t) and exp(2t); then what are the coordinates of f(t) = 3 exp(t) + 5 exp(2t)?

*Ans*. The coordinates are (3, 5) with respect to the basis {exp(t), exp(2t)}.

**Example Problem:** Show that the vectors (1, 1) and (-3, 2) are linearly independent.

*Solution*: Let a, b be two numbers such that:

a(1,1) + b(-3,2) = 0

Writing out the component equations:

a - 3b = 0 and a + 2b = 0

Then solve simultaneously by subtracting the second equation from the first:

(a - 3b) - (a + 2b) = 0, i.e. -5b = 0

so b = 0, and then a = 0.

Both a and b are equal to zero, so the vectors are linearly independent.
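
The hand computation can be double-checked numerically: two vectors in **R**^{2} are linearly independent exactly when the determinant of the 2×2 matrix having them as columns is non-zero. A plain-Python sketch (the helper name `det2` is an illustrative choice):

```python
# Sketch: independence test for two vectors in R^2 via the determinant.
def det2(u, v):
    """Determinant of the 2x2 matrix with columns u and v."""
    return u[0] * v[1] - u[1] * v[0]

u, v = (1, 1), (-3, 2)
d = det2(u, v)
print(d)                                        # 1*2 - 1*(-3) = 5
print("independent" if d != 0 else "dependent") # independent
```

For comparison, a dependent pair such as (1, 2) and (2, 4) gives determinant 0.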

__Suggested Problem:__ Show that the vectors (1, 1) and (-1, 2) form a basis of **R**^{2}.
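
As a numerical check on this problem (a sketch, not a substitute for the proof): in **R**^{2}, two vectors form a basis exactly when the 2×2 determinant is non-zero, and the coordinates of any vector in that basis then follow from Cramer's rule. The test vector (4, 5) below is an arbitrary illustrative choice.

```python
# Sketch: basis test and coordinates in R^2 via Cramer's rule.
def det2(u, v):
    """Determinant of the 2x2 matrix with columns u and v."""
    return u[0] * v[1] - u[1] * v[0]

def coordinates(w, u, v):
    """Coordinates (a, b) with w = a*u + b*v, via Cramer's rule."""
    d = det2(u, v)
    if d == 0:
        raise ValueError("u and v do not form a basis of R^2")
    return det2(w, v) / d, det2(u, w) / d

u, v = (1, 1), (-1, 2)
print(det2(u, v))   # 3, non-zero, so {u, v} is a basis of R^2

a, b = coordinates((4, 5), u, v)
print(a, b)         # a*u + b*v reconstructs (4, 5)
```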