How to prove that vectors are linearly independent. Linearly dependent and linearly independent vector systems

In this article we will cover:

  • what are collinear vectors;
  • what are the conditions for collinearity of vectors;
  • what properties of collinear vectors exist;
  • what is the linear dependence of collinear vectors.
Definition 1

Collinear vectors are vectors that are parallel to one line or lie on one line.


Conditions for collinearity of vectors

Two vectors are collinear if any of the following conditions are true:

  • Condition 1. Vectors a and b are collinear if there exists a number λ such that a = λb.
  • Condition 2. Vectors a and b are collinear if their coordinates are proportional:

a = (a 1 ; a 2), b = (b 1 ; b 2) ⇒ a ∥ b ⇔ a 1 / b 1 = a 2 / b 2

  • Condition 3. Vectors a and b are collinear provided that their cross product equals the zero vector:

a ∥ b ⇔ [a, b] = 0

Note 1

Condition 2 is not applicable if one of the vector coordinates is zero.

Note 2

Condition 3 applies only to vectors specified in space (three-dimensional vectors).

Examples of problems to study the collinearity of vectors

Example 1

We examine the vectors a = (1; 3) and b = (2; 1) for collinearity.

How to solve?

In this case we use the 2nd collinearity condition. For the given vectors it reads:

1 / 2 = 3 / 1

The equality is false (1/2 ≠ 3). From this we conclude that the vectors a and b are non-collinear.

Answer: a ∦ b

Example 2

For what value of m are the vectors a = (1; 2) and b = (-1; m) collinear?

How to solve?

By the second collinearity condition, the vectors are collinear if their coordinates are proportional:

1 / (-1) = 2 / m

This gives m = -2.

Answer: m = -2.
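Both examples above can be checked programmatically. Here is a minimal Python sketch (the function name `collinear` is our own); the test a₁b₂ - a₂b₁ = 0 is equivalent to the proportionality condition but, unlike it, also works when a coordinate is zero:

```python
def collinear(a, b):
    # Two plane vectors are collinear iff a1*b2 - a2*b1 == 0.
    # Unlike the ratio test (condition 2), this also works with zero coordinates.
    return a[0] * b[1] - a[1] * b[0] == 0

print(collinear((1, 3), (2, 1)))    # Example 1: False, a and b are non-collinear
print(collinear((1, 2), (-1, -2)))  # Example 2 with m = -2: True
```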

Criteria for linear dependence and linear independence of vector systems

Theorem

A system of vectors of a vector space is linearly dependent if and only if one of the vectors of the system can be expressed in terms of the remaining vectors of the system.

Proof

Necessity.

Let the system e 1 , e 2 , . . . , e n be linearly dependent. Write a linear combination of this system equal to the zero vector,

a 1 e 1 + a 2 e 2 + . . . + a n e n = 0,

in which at least one of the coefficients is not equal to zero. Let a k ≠ 0, k ∈ {1, 2, . . . , n}.

We divide both sides of the equality by the non-zero coefficient a k :

(a k⁻¹ a 1) e 1 + . . . + (a k⁻¹ a k) e k + . . . + (a k⁻¹ a n) e n = 0

Let's denote β m = a k⁻¹ a m , where m ∈ {1, 2, . . . , k - 1, k + 1, . . . , n}. In this case:

β 1 e 1 + . . . + β k - 1 e k - 1 + e k + β k + 1 e k + 1 + . . . + β n e n = 0

or e k = (- β 1) e 1 + . . . + (- β k - 1) e k - 1 + (- β k + 1) e k + 1 + . . . + (- β n) e n

It follows that one of the vectors of the system is expressed through all the other vectors of the system, which is what needed to be proved.

Sufficiency.

Let one of the vectors be linearly expressed through all the other vectors of the system:

e k = γ 1 e 1 + . . . + γ k - 1 e k - 1 + γ k + 1 e k + 1 + . . . + γ n e n

We move the vector e k to the right-hand side of this equality:

0 = γ 1 e 1 + . . . + γ k - 1 e k - 1 - e k + γ k + 1 e k + 1 + . . . + γ n e n

Since the coefficient of the vector e k is - 1 ≠ 0, we get a non-trivial representation of zero by the system of vectors e 1 , e 2 , . . . , e n , which means this system of vectors is linearly dependent. Which is what needed to be proved.

Corollaries:

  • A system of vectors is linearly independent if and only if none of its vectors can be expressed in terms of the other vectors of the system.
  • A system of vectors that contains a zero vector or two equal vectors is linearly dependent.

Properties of linearly dependent vectors

  1. For 2- and 3-dimensional vectors: two linearly dependent vectors are collinear (and two collinear vectors are linearly dependent).
  2. For 3-dimensional vectors: three linearly dependent vectors are coplanar (and three coplanar vectors are linearly dependent).
  3. For n-dimensional vectors: any n + 1 vectors are always linearly dependent.
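Property 3 is easy to see numerically: stacking n + 1 vectors of dimension n gives a matrix whose rank cannot exceed n. A small sketch using NumPy (not part of the original article):

```python
import numpy as np

# Property 3: any n + 1 vectors in n-dimensional space are linearly dependent.
# Four 3-dimensional vectors form a 4x3 matrix whose rank is at most 3 < 4.
rng = np.random.default_rng(0)
vectors = rng.integers(-5, 5, size=(4, 3))  # four arbitrary 3-dimensional vectors
rank = np.linalg.matrix_rank(vectors)
print(rank < 4)  # True: the four rows cannot all be independent
```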

Examples of solving problems involving linear dependence or linear independence of vectors

Example 3

Let's check the vectors a = (3, 4, 5), b = (-3, 0, 5), c = (4, 4, 4), d = (3, 4, 0) for linear independence.

Solution. The vectors are linearly dependent, since the dimension of the vectors (3) is less than the number of vectors (4).

Example 4

Let's check the vectors a = (1, 1, 1), b = (1, 2, 0), c = (0, -1, 1) for linear independence.

Solution. We find the values of the coefficients at which the linear combination equals the zero vector:

x 1 a + x 2 b + x 3 c = 0

We write this vector equation as a system of linear equations:

x 1 + x 2 = 0
x 1 + 2 x 2 - x 3 = 0
x 1 + x 3 = 0

We solve this system using the Gauss method:

1 1  0 | 0
1 2 -1 | 0
1 0  1 | 0

From the 2nd row we subtract the 1st, and from the 3rd row the 1st:

1  1  0 | 0
0  1 -1 | 0
0 -1  1 | 0

From the 1st row we subtract the 2nd, and to the 3rd row we add the 2nd:

1 0  1 | 0
0 1 -1 | 0
0 0  0 | 0

It follows that the system has infinitely many solutions. This means that there is a non-zero set of numbers x 1 , x 2 , x 3 for which the linear combination of a, b, c equals the zero vector. Therefore, the vectors a, b, c are linearly dependent.
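The conclusion of this example can be verified numerically. A short NumPy sketch (NumPy is our choice, not the article's): the rank of the matrix with rows a, b, c is 2 < 3, and one concrete non-trivial combination is -a + b + c = 0.

```python
import numpy as np

a = np.array([1, 1, 1])
b = np.array([1, 2, 0])
c = np.array([0, -1, 1])

# Rank 2 < 3 confirms that the three vectors are linearly dependent.
M = np.vstack([a, b, c])
print(np.linalg.matrix_rank(M))  # 2

# One concrete non-trivial combination that yields the zero vector:
print(-a + b + c)  # [0 0 0], i.e. -a + b + c = 0
```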


Linear dependence and independence of vectors

Definitions of linearly dependent and independent vector systems

Definition 22

Let there be a system of n vectors e 1 , e 2 , . . . , e n and a set of numbers λ 1 , λ 2 , . . . , λ n . Then the vector

λ 1 e 1 + λ 2 e 2 + . . . + λ n e n   (11)

is called a linear combination of the given system of vectors with the given set of coefficients.

Definition 23

A system of vectors e 1 , e 2 , . . . , e n is called linearly dependent if there is a set of coefficients λ 1 , λ 2 , . . . , λ n , at least one of which is not equal to zero, such that the linear combination of the given system of vectors with this set of coefficients equals the zero vector:

λ 1 e 1 + λ 2 e 2 + . . . + λ n e n = 0   (12)

Definition 24 ( through the representation of one vector of the system as a linear combination of the others)

A system of vectors e 1 , e 2 , . . . , e n is called linearly dependent if at least one of the vectors of this system can be represented as a linear combination of the remaining vectors of the system.

Statement 3

Definitions 23 and 24 are equivalent.

Definition 25(via zero linear combination)

A system of vectors e 1 , e 2 , . . . , e n is called linearly independent if a zero linear combination of this system is possible only when all the coefficients λ 1 , λ 2 , . . . , λ n are equal to zero.

Definition 26(due to the impossibility of representing one vector of the system as a linear combination of the others)

A system of vectors e 1 , e 2 , . . . , e n is called linearly independent if none of the vectors of this system can be represented as a linear combination of the other vectors of the system.

Properties of linearly dependent and independent vector systems

Theorem 2 (zero vector in the system of vectors)

If a system of vectors has a zero vector, then the system is linearly dependent.

Proof. Let e s = 0 be the zero vector of the system. Take the coefficients λ s = 1 and λ m = 0 for all m ≠ s. Then

λ 1 e 1 + . . . + λ n e n = 1 · 0 = 0,

a zero linear combination with a non-zero coefficient; therefore, by the definition of a linearly dependent system of vectors through a zero linear combination (12), the system is linearly dependent.

Theorem 3 (dependent subsystem in a vector system)

If a system of vectors has a linearly dependent subsystem, then the entire system is linearly dependent.

Proof. Let e 1 , . . . , e k be a linearly dependent subsystem of the system e 1 , . . . , e n . Then there is a set of coefficients λ 1 , . . . , λ k , among which at least one is not equal to zero, such that λ 1 e 1 + . . . + λ k e k = 0. Setting λ k+1 = . . . = λ n = 0, we obtain a zero linear combination of the whole system among whose coefficients at least one is not equal to zero. This means, by Definition 23, the system is linearly dependent.

Theorem 4

Any subsystem of a linearly independent system is linearly independent.

By contradiction. Let the system be linearly independent and have a linearly dependent subsystem. Then, by Theorem 3, the entire system would also be linearly dependent, which is a contradiction. Consequently, a subsystem of a linearly independent system cannot be linearly dependent.

Geometric meaning of linear dependence and independence of a system of vectors

Theorem 5

Two vectors a and b are linearly dependent if and only if a ∥ b.

Necessity.

Let a and b be linearly dependent. Then there are coefficients α and β, not both zero, such that αa + βb = 0. Say α ≠ 0; then a = (-β/α)b, i.e. a = λb, which means a ∥ b.

Sufficiency.

Let a ∥ b. Then a = λb for some number λ, and a - λb = 0 is a non-trivial zero linear combination, so the vectors a and b are linearly dependent.

Corollary 5.1

The zero vector is collinear to any vector

Corollary 5.2

For two vectors a and b to be linearly independent, it is necessary and sufficient that they be non-collinear.

Theorem 6

In order for a system of three vectors to be linearly dependent, it is necessary and sufficient that these vectors be coplanar .

Necessity.

Let a, b, c be linearly dependent; then one of the vectors can be represented as a linear combination of the other two, say

c = αa + βb,   (13)

where α and β are numbers. By the parallelogram rule, c is the diagonal of the parallelogram with sides αa and βb. A parallelogram is a flat figure, so αa, βb and c are coplanar, and therefore a, b, c are also coplanar.

Sufficiency.

Let a, b, c be coplanar, and apply the three vectors to a common point O. If a and b are collinear, they are linearly dependent and so is the whole system by Theorem 3. Otherwise, decomposing c in the plane along the directions of a and b gives c = αa + βb, so the vectors a, b, c are linearly dependent.

Corollary 6.1

The zero vector is coplanar to any pair of vectors.

Corollary 6.2

For three vectors to be linearly independent, it is necessary and sufficient that they be non-coplanar.

Corollary 6.3

Any vector of a plane can be represented as a linear combination of any two non-collinear vectors of the same plane.

Theorem 7

Any four vectors in space are linearly dependent.

Proof. Consider four vectors a, b, c, d in space. If any three of them are coplanar, that subsystem is linearly dependent by Theorem 6, and then so is the whole system by Theorem 3. So let a, b, c be non-coplanar, apply all four vectors to a common point O, and let D be the end point of d. Draw the planes through the pairs of vectors (a, b), (b, c), (a, c), and then the planes through the point D parallel to these pairs. Along the intersection lines of these planes we build a parallelepiped OB 1 D 1 C 1 ABDC on the vectors a, b, c, with diagonal OD.

OB 1 D 1 C 1 is a parallelogram by construction, so by the parallelogram rule its diagonal is the sum of its sides; OADD 1 is also a parallelogram (a property of the parallelepiped). Combining these decompositions, there are numbers α, β, γ such that

d = αa + βb + γc,

and by Definition 24 the system of vectors a, b, c, d is linearly dependent.

Corollary 7.1

The sum of three non-coplanar vectors in space is a vector that coincides with the diagonal of a parallelepiped built on these three vectors applied to a common origin, and the origin of the sum vector coincides with the common origin of these three vectors.

Corollary 7.2

If we take 3 non-coplanar vectors in space, then any vector of this space can be decomposed into a linear combination of these three vectors.
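Corollary 7.2 can be demonstrated numerically: decomposing a vector over three non-coplanar vectors amounts to solving a 3×3 linear system. A NumPy sketch (the concrete vectors are hypothetical example values):

```python
import numpy as np

# Three non-coplanar vectors (hypothetical example values)
a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 1.0, 0.0])
c = np.array([1.0, 1.0, 1.0])
d = np.array([2.0, 3.0, 4.0])  # an arbitrary vector to decompose

B = np.column_stack([a, b, c])
assert np.linalg.det(B) != 0  # non-zero determinant <=> non-coplanar

coeffs = np.linalg.solve(B, d)  # coefficients x, y, z in d = x*a + y*b + z*c
print(coeffs)                   # [-1. -1.  4.]
print(np.allclose(coeffs[0] * a + coeffs[1] * b + coeffs[2] * c, d))  # True
```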


The concepts of linear dependence and independence of a system of vectors are very important when studying vector algebra, since the concepts of dimension and basis of space are based on them. In this article we will give definitions, consider the properties of linear dependence and independence, obtain an algorithm for studying a system of vectors for linear dependence, and analyze in detail the solutions of examples.


Definition of linear dependence and linear independence of a system of vectors.

Let's consider a set of p n-dimensional vectors a 1 , a 2 , . . . , a p . Let's make a linear combination of these vectors with arbitrary numbers λ 1 , λ 2 , . . . , λ p (real or complex): λ 1 a 1 + λ 2 a 2 + . . . + λ p a p . Based on the definition of operations on n-dimensional vectors, as well as the properties of the operations of adding vectors and multiplying a vector by a number, the written linear combination represents some n-dimensional vector.

This is how we approached the definition of the linear dependence of a system of vectors.

Definition.

If a linear combination λ 1 a 1 + . . . + λ p a p can equal the zero vector while at least one of the numbers λ 1 , . . . , λ p is non-zero, then the system of vectors is called linearly dependent.

Definition.

If a linear combination λ 1 a 1 + . . . + λ p a p equals the zero vector only when all the numbers λ 1 , . . . , λ p are equal to zero, then the system of vectors is called linearly independent.

Properties of linear dependence and independence.

Based on these definitions, we formulate and prove properties of linear dependence and linear independence of a system of vectors.

    If several vectors are added to a linearly dependent system of vectors, the resulting system will be linearly dependent.

    Proof.

    Since the system of vectors a 1 , . . . , a p is linearly dependent, the equality λ 1 a 1 + . . . + λ p a p = 0 is possible with at least one non-zero number among λ 1 , . . . , λ p . Let λ 1 ≠ 0.

    Let's add s more vectors a p+1 , . . . , a p+s to the original system, obtaining the system a 1 , . . . , a p , a p+1 , . . . , a p+s . The linear combination λ 1 a 1 + . . . + λ p a p + 0 · a p+1 + . . . + 0 · a p+s of this system

    represents the zero vector, while λ 1 ≠ 0. Consequently, the resulting system of vectors is linearly dependent.

    If several vectors are excluded from a linearly independent system of vectors, then the resulting system will be linearly independent.

    Proof.

    Let us assume that the resulting system is linearly dependent. By adding all the discarded vectors to this system of vectors, we obtain the original system of vectors. By condition, it is linearly independent, but due to the previous property of linear dependence, it must be linearly dependent. We have arrived at a contradiction, therefore our assumption is incorrect.

    If a system of vectors has at least one zero vector, then such a system is linearly dependent.

    Proof.

    Let the vector a k in this system of vectors be zero. Let us assume that the original system of vectors is linearly independent. Then the vector equality λ 1 a 1 + . . . + λ p a p = 0 would be possible only when all the coefficients are zero. However, if we take any λ k different from zero and all other coefficients zero, the equality still holds, since λ k · a k = λ k · 0 = 0. Consequently, our assumption is incorrect, and the original system of vectors is linearly dependent.

    If a system of vectors is linearly dependent, then at least one of its vectors is linearly expressed in terms of the others. If a system of vectors is linearly independent, then none of the vectors can be expressed in terms of the others.

    Proof.

    First, let's prove the first statement.

    Let the system of vectors a 1 , . . . , a p be linearly dependent; then there is at least one non-zero number, say λ k ≠ 0, for which the equality λ 1 a 1 + . . . + λ p a p = 0 is true. This equality can be resolved with respect to a k :

    a k = (- λ 1 / λ k) a 1 + . . . + (- λ k-1 / λ k) a k-1 + (- λ k+1 / λ k) a k+1 + . . . + (- λ p / λ k) a p

    Consequently, the vector a k is linearly expressed through the remaining vectors of the system, which is what needed to be proved.

    Now let's prove the second statement.

    Since the system of vectors is linearly independent, the equality λ 1 a 1 + . . . + λ p a p = 0 is possible only when all the coefficients are zero.

    Let us assume that some vector of the system is expressed linearly in terms of the others. Let this vector be a k , so that a k = λ 1 a 1 + . . . + λ k-1 a k-1 + λ k+1 a k+1 + . . . + λ p a p . This equality can be rewritten as λ 1 a 1 + . . . - a k + . . . + λ p a p = 0; on its left side there is a linear combination of the system's vectors, and the coefficient in front of the vector a k is - 1, different from zero, which indicates a linear dependence of the original system of vectors. So we arrive at a contradiction, and the property is proved.

An important statement follows from the last two properties: if a system of vectors contains vectors a and λa, where λ is an arbitrary number, then it is linearly dependent.
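This statement is easy to confirm numerically: a system containing both a and λa has rank below the number of its vectors. A NumPy sketch with hypothetical example values:

```python
import numpy as np

# A system containing both a and m*a admits the non-trivial zero combination
# m*a + (-1)*(m*a) = 0, so its rank is below the number of vectors.
a = np.array([2.0, -1.0, 3.0])
m = -4.0
system = np.vstack([a, m * a, np.array([1.0, 0.0, 0.0])])
print(np.linalg.matrix_rank(system) < 3)  # True: the three rows are dependent
```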

Study of a system of vectors for linear dependence.

Let's pose a problem: we need to establish a linear dependence or linear independence of a system of vectors.

The logical question is: “how to solve it?”

Something useful from a practical point of view can be learned from the definitions and properties of linear dependence and independence of a system of vectors discussed above. These definitions and properties allow us to establish a linear dependence of a system of vectors in the following cases:

What to do in other cases, which are the majority?

Let's figure this out.

Let us recall the formulation of the theorem on the rank of a matrix, which we presented in the article.

Theorem.

Let r be the rank of a matrix A of size p by n, r ≤ min(p, n), and let M be a basis minor of the matrix A. All rows (all columns) of the matrix A that do not participate in the formation of the basis minor M are linearly expressed through the rows (columns) of the matrix that generate the basis minor M.

Now let us explain the connection between the theorem on the rank of a matrix and the study of a system of vectors for linear dependence.

Let's compose a matrix A whose rows are the vectors a 1 , a 2 , . . . , a p of the system under study.

What would linear independence of a system of vectors mean?

From the fourth property of linear independence of a system of vectors, we know that none of the vectors of the system can be expressed in terms of the others. In other words, no row of matrix A will be linearly expressed in terms of other rows, therefore, linear independence of the system of vectors will be equivalent to the condition Rank(A)=p.

What will the linear dependence of the system of vectors mean?

Everything is very simple: at least one row of the matrix A will be linearly expressed in terms of the others; therefore, linear dependence of the system of vectors is equivalent to the condition Rank(A) < p.

So, the problem of studying a system of vectors for linear dependence is reduced to the problem of finding the rank of a matrix composed of vectors of this system.

It should be noted that for p>n the system of vectors will be linearly dependent.

Comment: when compiling matrix A, the vectors of the system can be taken not as rows, but as columns.

Algorithm for studying a system of vectors for linear dependence.

Let's look at the algorithm using examples.
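The rank-based algorithm above can be sketched in a few lines of Python using NumPy (our choice of tool, not the article's); the function name `is_linearly_dependent` is our own:

```python
import numpy as np

def is_linearly_dependent(vectors):
    # A system of p n-dimensional vectors is linearly dependent
    # iff the rank of the p x n matrix of its rows is less than p.
    A = np.array(vectors)
    return np.linalg.matrix_rank(A) < A.shape[0]

# p > n: dependent by the shape alone
print(is_linearly_dependent([[3, 4, 5], [-3, 0, 5], [4, 4, 4], [3, 4, 0]]))  # True
# A square system with non-zero determinant: independent
print(is_linearly_dependent([[1, 1, 1], [1, 2, 0], [0, -1, 2]]))             # False
```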

Examples of studying a system of vectors for linear dependence.

Example.

A system of vectors is given. Examine it for linear dependence.

Solution.

Since the vector c is zero, the original system of vectors is linearly dependent due to the third property.

Answer:

The vector system is linearly dependent.

Example.

Examine a system of vectors for linear dependence.

Solution.

It is not difficult to notice that the coordinates of the vector c are equal to the corresponding coordinates of another vector of the system multiplied by 3. Therefore, the original system of vectors is linearly dependent.

The system of vectors a 1 , a 2 , . . . , a n is called linearly dependent if there are numbers λ 1 , λ 2 , . . . , λ n , among which at least one is different from zero, such that the equality λ 1 a 1 + λ 2 a 2 + . . . + λ n a n = 0 holds.

If this equality is satisfied only in the case when all λ i = 0, then the system of vectors is called linearly independent.

Theorem. A system of vectors is linearly dependent if and only if at least one of its vectors is a linear combination of the others.

Example 1. A given polynomial is a linear combination of other polynomials; a system of polynomials is linearly independent when only the trivial combination of them yields the zero polynomial.

Example 2. A given system of matrices is linearly independent, since a linear combination of them equals the zero matrix only in the case when all the coefficients are zero.

Example 3. Determine whether a given system of vectors is linearly dependent.

Solution.

Let's make a linear combination of these vectors and set it equal to the zero vector: λ 1 a 1 + λ 2 a 2 + λ 3 a 3 = 0.

Equating the corresponding coordinates of the equal vectors, we obtain a homogeneous system of linear equations in λ 1 , λ 2 , λ 3 .

The system has a unique, trivial solution, so the linear combination of these vectors equals the zero vector only in the case when all the coefficients are equal to zero. Therefore, this system of vectors is linearly independent.

Example 4. The vectors e 1 , . . . , e n are linearly independent. What will the following systems of vectors be?

a). …;

b). …?

Solution.

a). Let's make a linear combination of the vectors of the system and equate it to zero.

Using the properties of operations with vectors in a linear space, we rewrite the last equality as a linear combination of the original vectors e 1 , . . . , e n . Since these vectors are linearly independent, the coefficients of the combination must all equal zero, which gives a homogeneous system of equations.

The resulting system of equations has a unique, trivial solution. Since equality (*) holds only for zero coefficients, the system of vectors in a) is linearly independent.

b). Let's make the analogous equality (**).

Applying similar reasoning and solving the system of equations by the Gauss method, we find that the system has an infinite number of solutions. Thus, there is a non-zero set of coefficients for which equality (**) holds. Therefore, the system of vectors in b) is linearly dependent.

Example 5. A system of vectors e 1 , . . . , e n is linearly independent, and the system e 1 , . . . , e n , f is linearly dependent. Show that f is linearly expressed through e 1 , . . . , e n .

Since the extended system is linearly dependent, there is a non-trivial zero linear combination

λ 1 e 1 + . . . + λ n e n + μf = 0.   (***)

In equality (***) we must have μ ≠ 0. Indeed, at μ = 0 the system e 1 , . . . , e n would be linearly dependent.

From relation (***) we get f = (- λ 1 / μ) e 1 + . . . + (- λ n / μ) e n .
Tasks for independent solution (in class)

1. A system containing a zero vector is linearly dependent.

2. A system consisting of one vector a is linearly dependent if and only if a = 0.

3. A system consisting of two vectors is linearly dependent if and only if the vectors are proportional (that is, one of them is obtained from the other by multiplying by a number).

4. If you add a vector to a linearly dependent system, you get a linearly dependent system.

5. If a vector is removed from a linearly independent system, then the resulting system of vectors is linearly independent.

6. If a system S is linearly independent but becomes linearly dependent when a vector b is added, then the vector b is linearly expressed through the vectors of the system S.

c). A system of matrices in the space of second-order matrices.

10. Let the system of vectors a, b, c of a vector space be linearly independent. Prove the linear independence of the following systems of vectors:

a). a + b, b, c;

b). a + λb, b, c, where λ is an arbitrary number;

c). a + b, a + c, b + c.

11. Let a, b, c be three vectors on the plane from which a triangle can be formed. Will these vectors be linearly dependent?

12. Two vectors are given: a1 = (1, 2, 3, 4), a2 = (0, 0, 0, 1). Find two more four-dimensional vectors a3 and a4 such that the system a1, a2, a3, a4 is linearly independent.
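For task 12, one possible answer (among many) can be checked with a rank computation. A NumPy sketch; the candidate vectors a3 and a4 are our own suggestion, not the article's:

```python
import numpy as np

a1 = [1, 2, 3, 4]
a2 = [0, 0, 0, 1]
# One possible completion (many others work): standard basis vectors
# that are clearly not in the span of a1 and a2.
a3 = [0, 1, 0, 0]
a4 = [0, 0, 1, 0]

A = np.array([a1, a2, a3, a4])
print(np.linalg.matrix_rank(A) == 4)  # True: a1, a2, a3, a4 are independent
```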

Definition. A linear combination of vectors a 1 , ..., a n with coefficients x 1 , ..., x n is the vector

x 1 a 1 + ... + x n a n .

Definition. A linear combination x 1 a 1 + ... + x n a n is called trivial if all coefficients x 1 , ..., x n are equal to zero.

Definition. A linear combination x 1 a 1 + ... + x n a n is called non-trivial if at least one of the coefficients x 1 , ..., x n is not equal to zero.

Definition. The vectors a 1 , ..., a n are called linearly independent if there is no non-trivial combination of these vectors equal to the zero vector.

That is, the vectors a 1 , ..., a n are linearly independent if x 1 a 1 + ... + x n a n = 0 if and only if x 1 = 0, ..., x n = 0.

Definition. The vectors a 1, ..., a n are called linearly dependent, if there is a non-trivial combination of these vectors equal to the zero vector.

Properties of linearly dependent vectors:

  • For 2- and 3-dimensional vectors: two linearly dependent vectors are collinear (collinear vectors are linearly dependent).

  • For 3-dimensional vectors: three linearly dependent vectors are coplanar (three coplanar vectors are linearly dependent).

  • For n-dimensional vectors: n + 1 vectors are always linearly dependent.

Examples of problems on linear dependence and linear independence of vectors:

Example 1. Check whether the vectors a = (3; 4; 5), b = (-3; 0; 5), c = (4; 4; 4), d = (3; 4; 0) are linearly independent.

Solution:

The vectors are linearly dependent, since the dimension of the vectors (3) is less than the number of vectors (4).

Example 2. Check whether the vectors a = (1; 1; 1), b = (1; 2; 0), c = (0; -1; 1) are linearly independent.

Solution: Let's find the values of the coefficients at which the linear combination of these vectors equals the zero vector:

x 1 a + x 2 b + x 3 c = 0

This vector equation can be written as a system of linear equations:

x 1 + x 2 = 0
x 1 + 2 x 2 - x 3 = 0
x 1 + x 3 = 0

Let's solve this system using the Gauss method:

1 1  0 | 0
1 2 -1 | 0
1 0  1 | 0

Subtract the first row from the second; subtract the first row from the third:

1  1  0 | 0
0  1 -1 | 0
0 -1  1 | 0

Subtract the second row from the first; add the second row to the third:

1 0  1 | 0
0 1 -1 | 0
0 0  0 | 0

This solution shows that the system has many solutions, that is, there is a non-zero set of numbers x 1 , x 2 , x 3 such that the linear combination of the vectors a, b, c equals the zero vector, for example:

- a + b + c = 0,

which means the vectors a, b, c are linearly dependent.

Answer: vectors a, b, c are linearly dependent.

Example 3. Check whether the vectors a = (1; 1; 1), b = (1; 2; 0), c = (0; -1; 2) are linearly independent.

Solution: Let's find the values of the coefficients at which the linear combination of these vectors equals the zero vector:

x 1 a + x 2 b + x 3 c = 0

This vector equation can be written as a system of linear equations:

x 1 + x 2 = 0
x 1 + 2 x 2 - x 3 = 0
x 1 + 2 x 3 = 0

Let's solve this system using the Gauss method:

1 1  0 | 0
1 2 -1 | 0
1 0  2 | 0

Subtract the first row from the second; subtract the first row from the third:

1  1  0 | 0
0  1 -1 | 0
0 -1  2 | 0

Subtract the second row from the first; add the second row to the third:

1 0  1 | 0
0 1 -1 | 0
0 0  1 | 0

The system has only the trivial solution x 1 = x 2 = x 3 = 0, so the linear combination of a, b, c equals the zero vector only with zero coefficients. Therefore, the vectors a, b, c are linearly independent.