

Linear dependence and linear independence of vectors.
Basis of vectors. Affine coordinate system

There is a cart of chocolates in the auditorium, and every visitor today gets a sweet pair: analytical geometry together with linear algebra. This article covers two branches of higher mathematics at once, and we will see how they get along in one wrapper. Take a break, have a Twix! ...all right, that was nonsense, but a positive attitude towards studying never hurts.

Linear dependence of vectors, linear independence of vectors, basis of vectors and the other terms have not only a geometric interpretation but, above all, an algebraic meaning. From the point of view of linear algebra, a "vector" is not always the "ordinary" vector that we can depict on a plane or in space. You don't have to look far for proof: try drawing a vector of five-dimensional space. Or the weather vector I just looked up on Gismeteo: temperature and atmospheric pressure, respectively. The example is, of course, incorrect with respect to the properties of a vector space, but, nevertheless, nobody forbids formalizing these parameters as a vector. The breath of autumn...

No, I am not going to bore you with the theory of linear vector spaces; the task is to understand the definitions and theorems. The new terms (linear dependence, independence, linear combination, basis, etc.) apply to all vectors in the algebraic sense, but the examples will be geometric. Everything is thus simple, accessible and clear. Besides problems of analytical geometry, we will also look at some typical algebra problems. To master the material, it is advisable to be familiar with the lessons Vectors for dummies and How to calculate the determinant?

Linear dependence and independence of plane vectors.
Plane basis and affine coordinate system

Let's consider the plane of your computer desk (just a table, bedside table, floor, ceiling, whatever you like). The task will consist of the following actions:

1) Select plane basis. Roughly speaking, a tabletop has a length and a width, so it is intuitive that two vectors will be required to construct the basis. One vector is clearly not enough, three vectors are too much.

2) Based on the selected basis, set a coordinate system (a coordinate grid) in order to assign coordinates to all objects on the table.

Don't be surprised, at first the explanations will be on the fingers. Moreover, on yours. Please place your left index finger on the edge of the tabletop so that it looks at the monitor. This will be the vector a. Now place your right little finger on the edge of the table in the same way, so that it points at the monitor screen. This will be the vector b. Smile, you look great! What can we say about these vectors? They are collinear, which means each is linearly expressed through the other:
a = λb, or, vice versa, b = μa, where λ (respectively μ) is some number different from zero.

You can see a picture of this action in the lesson Vectors for dummies, where I explained the rule for multiplying a vector by a number.

Will your fingers set a basis on the plane of the computer desk? Obviously not. Collinear vectors travel back and forth along a single direction, while a plane has both length and width.

Such vectors are called linearly dependent.

Reference: the words "linear", "linearly" denote the fact that the mathematical equations and expressions contain no squares, cubes, other powers, logarithms, sines, etc. There are only linear (1st degree) expressions and dependencies.

Two plane vectors are linearly dependent if and only if they are collinear.

Cross your fingers on the table so that the angle between them is anything other than 0 or 180 degrees. Two plane vectors are linearly independent if and only if they are not collinear. So, a basis is obtained. There is no need to be embarrassed that the basis turned out "skewed", with non-perpendicular vectors of different lengths. Very soon we will see that not only an angle of 90 degrees is suitable for its construction, and not only unit vectors of equal length.

Any plane vector c is expanded over the basis (a, b) in a unique way:
c = αa + βb, where α and β are real numbers. The numbers α, β are called the coordinates of the vector in this basis.

It is also said that the vector c is represented as a linear combination of the basis vectors. That is, the expression c = αa + βb is called the decomposition of the vector c in the basis, or a linear combination of the basis vectors.

For example, we can say that the vector is decomposed along an orthonormal basis of the plane, or we can say that it is represented as a linear combination of vectors.
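Such a decomposition is easy to compute. Here is a minimal Python sketch (with made-up coordinates; the names `decompose`, `e1`, `e2` are my own) that finds the coordinates of a vector in a possibly non-orthogonal basis by solving the 2x2 system with Cramer's rule:

```python
# Decompose a plane vector over a (possibly skewed) basis: solve
# c = alpha*e1 + beta*e2 for (alpha, beta) via Cramer's rule.
from fractions import Fraction

def decompose(e1, e2, c):
    """Return the coordinates (alpha, beta) of c in the basis (e1, e2)."""
    det = e1[0] * e2[1] - e1[1] * e2[0]
    if det == 0:
        raise ValueError("e1 and e2 are collinear and do not form a basis")
    alpha = Fraction(c[0] * e2[1] - c[1] * e2[0], det)
    beta = Fraction(e1[0] * c[1] - e1[1] * c[0], det)
    return alpha, beta

# A "skewed" basis of non-unit, non-perpendicular vectors works just as well:
alpha, beta = decompose((2, 1), (1, 3), (7, 11))
print(alpha, beta)   # c = alpha*e1 + beta*e2
```

For the sample data, 2·(2, 1) + 3·(1, 3) = (7, 11), so the decomposition is unique exactly because the basis determinant is nonzero.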

Let's formulate the definition of a basis formally: a basis of the plane is a pair of linearly independent (non-collinear) vectors taken in a certain order, such that any vector of the plane is a linear combination of the basis vectors.

An essential point of the definition is that the vectors are taken in a certain order. The bases (a, b) and (b, a) are two completely different bases! As they say, you cannot substitute the little finger of your left hand for the little finger of your right hand.

We have figured out the basis, but it is not enough for setting up a coordinate grid and assigning coordinates to every item on your computer desk. Why not? Vectors are free and wander over the entire plane. So how do you assign coordinates to those little dirty spots on the table left over from a wild weekend? A starting point is needed. And such a landmark is a point familiar to everyone: the origin of coordinates. Let's sort out the coordinate system:

I'll start with the “school” system. Already in the introductory lesson Vectors for dummies I highlighted some differences between the rectangular coordinate system and the orthonormal basis. Here's the standard picture:

When they talk about rectangular coordinate system, then most often they mean the origin of coordinates, coordinate axes and scale along the axes. Try typing “rectangular coordinate system” into a search engine, and you will see that many sources will tell you about coordinate axes familiar from the 5th-6th grade and how to plot points on a plane.

On the other hand, it seems that a rectangular coordinate system can be completely defined in terms of an orthonormal basis. And that's almost true. The wording is as follows:

A point of the plane called the origin, together with an orthonormal basis, defines a Cartesian rectangular coordinate system of the plane. That is, a rectangular coordinate system is completely defined by a single point and two unit orthogonal vectors. That is why you see the drawing I gave above: in geometric problems, often (but by no means always) both the vectors and the coordinate axes are drawn.

I think everyone understands that using a point (origin) and an orthonormal basis ANY POINT on the plane and ANY VECTOR on the plane coordinates can be assigned. Figuratively speaking, “everything on a plane can be numbered.”

Are the coordinate vectors required to be unit vectors? No, they can have arbitrary nonzero length. Consider a point and two orthogonal vectors of arbitrary nonzero length:


Such a basis is called orthogonal. The origin of coordinates together with these vectors defines a coordinate grid, and any point of the plane, any vector, has its coordinates in the given basis. The obvious inconvenience is that the coordinate vectors in general have lengths different from unity. If the lengths equal unity, the usual orthonormal basis is obtained.

! Note: in an orthogonal basis, as well as below in affine bases of the plane and of space, the units along the axes are considered CONDITIONAL. For example, one unit along the x-axis may contain 4 cm, one unit along the y-axis 2 cm. This information is enough, if necessary, to convert "non-standard" coordinates into "our usual centimeters".

And the second question, which has actually already been answered: must the angle between the basis vectors equal 90 degrees? No! As the definition states, the basis vectors must only be non-collinear. Accordingly, the angle can be anything except 0 and 180 degrees.

A point of the plane called the origin, together with two non-collinear vectors taken in a certain order, defines an affine coordinate system of the plane:


Sometimes such a coordinate system is called an oblique system. The drawing shows points and vectors as examples:

As you understand, the affine coordinate system is even less convenient: the formulas for the lengths of vectors and segments, which we discussed in the second part of the lesson Vectors for dummies, do not work in it, nor do many tasty formulas related to the scalar product of vectors. But the rules for adding vectors and multiplying a vector by a number, the formulas for dividing a segment in a given ratio, as well as some other types of problems that we will consider soon, remain valid.

And the conclusion is that the most convenient special case of an affine coordinate system is the Cartesian rectangular system. That is why you see it most often. ...However, everything in this life is relative: there are many situations in which an oblique (or some other, for example, polar) coordinate system is appropriate. And humanoids might even like such systems =)

Let's move on to the practical part. All problems in this lesson are valid both for the rectangular coordinate system and for the general affine case. There is nothing complicated here; all the material is accessible even to a schoolchild.

How to determine collinearity of plane vectors?

A typical thing. For two plane vectors to be collinear, it is necessary and sufficient that their corresponding coordinates be proportional. Essentially, this is a coordinate-by-coordinate detailing of the obvious relationship a = λb.

Example 1

a) Check whether the vectors are collinear.
b) Do the vectors form a basis?

Solution:
a) Let us find out whether there exists a proportionality coefficient λ for the vectors, such that the equalities are satisfied:

I’ll definitely tell you about the “foppish” version of applying this rule, which works quite well in practice. The idea is to immediately make up the proportion and see if it is correct:

Let's make a proportion from the ratios of the corresponding coordinates of the vectors:

Let's reduce the fractions:
thus the corresponding coordinates are proportional and, therefore, the vectors are collinear.

The relationship could be made the other way around; this is an equivalent option:

For self-checking, you can use the fact that collinear vectors are linearly expressed through each other. In this case the equalities hold. Their validity is easily verified through elementary operations with vectors:

b) Two plane vectors form a basis if they are not collinear (linearly independent). We examine the vectors for collinearity. Let's set up a system:

From the first equation it follows that , from the second equation it follows that , which means the system is inconsistent (it has no solutions). Thus, the corresponding coordinates of the vectors are not proportional.

Conclusion: the vectors are linearly independent and form a basis.

A simplified version of the solution looks like this:

Let's form a proportion from the corresponding coordinates of the vectors:
the proportion does not hold, which means that these vectors are linearly independent and form a basis.

Usually reviewers do not reject this option, but a problem arises in cases where some coordinates equal zero. Like this: . Or like this: . Or like this: . How do you work through a proportion here? (You cannot divide by zero.) It is for this reason that I called the simplified solution "foppish".

Answer: a) , b) form.
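The coefficient search from part (a) can be sketched in Python. This is my own illustration with made-up coordinates; the helper `proportionality_coefficient` handles the troublesome zero coordinates mentioned above by never dividing by a zero entry (it assumes the first vector is not the zero vector):

```python
# Look for a single coefficient lam with b = lam * a, coordinate by
# coordinate, treating zero coordinates carefully instead of dividing.
from fractions import Fraction

def proportionality_coefficient(a, b):
    """Return lam such that b == lam * a, or None if no such lam exists.

    Assumes a is not the zero vector.
    """
    lam = None
    for ai, bi in zip(a, b):
        if ai == 0:
            if bi != 0:
                return None          # lam * 0 can never give a nonzero bi
            continue                 # 0 -> 0 puts no constraint on lam
        ratio = Fraction(bi, ai)
        if lam is None:
            lam = ratio
        elif lam != ratio:
            return None              # coordinates are not proportional
    return lam

print(proportionality_coefficient((1, 2), (2, 4)))   # collinear, lam = 2
print(proportionality_coefficient((1, 2), (2, 5)))   # not collinear -> None
```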

A little creative example for independent solution:

Example 2

For what value of the parameter will the vectors be collinear?

In the sample solution, the parameter is found through the proportion.

There is an elegant algebraic way to check vectors for collinearity. Let’s systematize our knowledge and add it as the fifth point:

For two plane vectors the following statements are equivalent:
1) the vectors are linearly independent;
2) the vectors form a basis;
3) the vectors are not collinear;
4) the vectors cannot be linearly expressed through each other;
+ 5) the determinant composed of the coordinates of these vectors is nonzero.

Respectively, the following opposite statements are equivalent:
1) vectors are linearly dependent;
2) vectors do not form a basis;
3) the vectors are collinear;
4) vectors can be linearly expressed through each other;
+ 5) the determinant composed of the coordinates of these vectors is equal to zero.

I really, really hope that by this point you already understand all the terms and statements encountered.

Let's take a closer look at the new, fifth point: two plane vectors are collinear if and only if the determinant composed of the coordinates of the given vectors is equal to zero. To apply this criterion, of course, you need to be able to compute determinants.

Let's solve Example 1 in the second way:

a) Let us calculate the determinant made up of the coordinates of the vectors :
, which means that these vectors are collinear.

b) Two plane vectors form a basis if they are not collinear (linearly independent). Let's calculate the determinant made up of vector coordinates :
, which means the vectors are linearly independent and form a basis.

Answer: a) , b) form.

It looks much more compact and prettier than a solution with proportions.
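The "fifth point" is also the easiest to code, since a determinant needs no division and therefore has no trouble with zero coordinates. A minimal sketch (function names and sample vectors are my own):

```python
# Two plane vectors are collinear exactly when the 2x2 determinant
# of their coordinates is zero.
def det2(a, b):
    return a[0] * b[1] - a[1] * b[0]

def collinear2(a, b):
    return det2(a, b) == 0

print(collinear2((1, 2), (2, 4)))    # True: determinant is 1*4 - 2*2 = 0
print(collinear2((1, 0), (0, 3)))    # False: these two vectors form a basis
```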

With the help of the material considered, it is possible to establish not only the collinearity of vectors, but also to prove the parallelism of segments and straight lines. Let's consider a couple of problems with specific geometric shapes.

Example 3

The vertices of a quadrilateral are given. Prove that a quadrilateral is a parallelogram.

Proof: There is no need to make a drawing for this problem, since the solution will be purely analytical. Let's recall the definition of a parallelogram:
A parallelogram is a quadrilateral whose opposite sides are pairwise parallel.

Thus, it is necessary to prove:
1) the parallelism of one pair of opposite sides;
2) the parallelism of the other pair of opposite sides.

We prove:

1) Find the vectors:


2) Find the vectors:

The result is the same vector ("according to school": equal vectors). The collinearity is quite obvious, but it is better to formalize the solution clearly. Let's calculate the determinant made up of the vector coordinates:
, which means that these vectors are collinear, and .

Conclusion: the opposite sides of the quadrilateral are pairwise parallel, which means that it is a parallelogram by definition. Q.E.D.

More good and different figures:

Example 4

The vertices of a quadrilateral are given. Prove that a quadrilateral is a trapezoid.

For a more rigorous formulation of the proof it is better, of course, to recall the definition of a trapezoid, but it is enough simply to remember what it looks like.

This is a task for you to solve on your own. Complete solution at the end of the lesson.

And now it’s time to slowly move from the plane into space:

How to determine collinearity of space vectors?

The rule is very similar. In order for two space vectors to be collinear, it is necessary and sufficient that their corresponding coordinates be proportional.

Example 5

Find out whether the following space vectors are collinear:

a) ;
b) ;
c) .

Solution:
a) Let’s check whether there is a coefficient of proportionality for the corresponding coordinates of the vectors:

The system has no solution, which means the vectors are not collinear.

The "simplified" version is formalized by checking the proportion. In this case:
the corresponding coordinates are not proportional, which means the vectors are not collinear.

Answer: the vectors are not collinear.
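For space vectors the same division-free idea works: instead of a single 2x2 determinant there are three 2x2 minors, and the vectors are collinear exactly when all of them vanish. A sketch with made-up vectors (the name `collinear3` is mine):

```python
# a and b in space are collinear iff every 2x2 minor of the 2x3
# coordinate matrix [a; b] vanishes (pairwise cross-differences).
from itertools import combinations

def collinear3(a, b):
    return all(a[i] * b[j] - a[j] * b[i] == 0
               for i, j in combinations(range(3), 2))

print(collinear3((1, 2, 3), (2, 4, 6)))   # True: b = 2 * a
print(collinear3((1, 2, 3), (2, 4, 5)))   # False: third coordinates break it
```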

b)-c) These are items for independent solution. Try both ways.

There is a method for checking spatial vectors for collinearity through a third-order determinant; this method is covered in the article Vector product of vectors.

Similar to the plane case, the considered tools can be used to study the parallelism of spatial segments and straight lines.

Welcome to the second section:

Linear dependence and independence of vectors in three-dimensional space.
Spatial basis and affine coordinate system

Many of the patterns that we examined on the plane will be valid for space. I tried to minimize the theory notes, since the lion's share of the information has already been chewed. However, I recommend that you read the introductory part carefully, as new terms and concepts will appear.

Now, instead of the plane of the computer desk, we explore three-dimensional space. First, let's create its basis. Someone is now indoors, someone is outdoors, but in any case, we cannot escape three dimensions: width, length and height. Therefore, to construct a basis, three spatial vectors will be required. One or two vectors are not enough, the fourth is superfluous.

And again we warm up on our fingers. Please raise your hand up and spread your thumb, index and middle fingers in different directions. These will be the vectors; they look in different directions, have different lengths, and make different angles with one another. Congratulations, a basis of three-dimensional space is ready! By the way, there is no need to demonstrate this to your teachers: no matter how you twist your fingers, there is no escaping the definitions =)

Next, let's ask an important question: do any three vectors form a basis of three-dimensional space? Please press three fingers firmly onto the tabletop of the computer desk. What happened? The three vectors now lie in the same plane and, roughly speaking, we have lost one of the dimensions: height. Such vectors are coplanar and, quite obviously, do not create a basis of three-dimensional space.

It should be noted that coplanar vectors do not have to lie in the same plane; they can lie in parallel planes (just don't try this with your fingers; only Salvador Dali could pull it off =)).

Definition: vectors are called coplanar, if there is a plane to which they are parallel. It is logical to add here that if such a plane does not exist, then the vectors will not be coplanar.

Three coplanar vectors are always linearly dependent, that is, they are linearly expressed through each other. For simplicity, let us again imagine that they lie in the same plane. First, the vectors may be not only coplanar but also collinear; then any of them can be expressed through any other. In the second case, if, for example, the vectors are not collinear, then the third vector is expressed through the other two in a unique way (why is easy to guess from the materials of the previous section).

The converse is also true: three non-coplanar vectors are always linearly independent, that is, they are in no way expressed through each other. And, obviously, only such vectors can form the basis of three-dimensional space.

Definition: A basis of three-dimensional space is a triple of linearly independent (non-coplanar) vectors taken in a certain order; any vector of space is expanded over the given basis in a unique way, and the coefficients of the expansion are the coordinates of the vector in this basis.

Let me remind you that we can also say that the vector is represented as a linear combination of the basis vectors.

The concept of a coordinate system is introduced in exactly the same way as in the plane case: one point and any three linearly independent vectors are enough:

A point of space called the origin, together with three non-coplanar vectors taken in a certain order, defines an affine coordinate system of three-dimensional space:

Of course, the coordinate grid is "oblique" and inconvenient, but, nevertheless, the constructed coordinate system allows us to unambiguously determine the coordinates of any vector and of any point in space. As with the plane, some formulas I have already mentioned will not work in an affine coordinate system of space.

The most familiar and convenient special case of an affine coordinate system, as everyone guesses, is rectangular space coordinate system:

A point of space called the origin, together with an orthonormal basis, defines a Cartesian rectangular coordinate system of space. The familiar picture:

Before moving on to practical tasks, let’s again systematize the information:

For three space vectors the following statements are equivalent:
1) the vectors are linearly independent;
2) the vectors form a basis;
3) the vectors are not coplanar;
4) vectors cannot be linearly expressed through each other;
5) the determinant, composed of the coordinates of these vectors, is different from zero.

I think the opposite statements are understandable.

Linear dependence/independence of space vectors is traditionally checked using a determinant (point 5). The remaining practical tasks will be of a pronounced algebraic nature. It's time to hang up the geometry stick and wield the baseball bat of linear algebra:

Three vectors of space are coplanar if and only if the determinant composed of the coordinates of the given vectors is equal to zero: .

I would like to draw your attention to a small technical nuance: the coordinates of the vectors can be written not only into columns but also into rows (the value of the determinant will not change because of this; see the properties of determinants). But columns are much better, since this is more beneficial for solving some practical problems.
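The coplanarity criterion itself is a dozen lines of code. A sketch with made-up vectors (the helpers `det3` and `coplanar` are my own; `det3` takes the three vectors as the columns of the matrix and expands along the first row):

```python
# Three space vectors are coplanar iff the 3x3 determinant of their
# coordinates is zero. Columns of the matrix are a, b, c.
def det3(a, b, c):
    # expansion along the first row (a[0], b[0], c[0])
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - b[0] * (a[1] * c[2] - a[2] * c[1])
            + c[0] * (a[1] * b[2] - a[2] * b[1]))

def coplanar(a, b, c):
    return det3(a, b, c) == 0

print(coplanar((1, 0, 0), (0, 1, 0), (1, 1, 0)))   # True: all lie in z = 0
print(coplanar((1, 0, 0), (0, 1, 0), (0, 0, 1)))   # False: a basis of space
```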

For those readers who have a little forgotten the methods of calculating determinants, or maybe have little understanding of them at all, I recommend one of my oldest lessons: How to calculate the determinant?

Example 6

Check whether the following vectors form the basis of three-dimensional space:

Solution: In fact, the entire solution comes down to calculating the determinant.

a) Let's calculate the determinant made up of the vector coordinates (the determinant is expanded along the first row):

, which means that the vectors are linearly independent (not coplanar) and form the basis of three-dimensional space.

Answer: these vectors form a basis

b) This is an item for independent solution. Full solution and answer at the end of the lesson.

There are also creative tasks:

Example 7

At what value of the parameter will the vectors be coplanar?

Solution: Vectors are coplanar if and only if the determinant composed of the coordinates of these vectors is equal to zero:

Essentially, you need to solve an equation with a determinant. We swoop down on the zeros like kites on jerboas: it is best to expand the determinant along the second row and immediately get rid of the minuses:

We carry out further simplifications and reduce the matter to the simplest linear equation:

Answer: at

It is easy to check here: substitute the resulting value into the original determinant and, expanding it again, make sure that it equals zero.
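Since the parameter enters a single coordinate, the determinant is an affine (degree-1) function of it, so two sample values pin down the root without any symbolic algebra. A hypothetical version of Example 7 in Python; the vectors and the names `det3`, `coplanarity_parameter`, `vecs` are made up for the sketch:

```python
# Solve det(lam) = 0 for the parameter, assuming det depends on lam
# linearly (true when lam sits in exactly one coordinate).
from fractions import Fraction

def det3(a, b, c):
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - b[0] * (a[1] * c[2] - a[2] * c[1])
            + c[0] * (a[1] * b[2] - a[2] * b[1]))

def coplanarity_parameter(make_vectors):
    """Root of det(lam) = 0 for an affine det; samples at lam = 0 and 1."""
    d0 = det3(*make_vectors(0))
    d1 = det3(*make_vectors(1))
    if d1 == d0:
        raise ValueError("determinant does not depend on lam (or is degenerate)")
    return Fraction(-d0, d1 - d0)

vecs = lambda lam: ((1, 2, 1), (2, lam, 3), (3, 6, 4))
lam = coplanarity_parameter(vecs)
print(lam, det3(*vecs(lam)))   # -> 4 0  (the determinant vanishes at lam = 4)
```

The built-in check is exactly the one suggested above: substituting the found value back makes the determinant vanish.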

In conclusion, we will consider another typical problem, which is more algebraic in nature and is traditionally included in a linear algebra course. It is so common that it deserves its own topic:

Prove that 3 vectors form the basis of three-dimensional space
and find the coordinates of the 4th vector in this basis

Example 8

Vectors are given. Show that vectors form a basis in three-dimensional space and find the coordinates of the vector in this basis.

Solution: First, let's deal with the condition. Four vectors are given and, as you can see, they already have coordinates in some basis. What that basis is does not interest us. What is of interest is this: three of the vectors may well form a new basis. The first stage completely coincides with the solution of Example 6: we need to check whether the vectors are really linearly independent:

Let's calculate the determinant made up of vector coordinates:

, which means that the vectors are linearly independent and form the basis of three-dimensional space.

! Important: the coordinates of the vectors must be written into the columns of the determinant, not into the rows. Otherwise there will be confusion in the further solution algorithm.
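The second stage of this kind of problem, finding the coordinates of the fourth vector in the new basis, amounts to solving the linear system whose COLUMNS are the basis vectors. A sketch with made-up numbers (the names `det3` and `coords_in_basis` are mine), using Cramer's rule:

```python
# Coordinates (x, y, z) of d in the basis (e1, e2, e3):
# solve x*e1 + y*e2 + z*e3 = d by Cramer's rule.
from fractions import Fraction

def det3(a, b, c):  # a, b, c are the COLUMNS of the matrix
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - b[0] * (a[1] * c[2] - a[2] * c[1])
            + c[0] * (a[1] * b[2] - a[2] * b[1]))

def coords_in_basis(e1, e2, e3, d):
    det = det3(e1, e2, e3)
    if det == 0:
        raise ValueError("the vectors are coplanar and do not form a basis")
    # replace one column at a time by d
    return (Fraction(det3(d, e2, e3), det),
            Fraction(det3(e1, d, e3), det),
            Fraction(det3(e1, e2, d), det))

x, y, z = coords_in_basis((1, 0, 0), (1, 1, 0), (1, 1, 1), (3, 2, 1))
print(x, y, z)   # d = x*e1 + y*e2 + z*e3
```

For the sample data, 1·(1, 0, 0) + 1·(1, 1, 0) + 1·(1, 1, 1) = (3, 2, 1), which is easy to verify by hand.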

A system of vectors a1, a2, ..., an is called linearly dependent if there exist numbers λ1, λ2, ..., λn, among which at least one is different from zero, such that the equality λ1a1 + λ2a2 + ... + λnan = 0 holds.

If this equality holds only in the case when all λi = 0, then the system of vectors is called linearly independent.

Theorem. A system of vectors is linearly dependent if and only if at least one of its vectors is a linear combination of the others.

Example 1. The polynomial is a linear combination of the polynomials . The polynomials constitute a linearly independent system, since a linear combination of them equals the zero polynomial only when all the coefficients are zero.

Example 2. The system of matrices , , is linearly independent, since a linear combination of them equals the zero matrix only in the case when all the coefficients are zero.

Example 3. Find out whether the system of vectors , , is linearly dependent.

Solution.

Let's form a linear combination of these vectors and equate it to the zero vector:

Equating the corresponding coordinates of the equal vectors, we obtain a homogeneous system of linear equations in the coefficients.

Solving it, we find that the system has only the trivial solution, so the linear combination of these vectors equals zero only in the case when all the coefficients are zero. Therefore, this system of vectors is linearly independent.
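The homogeneous-system argument can be sketched in code: vectors are linearly independent exactly when Gaussian elimination on their coordinate rows finds a pivot for every vector, i.e. the homogeneous system has only the trivial solution. The helpers `rank` and `independent` and the sample vectors are my own:

```python
# Rank via fraction-exact Gaussian elimination; a system of vectors is
# independent iff the rank equals the number of vectors.
from fractions import Fraction

def rank(rows):
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue                     # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]  # move the pivot row up
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(*vectors):
    return rank(list(vectors)) == len(vectors)

print(independent((1, 2, 3), (0, 1, 4), (0, 0, 5)))   # True
print(independent((1, 2, 3), (2, 4, 6), (0, 1, 4)))   # False: first two collinear
```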

Example 4. The vectors are linearly independent. What will the following systems of vectors be?

a).;

b).?

Solution.

a) Let's form a linear combination and equate it to zero:

Using the properties of operations with vectors in linear space, we rewrite the last equality in the form

Since the vectors are linearly independent, the coefficients must be equal to zero, i.e.

The resulting system of equations has a unique trivial solution .

Since equality (*) holds only in the case when all the coefficients are zero, the system of vectors is linearly independent;


b) Let's form the equality (**)

Applying similar reasoning, we obtain

Solving the system of equations by the Gauss method, we obtain

or

The last system has an infinite number of solutions. Thus, there is a nonzero set of coefficients for which equality (**) holds. Therefore, the system of vectors is linearly dependent.

Example 5. A system of vectors a1, a2, ..., an is linearly independent, while the system a1, a2, ..., an, b is linearly dependent. Show that b is linearly expressed through a1, a2, ..., an. Since the extended system is linearly dependent, there is a nontrivial equality

λ1a1 + λ2a2 + ... + λnan + μb = 0. (***)

In equality (***) μ ≠ 0. Indeed, at μ = 0 the equality would mean that the system a1, a2, ..., an is linearly dependent, contradicting the assumption.

From relation (***) we get b = -(λ1/μ)a1 - ... - (λn/μ)an. Let us denote βi = -λi/μ.

We get b = β1a1 + β2a2 + ... + βnan, that is, b is linearly expressed through the vectors of the original system.

Problems for independent solution (in the classroom)

1. A system containing a zero vector is linearly dependent.

2. A system consisting of one vector a is linearly dependent if and only if a = 0.

3. A system consisting of two vectors is linearly dependent if and only if the vectors are proportional (that is, one of them is obtained from the other by multiplying by a number).

4. If you add a vector to a linearly dependent system, you get a linearly dependent system.

5. If a vector is removed from a linearly independent system, then the resulting system of vectors is linearly independent.

6. If a system S is linearly independent but becomes linearly dependent when a vector b is added, then the vector b is linearly expressed through the vectors of the system S.

c). System of matrices , , in the space of second-order matrices.

10. Let a system of vectors a, b, c of a vector space be linearly independent. Prove the linear independence of the following systems of vectors:

a) a + b, b, c;

b) a + λb, b, c, where λ is an arbitrary number;

c) a + b, a + c, b + c.

11. Let a, b, c be three vectors in the plane from which a triangle can be formed. Will these vectors be linearly dependent?

12. Two vectors are given: a1 = (1, 2, 3, 4) and a2 = (0, 0, 0, 1). Find two more four-dimensional vectors a3 and a4 such that the system a1, a2, a3, a4 is linearly independent.

Definition 1. A linear combination of vectors a1, a2, ..., an is the sum of the products of these vectors by scalars λ1, λ2, ..., λn:

λ1a1 + λ2a2 + ... + λnan. (2.8)

Definition 2. A system of vectors a1, a2, ..., an is called linearly dependent if their linear combination (2.8) vanishes:

λ1a1 + λ2a2 + ... + λnan = 0, (2.9)

and among the numbers λ1, λ2, ..., λn there is at least one different from zero.

Definition 3. Vectors a1, a2, ..., an are called linearly independent if their linear combination (2.8) vanishes only in the case when all the numbers λi equal zero.

From these definitions the following corollaries can be obtained.

Corollary 1. In a linearly dependent system of vectors, at least one vector can be expressed as a linear combination of the others.

Proof. Let (2.9) be satisfied and, for definiteness, let the coefficient λn ≠ 0. We then have an = -(λ1/λn)a1 - ... - (λn-1/λn)an-1, so an is expressed through the remaining vectors. Note that the converse statement is also true.

Corollary 2. If a system of vectors contains the zero vector, then this system is (necessarily) linearly dependent; the proof is obvious.

Corollary 3. If among n vectors some k (k < n) vectors are linearly dependent, then all n vectors are linearly dependent (we omit the proof).

2°. Linear combinations of two, three and four vectors. Let us consider questions of linear dependence and independence of vectors on a line, in the plane and in space. We present the corresponding theorems.

Theorem 1. In order for two vectors to be linearly dependent, it is necessary and sufficient that they be collinear.

Necessity. Let the vectors a and b be linearly dependent. This means that some nontrivial linear combination λa + μb = 0 with, for definiteness, λ ≠ 0. This implies the equality a = -(μ/λ)b, and (by the definition of multiplication of a vector by a number) the vectors a and b are collinear.

Sufficiency. Let the vectors a and b be collinear (we assume that they are different from the zero vector; otherwise their linear dependence is obvious).

By Theorem (2.7) (see §2.1, item 2°) there exists a number λ such that b = λa, or λa - b = 0: a linear combination equal to zero in which the coefficient of b is nonzero, so the vectors a and b are linearly dependent.

The following corollary follows from this theorem.

Corollary. If the vectors a and b are not collinear, then they are linearly independent.

Theorem 2. In order for three vectors to be linearly dependent, it is necessary and sufficient that they be coplanar.

Necessity. Let the vectors a, b and c be linearly dependent. Let us show that they are coplanar.

From the definition of linear dependence of vectors there follows the existence of numbers λ, μ and ν such that the linear combination λa + μb + νc = 0, and at the same time (to be specific) ν ≠ 0. Then from this equality we can express the vector c: c = -(λ/ν)a - (μ/ν)b, that is, the vector c equals the diagonal of the parallelogram constructed on the two vectors on the right side of this equality (Fig. 2.6). This means that the vectors a, b and c lie in the same plane.

Sufficiency. Let the vectors a, b and c be coplanar. Let us show that they are linearly dependent.

Let us exclude the case of collinearity of any pair of the vectors (for then that pair is linearly dependent and, by Corollary 3 (see item 1°), all three vectors are linearly dependent). Note that this assumption also excludes the existence of a zero vector among the three.

Let's move the three coplanar vectors into one plane and bring them to a common origin. Through the end of the vector c draw lines parallel to the vectors a and b; we obtain two vectors collinear to a and b respectively, say αa and βb (Fig. 2.7); their existence is ensured by the fact that a and b are not collinear by assumption. It follows that c = αa + βb. Rewriting this equality in the form (-1)c + αa + βb = 0, we conclude that the vectors a, b and c are linearly dependent.

Two corollaries follow from the proven theorem.

Corollary 1. Let a and b be non-collinear vectors and let c be an arbitrary vector lying in the plane defined by a and b. Then there exist numbers λ1 and λ2 such that

c = λ1·a + λ2·b. (2.10)

Corollary 2. If the vectors a, b and c are not coplanar, then they are linearly independent.

Theorem 3. Any four vectors are linearly dependent.

We will omit the proof; with some modifications it copies the proof of Theorem 2. Let us give a corollary from this theorem.

Corollary. For any non-coplanar vectors a, b, c and any vector d there exist numbers λ1, λ2, λ3 such that

d = λ1·a + λ2·b + λ3·c. (2.11)

Comment. For vectors in (three-dimensional) space, the concepts of linear dependence and independence have, as follows from Theorems 1-3 above, a simple geometric meaning.

Let a and b be two linearly dependent vectors. Then one of them is a linear combination of the other, that is, it simply differs from it by a numerical factor (for example, b = λ·a). Geometrically, this means that both vectors lie on a common line; they can have the same or opposite directions (Fig. 2.8).

If two vectors are located at an angle to each other (Fig. 2.9), then neither of them can be obtained by multiplying the other by a number – such vectors are linearly independent. Thus, the linear independence of two vectors a and b means that these vectors cannot be laid on one straight line.
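This collinearity criterion is easy to test numerically. Below is a minimal pure-Python sketch (an illustration added here, not part of the original text; the function name and sample vectors are ours): two vectors are collinear exactly when all 2×2 determinants built from their coordinates vanish, which expresses proportionality of the coordinates without dividing by anything.

```python
# Hypothetical helper: two 3-D vectors are collinear iff every 2x2
# determinant built from their coordinates is zero (coordinates proportional).
def is_collinear(a, b, eps=1e-12):
    (ax, ay, az), (bx, by, bz) = a, b
    return (abs(ax * by - ay * bx) < eps and
            abs(ay * bz - az * by) < eps and
            abs(ax * bz - az * bx) < eps)

print(is_collinear((1, 2, 3), (2, 4, 6)))   # b = 2a, so True
print(is_collinear((1, 2, 3), (2, 4, 5)))   # False: no common factor
```

Using determinants instead of coordinate ratios avoids division by zero when some coordinate vanishes.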

Let us find out the geometric meaning of linear dependence and independence of three vectors.

Let the vectors a, b and c be linearly dependent, and let (to be specific) the vector c be a linear combination of the vectors a and b; then c is located in the plane containing a and b. This means that the vectors a, b and c lie in the same plane. The converse is also true: if the vectors a, b and c lie in the same plane, then they are linearly dependent.

Thus, the vectors a, b and c are linearly independent if and only if they do not lie in the same plane.

3°. The concept of basis. One of the most important concepts of linear and vector algebra is the concept of a basis. Let us introduce some definitions.

Definition 1. A pair of vectors is called ordered if it is specified which vector of this pair is considered the first and which the second.

Definition 2. An ordered pair a, b of non-collinear vectors is called a basis in the plane defined by the given vectors.

Theorem 1. Any vector c in the plane can be represented as a linear combination of the basis system of vectors a, b:

c = λ1·a + λ2·b, (2.12)

and this representation is unique.

Proof. Let the vectors a and b form a basis. Then any vector c can be represented in the form c = λ1·a + λ2·b.

To prove uniqueness, assume that there is one more decomposition c = μ1·a + μ2·b. Subtracting, we obtain (λ1 – μ1)·a + (λ2 – μ2)·b = 0, with at least one of the differences different from zero. The latter means that the vectors a and b are linearly dependent, that is, collinear; this contradicts the assumption that they form a basis.

Hence the decomposition is unique.
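The coefficients of this decomposition can be computed explicitly. As a sketch (the basis and the numeric example are our own illustration, not from the text), λ1 and λ2 are found from a 2×2 linear system solved by Cramer's rule:

```python
# Decompose c in the plane basis a, b: solve lam1*a + lam2*b = c by Cramer's rule.
def decompose2(a, b, c):
    det = a[0] * b[1] - a[1] * b[0]
    if det == 0:
        raise ValueError("a and b are collinear and do not form a basis")
    lam1 = (c[0] * b[1] - c[1] * b[0]) / det
    lam2 = (a[0] * c[1] - a[1] * c[0]) / det
    return lam1, lam2

# c = 2a + 3b for a = (1, 0), b = (1, 1), hence c = (5, 3)
print(decompose2((1, 0), (1, 1), (5, 3)))   # (2.0, 3.0)
```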

Definition 3. A triple of vectors is called ordered if it is indicated which vector is considered the first, which is the second, and which is the third.

Definition 4. An ordered triple of non-coplanar vectors is called a basis in space.

The decomposition and uniqueness theorem also holds here.

Theorem 2. Any vector d can be represented as a linear combination of the basis system of vectors a, b, c:

d = λ1·a + λ2·b + λ3·c, (2.13)

and this representation is unique (we omit the proof of the theorem).

In expansions (2.12) and (2.13) the quantities λ1, λ2 (and λ3) are called the coordinates of the vector in the given basis (more precisely, its affine coordinates).

With a fixed basis a, b, c one can write d = (λ1, λ2, λ3).

For example, if a basis e1, e2, e3 is given and the coordinates of d in it are (λ1, λ2, λ3), this means that there is the representation (decomposition) d = λ1·e1 + λ2·e2 + λ3·e3.

4°. Linear operations on vectors in coordinate form. The introduction of a basis allows linear operations on vectors to be replaced by ordinary linear operations on numbers – the coordinates of these vectors.

Let some basis e1, e2, e3 be given. Obviously, specifying the coordinates of a vector in this basis completely determines the vector itself. The following propositions hold:

a) two vectors a = (x1, y1, z1) and b = (x2, y2, z2) are equal if and only if their corresponding coordinates are equal:

x1 = x2, y1 = y2, z1 = z2; (2.14)

b) when a vector a = (x, y, z) is multiplied by a number λ, its coordinates are multiplied by that number:

λ·a = (λx, λy, λz); (2.15)

c) when vectors are added, their corresponding coordinates are added:

a + b = (x1 + x2, y1 + y2, z1 + z2). (2.16)

We omit the proofs of these properties; let us prove property b) only as an example. We have

λ·a = λ·(x·e1 + y·e2 + z·e3) = (λx)·e1 + (λy)·e2 + (λz)·e3 = (λx, λy, λz).
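Propositions a)–c) translate directly into code. A minimal sketch (the function names are ours, added for illustration) of coordinate-wise linear operations:

```python
# Linear operations on vectors reduce to the same operations on coordinates.
def add(u, v):
    return tuple(x + y for x, y in zip(u, v))

def scale(lam, u):
    return tuple(lam * x for x in u)

print(add((1, 2, 3), (4, 5, 6)))   # (5, 7, 9)
print(scale(2, (1, 2, 3)))         # (2, 4, 6)
```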

Comment. In space (on the plane) you can choose infinitely many bases.

Let us give an example of passing from one basis to another and establish relations between the coordinates of a vector in different bases.

Example 1. In the basis e1, e2, e3 three vectors a1, a2, a3 are given by their expansions in e1, e2, e3, and a vector d is given by its decomposition in the basis a1, a2, a3. Find the coordinates of the vector d in the basis e1, e2, e3.

Solution. Substituting the expansions of a1, a2, a3 in terms of e1, e2, e3 into the decomposition of d and collecting the coefficients of e1, e2 and e3, we obtain the coordinates of d in the basis e1, e2, e3.

Example 2. Let four vectors a1, a2, a3 and b be given by their coordinates in some basis.

Find out whether the vectors a1, a2, a3 form a basis; if the answer is positive, find the decomposition of the vector b in this basis.

Solution. 1) The vectors a1, a2, a3 form a basis if they are linearly independent. Form the linear combination λ1·a1 + λ2·a2 + λ3·a3 and find out for which λ1, λ2, λ3 it turns to zero: λ1·a1 + λ2·a2 + λ3·a3 = 0.

By the definition of equality of vectors in coordinate form, we obtain a system of (linear homogeneous algebraic) equations in λ1, λ2, λ3 whose determinant is different from zero; hence the system has only the trivial solution λ1 = λ2 = λ3 = 0. This means that the vectors a1, a2, a3 are linearly independent, and therefore they form a basis.

2) Expand the vector b in this basis. Write b = λ1·a1 + λ2·a2 + λ3·a3 and pass to coordinate form.

Equating the corresponding coordinates, we obtain a system of linear inhomogeneous algebraic equations in λ1, λ2, λ3. Solving it (for example, by Cramer's rule), we obtain the coefficients λ1, λ2, λ3 and with them the decomposition of the vector b in the basis a1, a2, a3: b = λ1·a1 + λ2·a2 + λ3·a3.
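The two steps of Example 2 — testing the determinant and expanding by Cramer's rule — can be sketched as follows (the numeric vectors are our own illustration, not the example's original data):

```python
# det3: determinant of the 3x3 matrix whose columns are a, b, c.
def det3(a, b, c):
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - b[0] * (a[1] * c[2] - a[2] * c[1])
            + c[0] * (a[1] * b[2] - a[2] * b[1]))

# expand: coefficients of d over the basis a, b, c via Cramer's rule.
def expand(a, b, c, d):
    D = det3(a, b, c)
    if D == 0:
        raise ValueError("the vectors are linearly dependent: not a basis")
    return (det3(d, b, c) / D, det3(a, d, c) / D, det3(a, b, d) / D)

a, b, c = (1, 0, 0), (1, 1, 0), (1, 1, 1)
d = (6, 5, 3)                  # chosen so that d = a + 2b + 3c
print(det3(a, b, c))           # 1, nonzero -> a, b, c form a basis
print(expand(a, b, c, d))      # (1.0, 2.0, 3.0)
```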

5°. Projection of a vector onto an axis. Properties of projections. Let there be an axis l, that is, a straight line with a direction chosen on it, and let some vector a be given. Let us define the concept of the projection of the vector a onto the axis l.

Definition. The projection of the vector a onto the axis l is the product of the modulus of this vector and the cosine of the angle φ between the axis l and the vector (Fig. 2.10):

pr_l a = |a|·cos φ. (2.17)

A corollary of this definition is the statement that equal vectors have equal projections (on the same axis).

Let us note the properties of projections.

1) the projection of the sum of vectors onto some axis l equals the sum of the projections of the summands onto the same axis:

pr_l (a + b) = pr_l a + pr_l b; (2.18)

2) the projection of the product of a scalar and a vector equals the product of this scalar and the projection of the vector onto the same axis:

pr_l (λ·a) = λ·pr_l a. (2.19)

Corollary. The projection of a linear combination of vectors onto an axis equals the same linear combination of their projections:

pr_l (λ1·a + λ2·b) = λ1·pr_l a + λ2·pr_l b.

We will omit the proofs of the properties.
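Properties 1) and 2) are easy to confirm numerically. In the sketch below (our own check, not the text's) the projection onto an axis with unit direction vector e is computed as the dot product with e, which equals |a|·cos φ:

```python
import math

# Projection of v onto the axis with unit direction vector `axis`:
# equals the dot product v . axis = |v| cos(angle between v and the axis).
def proj(v, axis):
    return sum(x * e for x, e in zip(v, axis))

axis = (1 / math.sqrt(2), 1 / math.sqrt(2), 0.0)
a, b = (1.0, 2.0, 3.0), (4.0, 0.0, -1.0)

# property 1): pr(a + b) = pr(a) + pr(b)
lhs = proj(tuple(x + y for x, y in zip(a, b)), axis)
rhs = proj(a, axis) + proj(b, axis)
print(abs(lhs - rhs) < 1e-12)   # True

# property 2): pr(3a) = 3 pr(a)
print(abs(proj(tuple(3 * x for x in a), axis) - 3 * proj(a, axis)) < 1e-12)   # True
```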

6°. Rectangular Cartesian coordinate system in space. Decomposition of a vector in the unit vectors of the axes. Let three mutually perpendicular unit vectors be chosen as a basis; we introduce the special notation i, j, k for them. Placing their origins at a point O, we direct along them (in accordance with the unit vectors i, j, k) the coordinate axes Ox, Oy and Oz (an axis with a positive direction, an origin and a unit of length chosen on it is called a coordinate axis).

Definition. An ordered system of three mutually perpendicular coordinate axes with a common origin and a common unit of length is called a rectangular Cartesian coordinate system in space.

The axis Ox is called the abscissa axis, Oy the ordinate axis, and Oz the applicate axis.

Let us turn to the expansion of an arbitrary vector a in the basis i, j, k. From the theorem (see §2.2, item 3°, (2.13)) it follows that a can be uniquely expanded over the basis i, j, k (here, instead of the coordinate notation λ1, λ2, λ3, we use x, y, z):

a = x·i + y·j + z·k. (2.21)

The numbers x, y, z in (2.21) are the (Cartesian rectangular) coordinates of the vector a. The meaning of the Cartesian coordinates is established by the following theorem.

Theorem. The Cartesian rectangular coordinates x, y, z of the vector a are the projections of this vector onto the axes Ox, Oy and Oz respectively.

Proof. Place the vector a at the origin of the coordinate system, the point O. Then its end coincides with some point M(x, y, z).

Draw through the point M three planes parallel to the coordinate planes Oyz, Oxz and Oxy (Fig. 2.11). We then obtain:

a = a_x + a_y + a_z. (2.22)

In (2.22) the vectors a_x = x·i, a_y = y·j and a_z = z·k are called the components of the vector a along the axes Ox, Oy and Oz.

Let α, β and γ denote the angles formed by the vector a with the unit vectors i, j and k respectively. Then for the components we obtain the formulas:

a_x = |a|·cos α·i, a_y = |a|·cos β·j, a_z = |a|·cos γ·k. (2.23)

From (2.21), (2.22), (2.23) we find:

x = |a|·cos α = pr_Ox a; y = |a|·cos β = pr_Oy a; z = |a|·cos γ = pr_Oz a (2.23')

– the coordinates x, y, z of the vector a are the projections of this vector onto the coordinate axes Ox, Oy and Oz respectively.

Comment. The numbers cos α, cos β, cos γ are called the direction cosines of the vector a.

The modulus of the vector a (the diagonal of the rectangular parallelepiped) is calculated by the formula:

|a| = √(x² + y² + z²). (2.24)

From formulas (2.23') and (2.24) it follows that the direction cosines can be calculated using the formulas:

cos α = x / √(x² + y² + z²); cos β = y / √(x² + y² + z²); cos γ = z / √(x² + y² + z²). (2.25)

Squaring both sides of each of the equalities in (2.25) and adding the left- and right-hand sides of the resulting equalities term by term, we arrive at the formula

cos²α + cos²β + cos²γ = 1 (2.26)

– not any three angles form a definite direction in space, but only those whose cosines are related by relation (2.26).
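Formulas (2.24)–(2.26) can be checked on a concrete vector (an illustration we add; the vector (2, 3, 6) is chosen so that its modulus is exactly 7):

```python
import math

# Direction cosines of the vector (x, y, z) and the relation (2.26).
x, y, z = 2.0, 3.0, 6.0
r = math.sqrt(x * x + y * y + z * z)          # modulus, formula (2.24)
cos_a, cos_b, cos_g = x / r, y / r, z / r     # formulas (2.25)
print(r)                                                   # 7.0
print(abs(cos_a**2 + cos_b**2 + cos_g**2 - 1.0) < 1e-12)   # True: relation (2.26)
```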

7°. Radius vector and coordinates of a point. Determining a vector by its beginning and end. Let us introduce a definition.

Definition. The radius vector of a point M (denoted r) is the vector connecting the origin O with this point (Fig. 2.12):

r = OM. (2.27)

Any point in space corresponds to a certain radius vector (and vice versa). Thus, points in space are represented in vector algebra by their radius vectors.

Obviously, the coordinates x, y, z of the point M are the projections of its radius vector r onto the coordinate axes:

x = pr_Ox r, y = pr_Oy r, z = pr_Oz r, (2.28')

and thus

r = x·i + y·j + z·k (2.28)

– the radius vector of a point is the vector whose projections onto the coordinate axes are equal to the coordinates of this point. This leads to the two notations M(x, y, z) and r = (x, y, z).

Let us obtain formulas for calculating the projections of a vector a = M1M2 from the coordinates of its origin, the point M1(x1, y1, z1), and its end, the point M2(x2, y2, z2).

Draw the radius vectors r1 = OM1, r2 = OM2 and the vector M1M2 (Fig. 2.13). We obtain

a = M1M2 = r2 – r1 = (x2 – x1)·i + (y2 – y1)·j + (z2 – z1)·k (2.29)

– the projections of a vector onto the coordinate axes are equal to the differences between the corresponding coordinates of its end and beginning.

8°. Some problems involving Cartesian coordinates.

1) Conditions for collinearity of vectors. From the theorem (see §2.1, item 2°, formula (2.7)) it follows that for the vectors a = (x1, y1, z1) and b = (x2, y2, z2) to be collinear it is necessary and sufficient that the relation b = λ·a hold. From this vector equality we obtain three equalities in coordinate form: x2 = λ·x1, y2 = λ·y1, z2 = λ·z1, which imply the condition for collinearity in coordinate form:

x2/x1 = y2/y1 = z2/z1 (2.30)

– for the vectors a and b to be collinear it is necessary and sufficient that their corresponding coordinates be proportional.

2) Distance between points. From representation (2.29) it follows that the distance d between the points M1(x1, y1, z1) and M2(x2, y2, z2) is determined by the formula

d = |M1M2| = √((x2 – x1)² + (y2 – y1)² + (z2 – z1)²). (2.31)
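Formula (2.31) in code, as a small sketch with our own sample points:

```python
import math

# Distance between two points per formula (2.31).
def dist(p1, p2):
    return math.sqrt(sum((b - a) ** 2 for a, b in zip(p1, p2)))

print(dist((1, 2, 3), (4, 6, 3)))   # 5.0: a 3-4-5 right triangle in a plane z = const
```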

3) Division of a segment in a given ratio. Let the points M1(x1, y1, z1) and M2(x2, y2, z2) and the ratio λ = M1M : MM2 be given. It is required to find x, y, z – the coordinates of the point M (Fig. 2.14).

From the condition of collinearity of the vectors M1M and MM2 we have M1M = λ·MM2, whence r – r1 = λ·(r2 – r) and

r = (r1 + λ·r2)/(1 + λ). (2.32)

From (2.32) we obtain in coordinate form:

x = (x1 + λ·x2)/(1 + λ); y = (y1 + λ·y2)/(1 + λ); z = (z1 + λ·z2)/(1 + λ). (2.32')

From formulas (2.32') we can obtain the formulas for the coordinates of the midpoint of the segment M1M2 by setting λ = 1:

x = (x1 + x2)/2; y = (y1 + y2)/2; z = (z1 + z2)/2. (2.32'')
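Formulas (2.32')–(2.32'') as a sketch (the sample points are our own; λ = 1 gives the midpoint):

```python
# Point dividing the segment M1M2 in the ratio lam = M1M : MM2, formulas (2.32').
def divide(p1, p2, lam):
    return tuple((a + lam * b) / (1 + lam) for a, b in zip(p1, p2))

print(divide((0, 0, 0), (3, 6, 9), 1))   # (1.5, 3.0, 4.5): the midpoint (2.32'')
print(divide((0, 0, 0), (3, 6, 9), 2))   # (2.0, 4.0, 6.0): two thirds of the way
```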

Comment. We shall count the segments M1M and MM2 as positive or negative depending on whether their direction coincides with the direction from the beginning M1 of the segment to its end M2, or does not coincide. Then, using formulas (2.32)–(2.32''), one can also find the coordinates of a point dividing the segment M1M2 externally, that is, in such a way that the dividing point M lies on the continuation of the segment M1M2 rather than inside it. In this case, of course, λ < 0.

4) Equation of a spherical surface. Let us derive the equation of a spherical surface – the geometric locus of points M(x, y, z) equidistant, at a distance R, from some fixed center, a point M0(x0, y0, z0). Obviously, in this case |M0M| = R, and taking into account formula (2.31),

(x – x0)² + (y – y0)² + (z – z0)² = R². (2.33)

Equation (2.33) is the equation of the desired spherical surface.
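Equation (2.33) amounts to a simple membership test (an illustrative sketch with our own sample values):

```python
# Does the point p lie on the sphere with the given center and radius R?
# Tests equation (2.33): (x - x0)^2 + (y - y0)^2 + (z - z0)^2 = R^2.
def on_sphere(p, center, R, eps=1e-12):
    return abs(sum((a - c) ** 2 for a, c in zip(p, center)) - R * R) < eps

print(on_sphere((3, 4, 0), (0, 0, 0), 5))   # True: 9 + 16 + 0 = 25
print(on_sphere((3, 4, 1), (0, 0, 0), 5))   # False
```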


The concepts of linear dependence and independence of a system of vectors are very important when studying vector algebra, since the concepts of dimension and basis of space are based on them. In this article we will give definitions, consider the properties of linear dependence and independence, obtain an algorithm for studying a system of vectors for linear dependence, and analyze in detail the solutions of examples.


Determination of linear dependence and linear independence of a system of vectors.

Consider a set of p n-dimensional vectors a1, a2, …, ap. Form a linear combination of these vectors with arbitrary numbers (real or complex) α1, α2, …, αp: α1·a1 + α2·a2 + … + αp·ap. Based on the definition of operations on n-dimensional vectors, as well as the properties of the operations of adding vectors and multiplying a vector by a number, the written linear combination represents some n-dimensional vector.

This is how we approached the definition of the linear dependence of a system of vectors.

Definition.

If a linear combination can represent the zero vector when among the numbers α1, α2, …, αp there is at least one nonzero number, then the system of vectors is called linearly dependent.

Definition.

If a linear combination is the zero vector only when all the numbers α1, α2, …, αp are equal to zero, then the system of vectors is called linearly independent.

Properties of linear dependence and independence.

Based on these definitions, we formulate and prove properties of linear dependence and linear independence of a system of vectors.

    If several vectors are added to a linearly dependent system of vectors, the resulting system will be linearly dependent.

    Proof.

    Since the system of vectors a1, a2, …, ap is linearly dependent, the equality α1·a1 + α2·a2 + … + αp·ap = 0 holds with at least one nonzero number among α1, α2, …, αp. Let α1 ≠ 0.

    Add s more vectors a(p+1), …, a(p+s) to the original system of vectors, obtaining the system a1, …, ap, a(p+1), …, a(p+s). Since α1·a1 + … + αp·ap = 0 with α1 ≠ 0, the linear combination α1·a1 + … + αp·ap + 0·a(p+1) + … + 0·a(p+s) of the vectors of this system represents the zero vector, and among its coefficients there is a nonzero one. Consequently, the resulting system of vectors is linearly dependent.

    If several vectors are excluded from a linearly independent system of vectors, then the resulting system will be linearly independent.

    Proof.

    Let us assume that the resulting system is linearly dependent. By adding all the discarded vectors to this system of vectors, we obtain the original system of vectors. By condition, it is linearly independent, but due to the previous property of linear dependence, it must be linearly dependent. We have arrived at a contradiction, therefore our assumption is incorrect.

    If a system of vectors has at least one zero vector, then such a system is linearly dependent.

    Proof.

    Let the vector ak in this system of vectors be the zero vector. Assume that the original system of vectors is linearly independent. Then the vector equality α1·a1 + … + αp·ap = 0 is possible only when all the coefficients are zero. However, if we take any αk different from zero and set the remaining coefficients to zero, the equality still holds, since αk·ak = αk·0 = 0. Consequently, our assumption is incorrect, and the original system of vectors is linearly dependent.

    If a system of vectors is linearly dependent, then at least one of its vectors is linearly expressed in terms of the others. If a system of vectors is linearly independent, then none of the vectors can be expressed in terms of the others.

    Proof.

    First, let's prove the first statement.

    Let the system of vectors be linearly dependent; then there is at least one nonzero number αk such that the equality α1·a1 + … + αp·ap = 0 is true. This equality can be resolved with respect to ak, since αk ≠ 0:

    ak = –(α1/αk)·a1 – … – (α(k–1)/αk)·a(k–1) – (α(k+1)/αk)·a(k+1) – … – (αp/αk)·ap.

    Consequently, the vector ak is linearly expressed through the remaining vectors of the system, which is what needed to be proved.

    Now let's prove the second statement.

    Since the system of vectors is linearly independent, the equality α1·a1 + … + αp·ap = 0 is possible only for α1 = α2 = … = αp = 0.

    Assume that some vector of the system is expressed linearly in terms of the others. Let this vector be ak; then ak = β1·a1 + … + β(k–1)·a(k–1) + β(k+1)·a(k+1) + … + βp·ap. This equality can be rewritten as β1·a1 + … – ak + … + βp·ap = 0; on its left-hand side there is a linear combination of the system's vectors in which the coefficient in front of the vector ak equals –1, that is, is different from zero, which indicates a linear dependence of the original system of vectors. So we have arrived at a contradiction, and the property is proved.

An important statement follows from the last two properties:
if a system of vectors contains the vectors a and λ·a, where λ is an arbitrary number, then it is linearly dependent.

Study of a system of vectors for linear dependence.

Let's pose a problem: we need to establish a linear dependence or linear independence of a system of vectors.

The logical question is: “how to solve it?”

Something useful from a practical point of view can be extracted from the definitions and properties of linear dependence and independence discussed above. These definitions and properties allow us to establish the linear dependence of a system of vectors directly in a few cases: for example, when the system contains the zero vector, or when it contains two proportional (collinear) vectors.

What should we do in the other cases, which are the majority?

Let's figure this out.

Let us recall the formulation of the theorem on the rank of a matrix, which we presented earlier.

Theorem.

Let r be the rank of a matrix A of size p by n, r ≤ min(p, n). Let M be a basis minor of the matrix A. All rows (all columns) of the matrix A that do not participate in the formation of the basis minor M are linearly expressed through the rows (columns) of the matrix generating the basis minor M.

Now let us explain the connection between the theorem on the rank of a matrix and the study of a system of vectors for linear dependence.

Let us compose a matrix A whose rows are the vectors of the system under study.

What would linear independence of a system of vectors mean?

From the fourth property of linear independence of a system of vectors, we know that none of the vectors of the system can be expressed in terms of the others. In other words, no row of matrix A will be linearly expressed in terms of other rows, therefore, linear independence of the system of vectors will be equivalent to the condition Rank(A)=p.

What will the linear dependence of the system of vectors mean?

Everything is very simple: at least one row of the matrix A will be linearly expressed in terms of the others; therefore, linear dependence of the system of vectors is equivalent to the condition Rank(A) < p.

So, the problem of studying a system of vectors for linear dependence is reduced to the problem of finding the rank of a matrix composed of vectors of this system.

It should be noted that for p > n the system of vectors is always linearly dependent, since the rank of A cannot exceed n.

Comment: when compiling matrix A, the vectors of the system can be taken not as rows, but as columns.

Algorithm for studying a system of vectors for linear dependence.

Compose the matrix A from the vectors of the system, find its rank, and compare the rank with the number p of vectors: Rank(A) = p means the system is linearly independent, Rank(A) < p means it is linearly dependent.

Let's look at the algorithm using examples.
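The rank computation at the heart of this algorithm can be sketched in pure Python with exact rational arithmetic (our own illustration, not the article's code; `fractions.Fraction` avoids floating-point rounding):

```python
from fractions import Fraction

# Rank of the matrix whose rows are the given vectors, by Gauss-Jordan
# elimination in exact arithmetic. The system is linearly independent
# iff rank equals the number of vectors.
def rank(rows):
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pivot is None:
            continue                      # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]   # swap rows
        for i in range(len(m)):
            if i != r and m[i][c] != 0:   # eliminate the rest of the column
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

vectors = [(1, 2, 3), (0, 1, 1), (1, 3, 4)]      # third = first + second
print(rank(vectors))                             # 2 < 3 -> linearly dependent
print(rank([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))   # 3 -> linearly independent
```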

Examples of studying a system of vectors for linear dependence.

Example.

A system of vectors is given. Examine it for linear dependence.

Solution.

Since the vector c is zero, the original system of vectors is linearly dependent due to the third property.

Answer:

The vector system is linearly dependent.

Example.

Examine a system of vectors for linear dependence.

Solution.

It is not difficult to notice that the coordinates of the vector c are equal to the corresponding coordinates of another vector of the system multiplied by 3. Therefore, the original system of vectors is linearly dependent.

Task 1. Find out whether the system of vectors is linearly independent. The system of vectors is specified by the matrix of the system, whose columns consist of the coordinates of the vectors.

.

Solution. Set the linear combination of the vectors equal to zero. Writing this equality in coordinates, we obtain a system of equations with a triangular matrix.

Such a system of equations is called triangular. It has only the trivial (zero) solution. Therefore, the vectors are linearly independent.

Task 2. Find out whether the system of vectors is linearly independent.

.

Solution. The first vectors are linearly independent (see Task 1). Let us prove that the remaining vector is a linear combination of them. Its expansion coefficients are determined from a system of equations which, like a triangular one, has a unique solution.

Therefore, the system of vectors is linearly dependent.

Comment. Matrices of the same type as in Task 1 are called triangular, and those as in Task 2 are called step-triangular. The question of the linear dependence of a system of vectors is easily solved if the matrix composed of the coordinates of these vectors is step-triangular. If the matrix does not have this special form, then by elementary row transformations, which preserve the linear relations between the columns, it can be reduced to step-triangular form.

Elementary row transformations of a matrix (EPS) are the following operations on a matrix:

1) permutation of rows;

2) multiplication of a row by a nonzero number;

3) adding to a row another row multiplied by an arbitrary number.

Task 3. Find the maximum linearly independent subsystem and calculate the rank of the system of vectors

.

Solution. Let us reduce the matrix of the system to step-triangular form using EPS. To explain the procedure, we denote the row with number i of the matrix being transformed by the symbol ri. The column after the arrow indicates the operations on the rows of the matrix being transformed that must be performed to obtain the rows of the new matrix.


.

Obviously, the first two columns of the resulting matrix are linearly independent, the third column is their linear combination, and the fourth does not depend on the first two. The vectors corresponding to the first, second and fourth columns are called basic. They form a maximal linearly independent subsystem of the original system, and the rank of the system is three.
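Task 3's procedure — row reduction followed by reading off the pivot columns — can be sketched as follows (the matrix is our own illustration with the vectors taken as columns, not the task's original data):

```python
from fractions import Fraction

# Reduce by elementary row operations and return the pivot column indices;
# the vectors standing in pivot columns form a maximal linearly
# independent subsystem, and their number is the rank of the system.
def pivot_columns(matrix):
    m = [[Fraction(x) for x in row] for row in matrix]
    pivots, r = [], 0
    for c in range(len(m[0])):
        pivot = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return pivots

# columns: a1, a2, a3 = a1 + a2, a4 (independent of a1 and a2)
A = [[1, 0, 1, 0],
     [0, 1, 1, 0],
     [0, 0, 0, 1]]
print(pivot_columns(A))   # [0, 1, 3]: a1, a2, a4 are basic, rank = 3
```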



Basis, coordinates

Task 4. Find a basis of the set of geometric vectors whose coordinates satisfy the given linear condition, and the coordinates of the vectors in this basis.

Solution. The set is a plane passing through the origin. An arbitrary basis in the plane consists of two non-collinear vectors. The coordinates of the vectors in the chosen basis are determined by solving the corresponding system of linear equations.

There is another way to solve this problem, in which the basis can be found from the coordinates.

The coordinates of space are not coordinates on the plane, since they are related by the given relation, that is, they are not independent. The remaining independent variables (they are called free) uniquely determine a vector of the plane and therefore can be chosen as coordinates on it. Then the basis consists of the vectors lying in the plane that correspond to the standard sets of values of the free variables (one free variable equal to 1 and the others to 0).

Task 5. Find a basis and the coordinates of the vectors in this basis on the set of all vectors in space whose odd coordinates are equal to each other.

Solution. Choose, as in the previous problem, coordinates in space.

Since the odd coordinates are all equal, the free variables uniquely determine a vector from the set and are therefore its coordinates. The corresponding basis consists of the vectors obtained by setting one free variable to 1 and the others to 0.

Task 6. Find a basis and the coordinates of the vectors in this basis on the set of all matrices of the given form, where the entries are arbitrary numbers.

Solution. Each matrix from the set is uniquely representable as a linear combination of fixed matrices, with its arbitrary entries as the coefficients.

This relation is precisely the expansion of a vector of the set with respect to the basis formed by those fixed matrices, the arbitrary entries serving as the coordinates.

Task 7. Find the dimension and basis of the linear hull of a system of vectors

.

Solution. Using EPS, we transform the matrix of the coordinates of the system's vectors to step-triangular form.

The pivot columns of the resulting matrix are linearly independent, and the remaining columns are linearly expressed through them. Therefore, the vectors corresponding to the pivot columns form a basis of the linear hull, and the dimension of the linear hull equals the number of those columns.

Comment. A basis of the linear hull is not chosen uniquely. For example, other suitable vectors of the system also form a basis of it.