Linear combinations, span, and basis vectors
A set B of vectors in a vector space V is called a basis if every element of V may be written in a unique way as a finite linear combination of elements of B.
The coefficients of this linear combination are referred to as components or coordinates of the vector with respect to B.
The elements of a basis are called basis vectors.
Equivalently, a set B is a basis if its elements are linearly independent and every element of V is a linear combination of elements of B.[1] In other words, a basis is a linearly independent spanning set.
A vector space can have several bases; however, all bases have the same number of elements, called the dimension of the vector space.
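For instance, in R<sup>2</sup> the uniqueness of the coordinates with respect to a basis can be checked numerically: stacking the basis vectors as the columns of an invertible matrix B, the coordinate vector of any v is the unique solution of Bc = v. A minimal sketch in Python with NumPy (the particular basis and vector below are illustrative choices, not taken from the article):

<syntaxhighlight lang="python">
import numpy as np

# An illustrative basis of R^2: the columns of B are the basis vectors.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

v = np.array([3.0, 4.0])

# Because the basis vectors are linearly independent, B is invertible and
# the coordinate vector c (satisfying B @ c == v) is unique.
c = np.linalg.solve(B, v)

print("coordinates of v with respect to B:", c)   # [1. 2.]
print("reconstruction B @ c:", B @ c)             # [3. 4.]
</syntaxhighlight>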
====How do we know if a vector is linearly dependent?====
{{#ev:youtube|https://www.youtube.com/watch?v=k7RM-ot2NWY|250|left|Let eet GO|frame||||start=0&end=100&loop=1}}
Given a set of vectors, we can determine whether they are linearly independent by writing the vectors as the columns of a matrix A and solving Ax = 0. If there is any non-zero solution x, the vectors are linearly dependent, because such an x gives a nontrivial linear combination of the columns that equals the zero vector. If the only solution is x = 0, no such combination exists and the vectors are linearly independent.
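The sketch below illustrates this test in Python with NumPy (the helper name and the example vectors are illustrative assumptions, not part of the article): the vectors are stacked as the columns of A, and Ax = 0 has a non-zero solution exactly when the rank of A is smaller than the number of columns.

<syntaxhighlight lang="python">
import numpy as np

def linearly_independent(vectors):
    """Return True if the given vectors are linearly independent,
    i.e. Ax = 0 has only the trivial solution x = 0."""
    A = np.column_stack(vectors)   # the vectors become the columns of A
    # A non-zero solution of Ax = 0 exists exactly when rank(A) < number of columns.
    return np.linalg.matrix_rank(A) == A.shape[1]

# Independent: neither vector is a multiple of the other.
print(linearly_independent([np.array([1.0, 0.0]),
                            np.array([0.0, 1.0])]))   # True

# Dependent: the third vector is the sum of the first two.
print(linearly_independent([np.array([1.0, 0.0]),
                            np.array([0.0, 1.0]),
                            np.array([1.0, 1.0])]))   # False
</syntaxhighlight>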
Keywords: span, basis of a vector space, linearly dependent, linearly independent