Monday, February 8, 2010

If the columns of a matrix A are linearly independent, then so are its rows (True or False)?

True or false, and why?
This is false. As a simple example, take the 3x2 matrix whose rows are

1 0,
0 1,
0 0.

Its two columns are linearly independent, but its rows are not: the third row is the zero vector.
More generally, if you add a row of 0s to the bottom of any matrix with linearly independent columns, you get a matrix with linearly independent columns whose rows are linearly dependent.
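Since the row rank and column rank of a matrix are always equal, both conditions can be checked at once by computing the rank. Here is a small sketch in Python; the `rank` helper is a hand-rolled exact Gaussian elimination written for this post, not a library function:

```python
from fractions import Fraction

def rank(rows):
    """Row-reduce a matrix (given as a list of rows) and count the pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        # Find a pivot in column c at or below row r.
        pivot = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# The 3x2 example: independent columns, dependent rows.
A = [[1, 0],
     [0, 1],
     [0, 0]]

cols_independent = rank(A) == len(A[0])  # rank 2 == 2 columns -> True
rows_independent = rank(A) == len(A)     # rank 2 < 3 rows    -> False
print(cols_independent, rows_independent)  # True False
```

The columns are independent exactly when the rank equals the number of columns, and likewise for rows; since there is only one rank, both can hold simultaneously only when the matrix is square.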





If you knew additionally that A were _square_, you _would_ be able to conclude from linear independence of the columns that the rows are linearly independent. (In this case, both conditions are equivalent to the condition that A be invertible.)
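For the square case, one way to see the equivalence concretely is through the determinant: a square matrix is invertible exactly when its determinant is nonzero, and det(A) = det(A^T), so independence of the columns and of the rows stand or fall together. A quick sketch (the `det` helper is a naive cofactor expansion written for illustration):

```python
def det(m):
    """Determinant of a square matrix by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

A = [[2, 1],
     [3, 2]]
At = [list(col) for col in zip(*A)]  # transpose: the rows of At are the columns of A

# Both determinants are nonzero (and equal), so A is invertible:
# its columns and its rows are each linearly independent.
print(det(A), det(At))  # 1 1
```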





A square matrix is the only kind of matrix whose sets of rows and columns can both be linearly independent at the same time. If you have a collection of r vectors in a vector space of dimension s and r > s, then the collection is automatically linearly dependent. If A is an mxn matrix, and m > n, we can apply this observation to conclude that the rows are linearly dependent (being m things in a space of smaller dimension), and if m < n, we can apply this observation to conclude that the columns are linearly dependent (being n things in a space of smaller dimension).
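A tiny concrete instance of the counting argument: any three vectors in a 2-dimensional space are dependent, and here the dependence can be exhibited explicitly (the specific vectors are chosen just for illustration):

```python
# Three rows in a 2-dimensional space must be linearly dependent.
# For these rows, the third is 2*(first) + 3*(second).
rows = [(1, 0), (0, 1), (2, 3)]
combo = tuple(2 * a + 3 * b for a, b in zip(rows[0], rows[1]))
print(combo == rows[2])  # True
```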