Matrix Rank Calculator

Find the rank of a matrix using row echelon form with step-by-step row reduction.


Understanding Matrix Rank

The rank of a matrix is the dimension of the vector space spanned by its columns (or equivalently, its rows). It represents the maximum number of linearly independent column vectors (or row vectors) in the matrix. The rank gives us essential information about the system of linear equations the matrix represents.

How to Find Matrix Rank

The most common method for finding the rank of a matrix is row reduction to row echelon form (REF). The process involves applying elementary row operations until each non-zero row has its leading entry strictly to the right of the leading entry in the row above, then counting the non-zero rows: rank = number of non-zero rows in REF.

Elementary Row Operations

  • Row Swap: Interchange two rows (Ri ↔ Rj).
  • Scalar Multiplication: Multiply a row by a non-zero scalar (kRi).
  • Row Addition: Add a scalar multiple of one row to another (Ri + kRj).
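The procedure above can be sketched in pure Python. This is an illustrative implementation (the function name `matrix_rank` is my own); it uses exact `Fraction` arithmetic to sidestep floating-point pivot-tolerance issues on small integer matrices:

```python
from fractions import Fraction

def matrix_rank(rows):
    """Rank via reduction to row echelon form (REF)."""
    m = [[Fraction(x) for x in row] for row in rows]
    n_rows = len(m)
    n_cols = len(m[0]) if n_rows else 0
    pivot_row = 0
    for col in range(n_cols):
        # Find a row at or below pivot_row with a non-zero entry
        # in this column, and swap it up (Ri <-> Rj).
        pivot = next((r for r in range(pivot_row, n_rows) if m[r][col] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        # Eliminate the entries below the pivot (Ri + k*Rj).
        for r in range(pivot_row + 1, n_rows):
            k = -m[r][col] / m[pivot_row][col]
            m[r] = [a + k * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return pivot_row  # = number of non-zero rows in REF

print(matrix_rank([[1, 2, 3], [2, 4, 6], [1, 0, 1]]))  # → 2
```

In the example, the second row is twice the first, so only two rows survive elimination and the rank is 2.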

Key Properties of Matrix Rank

Rank-Nullity Theorem

For an m × n matrix A, the rank and the nullity (the dimension of the null space) always sum to n, the number of columns.

rank(A) + nullity(A) = n
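A quick numerical check of the theorem, assuming NumPy is available (the nullity is counted here as the number of near-zero singular values, with an illustrative tolerance of 1e-10):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])  # 2x3; the second row is twice the first

rank = np.linalg.matrix_rank(A)

# Nullity = n minus the number of non-negligible singular values.
n = A.shape[1]
s = np.linalg.svd(A, compute_uv=False)
nullity = n - np.count_nonzero(s > 1e-10)

print(rank, nullity, n)  # → 1 2 3, and indeed rank + nullity == n
```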

Rank Bounds

The rank of an m × n matrix is at most min(m, n).

rank(A) ≤ min(m, n)
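For instance (a sketch assuming NumPy), a 5 × 3 matrix can have rank at most 3, and may have less:

```python
import numpy as np

A = np.arange(15, dtype=float).reshape(5, 3)  # a 5x3 matrix
r = np.linalg.matrix_rank(A)

# Each column differs from the previous by a constant vector of ones,
# so this particular matrix only reaches rank 2, below the bound of 3.
print(r, min(A.shape))  # → 2 3
assert r <= min(A.shape)
```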

Full Rank

A square n × n matrix has full rank (rank = n) if and only if it is invertible.

Full rank ⇔ det(A) ≠ 0
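This equivalence can be checked numerically (assuming NumPy; the 1e-10 tolerance for "zero determinant" is illustrative):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])  # det = 5, so A is invertible
B = np.array([[1., 2.],
              [2., 4.]])  # det = 0: the second row is twice the first

# Full rank (rank == n) coincides with a non-zero determinant.
assert np.linalg.matrix_rank(A) == 2 and abs(np.linalg.det(A)) > 1e-10
assert np.linalg.matrix_rank(B) == 1 and abs(np.linalg.det(B)) < 1e-10
```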

Transpose Property

The rank of a matrix equals the rank of its transpose.

rank(A) = rank(Aᵀ)
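A one-line check with NumPy (as a sketch; any matrix works here):

```python
import numpy as np

A = np.array([[1., 0., 2.],
              [0., 1., 1.]])
# Row rank equals column rank, so transposing never changes the rank.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)  # both 2
```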

Applications of Matrix Rank

Matrix rank is fundamental in linear algebra and has applications in solving systems of linear equations, determining the dimension of solution spaces, data compression (via low-rank approximations), computer graphics transformations, and machine learning (principal component analysis). A system Ax = b has a solution if and only if rank(A) = rank([A|b]).
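The solvability criterion at the end of that paragraph (the Rouché–Capelli theorem) translates directly into code. A minimal sketch assuming NumPy, with `is_consistent` an illustrative helper name:

```python
import numpy as np

def is_consistent(A, b):
    """Ax = b is solvable iff rank(A) == rank of the augmented matrix [A|b]."""
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.hstack([A, b]))

A = np.array([[1., 2.],
              [2., 4.]])           # rank 1
b_good = np.array([[3.], [6.]])    # lies in the column space of A
b_bad = np.array([[3.], [5.]])     # does not, so [A|b] gains a rank

print(is_consistent(A, b_good), is_consistent(A, b_bad))  # → True False
```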

Rank and Linear Independence

If the rank of a matrix equals the number of its columns, then its column vectors are linearly independent. This is crucial for determining whether a set of vectors forms a basis for a vector space, and for understanding the structure of solutions to linear systems.
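This independence test is mechanical: stack the vectors as columns and compare the rank with the column count. A sketch assuming NumPy:

```python
import numpy as np

# Independent set: rank equals the number of columns.
V = np.column_stack([[1., 0., 1.], [0., 1., 1.], [1., 1., 0.]])
assert np.linalg.matrix_rank(V) == V.shape[1]

# Dependent set: the third vector is the sum of the first two,
# so the rank falls short of the column count.
W = np.column_stack([[1., 0., 1.], [0., 1., 1.], [1., 1., 2.]])
assert np.linalg.matrix_rank(W) < W.shape[1]
```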