In linear algebra, a QR decomposition (also called a QR factorization) of a matrix A is a factorization of A into a product A=QR of an orthogonal matrix Q and an upper triangular matrix R. QR decomposition is often used to solve the linear least squares problem, and is the basis for a particular eigenvalue algorithm, the QR algorithm.
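As a minimal sketch of the least-squares use case (not from the original text; it uses NumPy's numpy.linalg.qr and a small random system as a stand-in for A and b):

```python
import numpy as np

# A small overdetermined system A x ≈ b, used only as a stand-in example.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

# Reduced QR factorization: Q has orthonormal columns, R is upper triangular.
Q, R = np.linalg.qr(A)

# Least squares via QR: minimize ||A x - b|| by solving the triangular
# system R x = Q^T b.
x = np.linalg.solve(R, Q.T @ b)

# Sanity check against NumPy's direct least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x, x_ref)
```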
If A has n linearly independent columns, then the first n columns of Q form an orthonormal basis for the column space of A. More specifically, the first k columns of Q form an orthonormal basis for the span of the first k columns of A for any 1≤k≤n.[1] The fact that any column k of A depends only on the first k columns of Q is responsible for the triangular form of R. One way to see this column-by-column structure is the Gram–Schmidt construction sketched below.
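The following is an illustrative classical Gram–Schmidt sketch (one common way to build Q and R; the function name is hypothetical and the routine is not numerically robust). It makes the triangular structure visible: column k of A is expressed using only q_1, ..., q_k.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt QR: illustrative sketch, not numerically robust."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for k in range(n):
        v = A[:, k].copy()
        # Column k of A is projected only onto the previously built q_0..q_{k-1},
        # so R has no entries below the diagonal: R is upper triangular.
        for j in range(k):
            R[j, k] = Q[:, j] @ A[:, k]
            v -= R[j, k] * Q[:, j]
        R[k, k] = np.linalg.norm(v)
        Q[:, k] = v / R[k, k]
    return Q, R

# The first k columns of Q span the same subspace as the first k columns of A.
A = np.array([[1., 1., 0.], [1., 0., 1.], [0., 1., 1.]])
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A)          # A = QR
assert np.allclose(Q.T @ Q, np.eye(3))  # columns of Q are orthonormal
```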
http://en.wikipedia.org/wiki/QR_decomposition