namespace Eigen {

/** \eigenManualPage LeastSquares Solving linear least squares systems

This page describes how to solve linear least squares systems using %Eigen. An overdetermined system
of equations, say \a Ax = \a b, generally has no exact solution. In this case, it makes sense to search
for the vector \a x which is closest to being a solution, in the sense that the difference \a Ax - \a b
is as small as possible. This \a x is called the least squares solution (if the Euclidean norm is used).

The three methods discussed on this page are the SVD decomposition, the QR decomposition, and the normal
equations. Of these, the SVD decomposition is generally the most accurate but the slowest, the normal
equations are the fastest but the least accurate, and the QR decomposition is in between.

\eigenAutoToc

\section LeastSquaresSVD Using the SVD decomposition
The \link BDCSVD::solve() solve() \endlink method in the BDCSVD class can be directly used to
solve linear least squares systems. It is not enough to compute only the singular values (the default
for this class); you also need the singular vectors, but the thin SVD decomposition suffices for
computing least squares solutions:

<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
  <td>\include TutorialLinAlgSVDSolve.cpp </td>
  <td>\verbinclude TutorialLinAlgSVDSolve.out </td>
</tr>
</table>

This is the example from the page \link TutorialLinearAlgebra Linear algebra and decompositions \endlink.
If you just need to solve the least squares problem, but are not interested in the SVD per se, a
faster alternative method is CompleteOrthogonalDecomposition.
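As an illustration, a minimal sketch of that alternative could look like the following (the matrix sizes
and random data are arbitrary, chosen only for this example):

\code
#include <Eigen/Dense>
#include <iostream>

int main() {
  // An overdetermined system: more equations (rows) than unknowns (columns).
  Eigen::MatrixXf A = Eigen::MatrixXf::Random(10, 3);
  Eigen::VectorXf b = Eigen::VectorXf::Random(10);

  // CompleteOrthogonalDecomposition is rank-revealing; solve() returns the least squares solution.
  Eigen::VectorXf x = A.completeOrthogonalDecomposition().solve(b);

  std::cout << "The least squares solution is:\n" << x << std::endl;
}
\endcode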
\section LeastSquaresQR Using the QR decomposition

The solve() method in QR decomposition classes also computes the least squares solution. There are
three QR decomposition classes: HouseholderQR (no pivoting, fast but unstable if your matrix is
not full rank), ColPivHouseholderQR (column pivoting, thus a bit slower but more stable) and
FullPivHouseholderQR (full pivoting, so slowest and slightly more stable than ColPivHouseholderQR).
Here is an example with column pivoting:

<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
  <td>\include LeastSquaresQR.cpp </td>
  <td>\verbinclude LeastSquaresQR.out </td>
</tr>
</table>
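If you know beforehand that \a A has full rank and is well conditioned, the unpivoted HouseholderQR
mentioned above can be used in exactly the same way; the following is only a sketch with arbitrary
random data:

\code
#include <Eigen/Dense>
#include <iostream>

int main() {
  // A tall matrix that is assumed to have full column rank.
  Eigen::MatrixXd A = Eigen::MatrixXd::Random(20, 4);
  Eigen::VectorXd b = Eigen::VectorXd::Random(20);

  // No pivoting: the fastest of the QR variants, but only reliable when A has full rank.
  Eigen::VectorXd x = A.householderQr().solve(b);

  std::cout << "The least squares solution is:\n" << x << std::endl;
}
\endcode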
\section LeastSquaresNormalEquations Using normal equations

Finding the least squares solution of \a Ax = \a b is equivalent to solving the normal equations
<i>A</i><sup>T</sup><i>Ax</i> = <i>A</i><sup>T</sup><i>b</i>. This leads to the following code:

<table class="example">
<tr><th>Example:</th><th>Output:</th></tr>
<tr>
  <td>\include LeastSquaresNormalEquations.cpp </td>
  <td>\verbinclude LeastSquaresNormalEquations.out </td>
</tr>
</table>
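Written out by hand, this approach forms <i>A</i><sup>T</sup><i>A</i>, which is symmetric positive
semi-definite, and solves the resulting system with a Cholesky-type factorization. A sketch along those
lines, with arbitrary random data, might be:

\code
#include <Eigen/Dense>
#include <iostream>

int main() {
  // A "tall and skinny" system, where the normal equations are typically fastest.
  Eigen::MatrixXd A = Eigen::MatrixXd::Random(50, 5);
  Eigen::VectorXd b = Eigen::VectorXd::Random(50);

  // A^T * A is symmetric positive semi-definite, so its LDLT (Cholesky-type)
  // factorization can be used to solve the normal equations.
  Eigen::VectorXd x = (A.transpose() * A).ldlt().solve(A.transpose() * b);

  std::cout << "The least squares solution is:\n" << x << std::endl;
}
\endcode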
This method is usually the fastest, especially when \a A is "tall and skinny". However, if the
matrix \a A is even mildly ill-conditioned, this is not a good method, because the condition number
of <i>A</i><sup>T</sup><i>A</i> is the square of the condition number of \a A. This means that you
lose roughly twice as many digits of accuracy using the normal equations, compared to the more stable
methods mentioned above.

*/

}