Gaussian Elimination: Backsubstitution

Gaussian Elimination with back-substitution is faster and less involved than Gauss–Jordan elimination. It uses partial pivoting, i.e. pivot rows are chosen using row interchanges only; as a result the order of the solution and variable vectors remains unchanged. This avoids the bookkeeping overhead of column swapping.

Basic Steps

Let us consider a matrix:

$\left[ \begin{array}{cccc} a_{00} & a_{01} & a_{02} & a_{03} \\ a_{10} & a_{11} & a_{12} & a_{13} \\ a_{20} & a_{21} & a_{22} & a_{23} \\ a_{30} & a_{31} & a_{32} & a_{33}\end{array} \right]$

We follow a procedure similar to, but simpler than, the Gauss–Jordan method.

Step 1: Upper Triangular Matrix

Only the elements below the pivot element are reduced to zero, by subtracting the appropriate multiple of the pivot row from each row beneath it.

After iterating over each pivot element we get an upper triangular matrix:

$\left[ \begin{array}{cccc} a_{00}^\prime & a_{01}^\prime & a_{02}^\prime & a_{03}^\prime \\ 0 & a_{11}^\prime & a_{12}^\prime & a_{13}^\prime \\ 0 & 0 & a_{22}^\prime & a_{23} ^\prime\\ 0 & 0 & 0 & a_{33}^\prime\end{array} \right] \cdot \left[ \begin{array}{c} x_0 \\ x_1 \\ x_2 \\ x_3\end{array} \right] = \left[ \begin{array}{c} b_0^\prime \\ b_1^\prime \\ b_2^\prime \\ b_3^\prime\end{array} \right]$

Each element in the above equation carries a prime, signifying that it has changed during the transforms.
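This elimination step can be sketched in code. The following is a minimal illustration (the function name `forward_eliminate` is ours, and NumPy is assumed for the array arithmetic); it performs partial pivoting by row interchange and zeroes the elements below each pivot, returning the upper triangular system:

```python
import numpy as np

def forward_eliminate(A, b):
    """Reduce A to upper triangular form with partial pivoting,
    applying the same row operations to b (illustrative sketch)."""
    A = np.asarray(A, dtype=float).copy()
    b = np.asarray(b, dtype=float).copy()
    n = len(b)
    for k in range(n - 1):
        # Partial pivoting: bring the row with the largest |entry|
        # in column k up to the pivot position (rows only, so the
        # variable order is unchanged).
        p = k + int(np.argmax(np.abs(A[k:, k])))
        if p != k:
            A[[k, p]] = A[[p, k]]
            b[[k, p]] = b[[p, k]]
        # Subtract the right multiple of the pivot row from each
        # row below it, zeroing the elements under the pivot.
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    return A, b
```

Because only the rows below the pivot are touched, the result is upper triangular rather than the identity matrix that Gauss–Jordan produces.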

Step 2: Back-Substitution

The name back-substitution comes from the fact that the last equation involves only a single variable and is trivial to solve.

$a_{33}^\prime x_3 = b_3^\prime$

This value, $x_3 = b_3^\prime / a_{33}^\prime$, can be “back-substituted” into the previous equation to get the value of $x_2$

$a_{22}^\prime x_2 + a_{23}^\prime x_3 = b_2^\prime$

which further gives,

$x_2 = \frac{1}{a_{22}^\prime}\left[b_2^\prime - a_{23}^\prime x_3\right]$

In general, back-substitution can be written as:

$x_i = \frac{1}{a_{ii}^\prime}\left[b_i^\prime - \sum_{j = i + 1}^{N - 1}a_{ij}^\prime x_j\right]$
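The formula above translates almost directly into code. As a sketch (the function name `back_substitute` is ours; NumPy is assumed), solving the triangular system from the last equation upward:

```python
import numpy as np

def back_substitute(U, c):
    """Solve U x = c for upper triangular U, working backward:
    x_i = (c_i - sum_{j > i} U[i,j] * x_j) / U[i,i]."""
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # U[i, i+1:] @ x[i+1:] is the sum over the already-solved
        # variables x_{i+1}, ..., x_{N-1}.
        x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x
```

For example, for $U = \left[\begin{smallmatrix}2 & 1 \\ 0 & 3\end{smallmatrix}\right]$ and $c = (5, 6)$, it first solves $3 x_1 = 6$, then substitutes back to get $2 x_0 + x_1 = 5$, giving $x = (1.5, 2)$.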

Performance Considerations

Strictly in terms of complexity, both Gauss–Jordan elimination and Gaussian elimination with back-substitution are $O(N^3)$ algorithms. The latter is faster because its innermost loops perform fewer operations. The difference can be attributed to the full reduction in Gauss–Jordan, where all rows are reduced at every pivot, as opposed to only the rows below the pivot (leaving a triangular matrix) in Gaussian elimination with back-substitution. This reduces the number of multiplications (of order $N^3$) and additions (of order $N^2 M$, for $M$ right-hand sides) by a factor of 3. The factor drops to 1.5 if we avoid computing the inverse in Gauss–Jordan elimination.

Tryst at Blogs..

Aside

I am going to try my hand at blogging. I am a tech and math geek, so my blog will be themed around that.