help numerical analysis

shelly89

For any integer value of n, define F(n) as the combined operation count of forward and back substitution and define G(n) as the operation count of Gaussian elimination.
When n gets larger and larger, why does F(n) grow much more slowly than G(n) does?


I have no idea why. Can anyone point me in the right direction?

Thank you.
 
I'm not sure if this is what you are asking, but I'll take a cut at it from the standpoint of someone who last worried about these kinds of things some time ago. Let's assume we want to solve a system of n equations in n unknowns and that the matrix is not sparse. The Gaussian elimination work needed to zero out row j (j = 2, 3, 4, ..., n) in column i (i = 1, 2, 3, ..., n-1) is one divide and n-i+1 multiplies (plus n-i+1 adds, but since adds are so cheap compared to multiplies and divides they are generally ignored unless they dominate in number). Thus for row j we have roughly n-1 divides, and the number of multiplies is roughly
\(\displaystyle MT_j = \sum_{i=1}^{n-1} (n-i+1) = (n^2-1) - \sum_{i=1}^{n-1} i = \frac{n^2+n-2}{2},\)
so over the n-1 rows the total is about \(\displaystyle (n-1)\,MT_j \sim 0.5\, n^3\) multiplies.
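If it helps to see that count fall out of an actual elimination loop, here is a minimal sketch in plain Python (the function name and the counter bookkeeping are just mine, nothing standard) that tallies the multiplies and divides while reducing a dense system, with no pivoting, so it assumes the diagonal entries stay nonzero:

```python
def forward_elimination_counts(A, b):
    """Reduce A x = b to upper-triangular form in place,
    counting the multiplies and divides along the way."""
    n = len(A)
    mults = divs = 0
    for i in range(n - 1):                # pivot column i
        for j in range(i + 1, n):         # rows below the pivot
            m = A[j][i] / A[i][i]         # one divide per eliminated entry
            divs += 1
            for k in range(i + 1, n):     # update the rest of row j ...
                A[j][k] -= m * A[i][k]
                mults += 1
            b[j] -= m * b[i]              # ... plus the right-hand side
            mults += 1
            A[j][i] = 0.0
    return mults, divs
```

For each (row j, column i) pair this does exactly the one divide and n-i+1 multiplies described above.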
Once the system has been reduced to upper-triangular form we can do back substitution to compute the unknowns. For the j-th unknown (j = n, n-1, n-2, ..., 1) we have one divide and n-j multiplies (and some adds/subtracts). The total number of multiplies is
\(\displaystyle MB = \sum_{j=1}^{n} (n-j) = n^2 - \sum_{j=1}^{n} j = \frac{n^2-n}{2} \sim 0.5\, n^2.\)
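And a matching sketch for the back-substitution pass, again just counting operations (illustrative only, not a library routine):

```python
def back_substitution_counts(U, b):
    """Solve an upper-triangular system U x = b,
    counting the multiplies and divides."""
    n = len(U)
    x = [0.0] * n
    mults = divs = 0
    for j in range(n - 1, -1, -1):        # last unknown first
        s = b[j]
        for k in range(j + 1, n):         # one multiply per already-known unknown
            s -= U[j][k] * x[k]
            mults += 1
        x[j] = s / U[j][j]                # one divide per unknown
        divs += 1
    return x, mults, divs
```

Summing the inner-loop multiplies over all the unknowns gives exactly n(n-1)/2, the MB count above.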

Since the divides are not what dominates the work in either phase, the difference in solution time comes down to the number of multiplications, with Gaussian elimination being on the order of \(n^3\) [G(n) ~ n^3] and back substitution on the order of \(n^2\) [F(n) ~ n^2]. (Forward substitution, if you are solving via an LU factorization, costs the same order \(n^2\), so the combined F(n) is still ~ n^2.)
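To make the growth rates concrete, a throwaway driver like the one below (my own naming again; the diagonal boost is only there to keep the pivots safely nonzero) runs the two sketches above for a few sizes. The exact elimination count comes out close to n^3/3, somewhat below the rough half-n-cubed estimate, but still cubic, while the back-substitution count tracks n^2/2, so the ratio G(n)/F(n) keeps growing roughly like n:

```python
import random

if __name__ == "__main__":
    for n in (10, 50, 100, 200):
        # diagonally dominant random system so no pivoting is needed
        A = [[random.random() + (n if r == c else 0.0) for c in range(n)] for r in range(n)]
        b = [random.random() for _ in range(n)]
        ge_mults, ge_divs = forward_elimination_counts(A, b)
        x, bs_mults, bs_divs = back_substitution_counts(A, b)
        print(f"n={n:4d}  elimination multiplies={ge_mults:9d} (n^3/3 ~ {n**3 // 3})"
              f"  back-substitution multiplies={bs_mults:7d} (n^2/2 ~ {n**2 // 2})")
```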
 