I'm not sure if this is what you are asking, but I'll take a cut at it from the standpoint of someone who last worried about this kind of thing some time ago. Let's assume we want to solve a system of n equations in n unknowns, and assume the matrix is not sparse. The number of Gaussian elimination operations one must do at row j (j = 2, 3, 4, ..., n) for column i (i = 1, 2, 3, ..., n-1) is one divide and n-i+1 multiplies (and n-i+1 adds, but since adds are so cheap compared to multiplies and divides they are generally ignored if there isn't a dominating number of them). Thus for row j we have n-1 divides, and summing the multiplies over all n-1 rows gives a total of
$$M_T = (n-1)\left(n^2 - 1 - \sum_{i=1}^{n-1} i\right) = (n-1)\,\frac{n^2+n-2}{2} \sim 0.5\,n^3$$
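As a sanity check on that closed form, here is a minimal sketch (Python, my choice; the function name `count_elimination_ops` is just illustrative) that tallies the divides and multiplies exactly as described above, row by row and column by column, without actually performing the elimination, and compares the tally with the formula:

```python
# Sketch: brute-force tally of the operation counts described above.
# For each row j = 2..n and each column i = 1..n-1 we charge
# 1 divide and (n - i + 1) multiplies, then compare with the closed form.

def count_elimination_ops(n):
    divides = 0
    multiplies = 0
    for j in range(2, n + 1):          # rows j = 2, 3, ..., n
        for i in range(1, n):          # columns i = 1, 2, ..., n-1
            divides += 1               # one divide per (row, column) pair
            multiplies += n - i + 1    # n-i+1 multiplies per (row, column) pair
    return divides, multiplies

n = 50
divides, multiplies = count_elimination_ops(n)
closed_form = (n - 1) * (n**2 + n - 2) // 2
print(multiplies, closed_form, 0.5 * n**3)   # tally, exact formula, 0.5 n^3 estimate
```

For n = 50 the tally and the exact formula agree, and both sit close to the 0.5 n^3 estimate.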
Once the system has been triangularized we can do back substitution to compute the unknowns. For the j-th unknown (j = n, n-1, n-2, ..., 1) we have one divide and n-j multiplies (and some adds and subtracts). The number of multiplies is
$$M_B = \sum_{j=1}^{n}(n-j) = n^2 - \sum_{j=1}^{n} j = \frac{n^2-n}{2} \sim 0.5\,n^2$$
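For a concrete check, here is a minimal back-substitution sketch (Python again; the variable names and the random test system are just illustrative) that solves an upper triangular system while counting multiplies and divides; the multiply counter comes out to exactly (n^2 - n)/2, with n divides:

```python
# Sketch: back substitution on an upper triangular system U x = b,
# counting multiplies and divides to check the (n^2 - n)/2 estimate.
import random

def back_substitute(U, b):
    n = len(b)
    x = [0.0] * n
    multiplies = 0
    divides = 0
    for j in range(n - 1, -1, -1):     # unknowns from last to first (0-based here)
        s = b[j]
        for k in range(j + 1, n):      # n-j multiplies against already-known x's
            s -= U[j][k] * x[k]
            multiplies += 1
        x[j] = s / U[j][j]             # one divide per unknown
        divides += 1
    return x, multiplies, divides

n = 50
U = [[random.random() if k >= j else 0.0 for k in range(n)] for j in range(n)]
for j in range(n):
    U[j][j] += n                       # keep the diagonal well away from zero
b = [random.random() for _ in range(n)]

x, multiplies, divides = back_substitute(U, b)
print(multiplies, (n * n - n) // 2, divides)   # tally, closed form, n divides
```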
Since the number of divides is about the same, the difference in solution time is in the number of multiplication operations, with Gaussian elimination being on the order of n^3 [G(n) ~ n^3] and back substitution on the order of n^2 [F(n) ~ n^2].
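To put a rough number on that gap (using the 0.5 n^3 and 0.5 n^2 estimates above, just as an illustration): for n = 1000 the elimination costs about 5×10^8 multiplies while the back substitution costs about 5×10^5, a factor of n apart.

```python
# Rough comparison of the two estimates, G(n) ~ 0.5 n^3 vs F(n) ~ 0.5 n^2.
for n in (10, 100, 1000):
    G = 0.5 * n**3      # elimination multiplies (estimate from above)
    F = 0.5 * n**2      # back-substitution multiplies (estimate from above)
    print(n, G, F, G / F)   # the ratio G/F is just n
```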