CHAPTER 09.08: ADEQUACY OF SOLUTIONS:
Relating changes in the coefficient matrix to changes in the solution vector: Proof

In this segment we will prove the theorem that relates the relative change in the solution vector to the relative change in the coefficient matrix, as measured by their norms. We are given the set of equations AX = C. What we want to show is that if we make a small change in the coefficient matrix, denoted by delta A, it will result in a change in the solution vector, denoted by delta X, and that the relative change in the solution vector is related to the relative change in the coefficient matrix through the quantity norm of A times norm of A inverse. The relative change in the solution can be amplified by as much as this number, the norm of A times the norm of A inverse, which is also called the condition number of the matrix.
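Written out in symbols, the bound we are about to prove is

\[
\frac{\lVert \Delta X \rVert}{\lVert X + \Delta X \rVert} \le \lVert A \rVert \, \lVert A^{-1} \rVert \, \frac{\lVert \Delta A \rVert}{\lVert A \rVert}.
\]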
So let's go ahead and see how we can prove this. Let A times X be equal to C; this is the given set of equations. Now we make a change to the coefficient matrix: we change A to A prime. We know that if we keep the right-hand side the same, this will result in a change in the solution vector, which we call X prime. If that's the case, then let's define two different matrices. One matrix we will define as delta A, where delta A is A prime minus A; it is the coefficient matrix we changed to, minus the original A matrix. And delta X will be defined as X prime minus X; it is the difference between what X got changed to and the original solution vector.
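In symbols, the setup and the two difference matrices just defined are

\[
A X = C, \qquad A' X' = C, \qquad \Delta A = A' - A, \qquad \Delta X = X' - X.
\]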
Based on this, let's go ahead and figure out the relationship between the relative change in X and the relative change in the A matrix. We know that A times X is equal to C, and so is A prime times X prime equal to C. That means these two products are the same, so I can write that A times X is equal to A prime times X prime. But we know that A prime is nothing but A plus delta A, the original matrix plus the change we made to it, and we know that X prime is nothing but X plus delta X. Based on this, we can use the distributive property of matrix multiplication: the right-hand side becomes A times X, plus delta A times X, plus A times delta X, plus delta A times delta X. We are simply expanding the product of the two matrices. Now, since the A times X term on the right is the same as the A times X on the left, it cancels out. Then we bring the A times delta X term to the left-hand side; that becomes minus A times delta X, and it is equal to delta A times X plus delta A times delta X. That is the same as saying delta A times the quantity X plus delta X, because we can take a common delta A out of the two terms.
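Collecting those steps in one place, the algebra so far is

\[
A X = A' X' = (A + \Delta A)(X + \Delta X) = A X + \Delta A \, X + A \, \Delta X + \Delta A \, \Delta X,
\]

and cancelling A X from both sides and moving A delta X to the left gives

\[
-A \, \Delta X = \Delta A \, X + \Delta A \, \Delta X = \Delta A \, (X + \Delta X).
\]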
Now that we have this equation, we can multiply both sides by A inverse. On the left I get A inverse times A times delta X. We are assuming that the inverse does exist; only then will we have a unique solution. So, assuming that A inverse exists for this set of equations, we multiply both sides by A inverse, and on the right I get A inverse times delta A times the quantity X plus delta X. The reason we are doing this is that we are trying to isolate delta X, and then we are going to apply the properties of norms to complete the proof of our theorem. Now, A inverse times A gives us the identity matrix, so the left side becomes minus I times delta X, and since the identity matrix times any other matrix, such as delta X, equals that matrix itself, we get minus delta X equals A inverse times delta A times the quantity X plus delta X.
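Written out, that step is

\[
A^{-1} (-A \, \Delta X) = A^{-1} \, \Delta A \, (X + \Delta X) \quad \Longrightarrow \quad -\Delta X = A^{-1} \, \Delta A \, (X + \Delta X).
\]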
Now, we already know from the properties of norms that if we take the norm of the product on the right-hand side, it has to be less than or equal to the product of the three individual norms, that is, the norm of A inverse times the norm of delta A times the norm of the quantity X plus delta X. Since the norm of minus delta X is the same as the norm of delta X, what we get is that the norm of delta X is less than or equal to the norm of A inverse times the norm of delta A times the norm of X plus delta X. We are dropping the brackets around X plus delta X just for simplicity; the quantities are still matrices, but their norms are simply single positive scalars.
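In symbols, using the property that the norm of a product is at most the product of the norms, this step is

\[
\lVert \Delta X \rVert \le \lVert A^{-1} \rVert \, \lVert \Delta A \rVert \, \lVert X + \Delta X \rVert.
\]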
Now what we are going to do is multiply both sides by the norm of A. Again, the norm of A is not going to be zero, because if it were zero, then the matrix would not have an inverse. So we get that the norm of A times the norm of delta X is less than or equal to the norm of A times the norm of A inverse times the norm of delta A times the norm of X plus delta X. Keep in mind that what we are doing here is multiplying by a scalar, the norm of A; even if it were zero, the inequality would still hold, but we know the norm of A is not zero, because if it were, then A inverse would not exist.
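Multiplying both sides by the nonzero scalar norm of A gives

\[
\lVert A \rVert \, \lVert \Delta X \rVert \le \lVert A \rVert \, \lVert A^{-1} \rVert \, \lVert \Delta A \rVert \, \lVert X + \Delta X \rVert.
\]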
So now we do a little bit of manipulation: we take the norm of X plus delta X to the left-hand side and divide both sides by the norm of A. Keep in mind that the direction of the inequality will not change, because we are only multiplying and dividing by positive numbers, so it does not flip from less-than-or-equal-to to greater-than-or-equal-to. From here I get that the norm of delta X divided by the norm of X plus delta X is less than or equal to the norm of A times the norm of A inverse, times the norm of delta A divided by the norm of A. The reason we multiplied both sides by the norm of A was precisely so that the condition number, the norm of A times the norm of A inverse, would appear. So now we have the relative change in the solution vector, which can be amplified by as much as the condition number of the coefficient matrix when we make a change in the coefficient matrix.
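Dividing both sides by the norm of A times the norm of X plus delta X gives the result we set out to prove:

\[
\frac{\lVert \Delta X \rVert}{\lVert X + \Delta X \rVert} \le \lVert A \rVert \, \lVert A^{-1} \rVert \, \frac{\lVert \Delta A \rVert}{\lVert A \rVert}.
\]

As a quick numerical illustration of this bound, here is a minimal sketch in Python, assuming NumPy is available; the 2-by-2 system, the perturbation delta A, and the choice of the infinity norm are made up here purely for illustration.

import numpy as np

# A small made-up 2x2 system A x = c (illustrative values only); the matrix
# is nearly singular, so its condition number is large.
A = np.array([[1.0, 2.0],
              [2.0, 3.999]])
c = np.array([4.0, 7.999])
x = np.linalg.solve(A, c)

# A made-up small perturbation of the coefficient matrix.
dA = np.array([[0.0, 0.0],
               [0.0, 0.002]])
x_prime = np.linalg.solve(A + dA, c)   # solution of the perturbed system
dx = x_prime - x

# Relative change in the solution versus the bound from the theorem,
# using the infinity norm for both matrices and vectors.
lhs = np.linalg.norm(dx, np.inf) / np.linalg.norm(x + dx, np.inf)
cond = np.linalg.norm(A, np.inf) * np.linalg.norm(np.linalg.inv(A), np.inf)
rhs = cond * np.linalg.norm(dA, np.inf) / np.linalg.norm(A, np.inf)

print("relative change in solution:", lhs)
print("condition number times relative change in A:", rhs)
print("bound holds:", lhs <= rhs)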
And that’s the proof of this theorem. And that of course is the end of this
segment.