# Thread: Proof that any common divisor smaller than the g.c.d. is not a linear combination

1. ## Proof that any common divisor smaller than the g.c.d. is not a linear combination

The basic property of linear combinations is that if there are two integer quantities, say a and d, then multiplying each by an integer and adding the results again yields an integer; i.e. ax + dy ∈ Z, ∀ a, d, x, y ∈ Z. This follows from the closure of the integers under addition and multiplication.

In the g.c.d. computation step d = qa + r, with d (dividend), a (divisor), q (quotient), and r (remainder), there may be many common divisors; but any common divisor smaller than the g.c.d. is not a linear combination of a and r.
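As a concrete sanity check, here is a Python sketch with illustrative values a = 18, b = 12 (so gcd = 6, common divisors 1, 2, 3, 6). Every value ax + by is a multiple of the gcd, so the smaller common divisors 1, 2, 3 never occur as linear combinations, while the gcd itself does.

```python
from math import gcd

a, b = 18, 12          # illustrative values; gcd(18, 12) = 6
g = gcd(a, b)

# Collect every value a*x + b*y over a finite window of multipliers.
combos = {a * x + b * y for x in range(-20, 21) for y in range(-20, 21)}

# Every combination is a multiple of g ...
assert all(v % g == 0 for v in combos)

# ... so the smaller common divisors 1, 2, 3 never appear,
assert all(d not in combos for d in (1, 2, 3))

# but the gcd itself does: 6 = 18*1 + 12*(-1).
assert g in combos
```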

I am unable to construct a proof of the above that is rigorous and addresses the issue directly, rather than via a contradiction-based approach; the attempt below shows where my direct approach fails.

If c is a common divisor of d and a, then d = nc and a = mc for some n, m ∈ Z. So d = qa + r gives nc = qmc + r, i.e. c(n − qm) = r. Also n ≥ qm, as d ≥ qa. So c ∣ r as well. Alternately, the divisibility of r by c can be derived from the linear combination r = d − qa: if c ∣ d and c ∣ qa, then c ∣ (d − qa), and hence c ∣ r.

Also, the g.c.d. computation uses two linear combinations at each step: (i) d = qa + r, (ii) r = d − qa. By (i), any common divisor d′ of a and r is also a divisor of d. Similarly, by (ii), any common divisor d′ of d and a is also a divisor of r.
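This invariant can be checked numerically. The sketch below, with illustrative values d = 42 and a = 30, confirms that one division step d = qa + r leaves the set of common divisors unchanged.

```python
# One division step d = q*a + r preserves the set of common divisors:
# divisors(d) & divisors(a) == divisors(a) & divisors(r).
def divisors(n):
    n = abs(n)
    return {k for k in range(1, n + 1) if n % k == 0}

d, a = 42, 30                # illustrative values
q, r = divmod(d, a)          # d = q*a + r, here 42 = 1*30 + 12

common_before = divisors(d) & divisors(a)
common_after = divisors(a) & divisors(r)
assert common_before == common_after   # both are {1, 2, 3, 6}
```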

2. What is the exact proposition to be proved? Why is a proof by contradiction unacceptable?

3. Originally Posted by JeffM
What is the exact proposition to be proved? Why is a proof by contradiction unacceptable?
I view it as a fundamental case that calls for proof-constructing skills. I know that it is an involved proof, and by involved I mean that it will require a lot of nitty-gritty detail. But this is what I need to become better at writing proofs. Also, I want to be sure that there is no alternate way (apart from contradiction) in such basic, almost axiomatic, cases.

4. You still have not stated fully and exactly what is the proposition to be proved.

5. Originally Posted by JeffM
You still have not stated fully and exactly what is the proposition to be proved.
What I want to prove may involve circular reasoning, as far as my understanding goes. I want to prove (i) that there can be common divisors smaller than any positive linear combination, and (ii) that any common divisor smaller than the g.c.d. is not a linear combination.

My understanding is that some logic as below, if developed could achieve the goal:

If I take two integers (the smallest possible case), say a and b, then their linear combination (with any integers x, y) is given by ax + by = c. c is not guaranteed to be a divisor of a and b unless it is the g.c.d. Moreover, for the value c to be attainable as a linear combination, it must be a positive or negative multiple of the g.c.d.

The issue is that I cannot prove that if d is a common divisor of a and b, then d cannot be a linear combination. However, it is obvious that d must divide every linear combination, including the g.c.d.

Finally, the proof should conclude that the common divisors of a and b all divide the g.c.d., so none of them exceeds it.
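That every common divisor divides the g.c.d. can be illustrated with the extended Euclidean algorithm, which produces x, y with ax + by = gcd(a, b); a sketch with illustrative values follows.

```python
# Extended Euclidean algorithm: returns (g, x, y) with a*x + b*y = g = gcd(a, b).
def extended_gcd(a, b):
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

a, b = 18, 12                 # illustrative values
g, x, y = extended_gcd(a, b)
assert a * x + b * y == g     # Bezout: 18*1 + 12*(-1) = 6

# Any common divisor d of a and b divides both terms a*x and b*y,
# hence divides their sum g: every common divisor divides the gcd.
for d in range(1, min(a, b) + 1):
    if a % d == 0 and b % d == 0:
        assert g % d == 0
```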

6. I shall ask for the third time: can you give, in mathematical notation, the proposition that you want to prove? If you cannot, then I am going to stop bothering with your posts.

7. Originally Posted by JeffM
I shall ask for the third time: can you give in mathematical notation the proposition that you want to prove? if you cannot, then I am going to stop bothering with your posts.
I hope I have solved it.
-------------------------------

I want to prove that if there are any divisors smaller than the g.c.d., they are not linear combinations of the integers involved.

If c is a common divisor, then a = cx and b = cy for some x, y ∈ Z. Then c is a divisor of the linear combination cx + cy. Here lies the fallacy: how can I mathematically say that the g.c.d. is a linear combination, but c is not?

My logic goes as follows:

As stated earlier, a = cx and b = cy. So I need to find new multipliers, say e, f ∈ Z, to prove that c = e·cx + f·cy is a linear combination.
For c ≠ 0, the equation reduces to:
1 = ex + fy
So the options are:

(i) ex = 1, fy = 0 ⟹ e = x = ±1, and either f = 0 or y = 0
(ii) ex = 0, fy = 1 ⟹ f = y = ±1, and either e = 0 or x = 0

But x, y ≠ ±1, as these are the multipliers needed to relate c to a and b respectively.
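One caveat with this case split: over the integers, ex and fy need not each be 0 or 1. A quick check (Python sketch with illustrative small values) shows that 1 = ex + fy can hold with all four factors nonzero, and, more generally (Bezout), is solvable exactly when gcd(x, y) = 1.

```python
from math import gcd

# 1 = e*x + f*y does not force e*x and f*y into {0, 1}:
x, y = 2, 3
e, f = -1, 1
assert e * x + f * y == 1          # -2 + 3 = 1, yet neither term is 0 or 1

# More generally, 1 = e*x + f*y is solvable exactly when gcd(x, y) = 1.
# (Finite search window; large enough for the small values tested below.)
def solvable(x, y, bound=20):
    return any(e * x + f * y == 1
               for e in range(-bound, bound + 1)
               for f in range(-bound, bound + 1))

for x in range(1, 8):
    for y in range(1, 8):
        assert solvable(x, y) == (gcd(x, y) == 1)
```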

8. Originally Posted by jiten
I hope I have solved it.
-------------------------------

I want to prove that if there are any divisors smaller than the g.c.d., they are not linear combinations of the integers involved.

If c is a common divisor, then a = cx and b = cy for some x, y ∈ Z. Then c is a divisor of the linear combination cx + cy. Here lies the fallacy: how can I mathematically say that the g.c.d. is a linear combination, but c is not?

My logic goes as follows:

As stated earlier, a = cx and b = cy. So I need to find new multipliers, say e, f ∈ Z, to prove that c = e·cx + f·cy is a linear combination.
For c ≠ 0, the equation reduces to:
1 = ex + fy
So the options are:

(i) ex = 1, fy = 0 ⟹ e = x = ±1, and either f = 0 or y = 0
(ii) ex = 0, fy = 1 ⟹ f = y = ±1, and either e = 0 or x = 0

But x, y ≠ ±1, as these are the multipliers needed to relate c to a and b respectively.
This is like pulling teeth.

$\text{Given: } a, b, c, x, y \in \mathbb Z,\ c \ne 0,\ a = cx, \text{ and } b = cy.$

Are those the conditions? What then is to be deduced from those conditions? And how does the greatest common divisor of some unspecified numbers enter the picture? It is of course true that c is a divisor of a + b, but there is no reason to believe that c is the greatest common divisor. And do you really want to work in Z rather than in N? Anyway, we can make c be the greatest common divisor by adding the condition

$w \in \mathbb Z \text { and } w \ | \ a + b \implies w \le c$.

Is that a condition that needs to be added?

I still do not know what you are hoping to prove or disprove.

I wanted Z rather than N because, even though a and b are expressed as cx and cy, they can be negative, since a and b are in Z.

I just wanted to prove that any common divisor less than the g.c.d. is not a linear combination, even though it divides every linear combination of the two (or more) numbers of which it is a common divisor.

I showed by contradiction that it cannot be a linear combination.

If you feel that is a very hard route, I also found a very simple line of reasoning that proves the same thing: since the g.c.d. is the smallest positive linear combination, any smaller common divisor cannot be a linear combination.
That argument was called proof by contradiction, but I felt it was not illuminating enough, so I devised my own proof.
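The fact underlying that simple reasoning can be checked numerically. The Python sketch below (illustrative values a = 18, b = 12, with x and y restricted to a finite search window) confirms that the smallest positive value of ax + by is exactly gcd(a, b).

```python
from math import gcd

# The smallest positive value of a*x + b*y equals gcd(a, b);
# any common divisor below the gcd therefore cannot be written as a*x + b*y.
a, b = 18, 12          # illustrative values
positive = {a * x + b * y
            for x in range(-30, 31) for y in range(-30, 31)
            if a * x + b * y > 0}
assert min(positive) == gcd(a, b)   # smallest positive combination is 6
```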