Prove that if a²+b² is a multiple of ab+1, for positive integer a and b, then (a²+b²)/(ab+1) is a perfect square.
(In reply to Puzzle Solution by K Sengupta)
Following the methodology of the previous post, we take (a^2+b^2)/(ab+1) = p, so that each of a, b and p is a positive integer.
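For instance, a = 8 and b = 2 give (8^2 + 2^2)/(8*2 + 1) = 68/17 = 4 = 2^2, so p = 4 in that case.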
Now, fixing p, we consider positive integers C and D such that
C^2 + D^2 = p(CD + 1)......(#).
Since (a, b) itself satisfies (#), such solutions exist; among all of them, we choose (C, D) so that Min(C, D) is as small as possible.
Without loss of generality, let us assume that C >= D.
From (#), C^2 - pDC + (D^2 - p) = 0.
Regarding this as a quadratic in C, the other root E (say) satisfies C + E = pD and CE = D^2 - p, by Vieta's formulas.
Now, since C >= D and p > 0, CE = D^2 - p gives E = D^2/C - p/C < D.
If E were a positive integer, then (E, D) would also satisfy (#), with Min(E, D) = E < D = Min(C, D), contradicting the minimality of Min(C, D).
Consequently, E cannot be a positive integer.
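For concreteness: with p = 4, the pair (C, D) = (30, 8) satisfies (#), since 30^2 + 8^2 = 964 = 4(30*8 + 1); its other root is E = pD - C = 32 - 30 = 2, which yields the smaller solution (2, 8), since 2^2 + 8^2 = 68 = 4(2*8 + 1). No such jump to a smaller solution is possible once Min(C, D) is already minimal.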
Now,
(C+1)(E+1)
= C + E + CE + 1
= pD + (D^2 - p) + 1
= D^2 + (D-1)p + 1
> 0, since D >= 1 (so that D^2 >= 1 and (D-1)p >= 0).
Since the product (C+1)(E+1) is positive, the factors C+1 and E+1 must have the same sign.
As C is a positive integer, C+1 > 0, and therefore E+1 > 0, that is, E > -1.
Moreover, E = pD - C is an integer.
Since E is an integer exceeding -1 but not a positive integer, we must have E = 0.
Substituting E = 0 into CE = D^2 - p gives p = D^2.
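In the illustration above, the minimal pair for p = 4 (written with C >= D) is (C, D) = (8, 2): there E = pD - C = 8 - 8 = 0, and indeed p = D^2 = 4.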
Consequently, p = D^2 is a perfect square; that is, whenever (a^2+b^2)/(ab+1) is a positive integer, it is a perfect square.
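As a quick sanity check, the claim can also be verified numerically for small a and b; here is a minimal brute-force sketch in Python 3 (the function name and search bound are arbitrary choices for illustration):

from math import isqrt

def all_quotients_are_squares(limit=200):
    # For all 1 <= a, b <= limit, whenever (a^2 + b^2) is divisible
    # by (ab + 1), verify that the quotient is a perfect square.
    for a in range(1, limit + 1):
        for b in range(1, limit + 1):
            num, den = a * a + b * b, a * b + 1
            if num % den == 0:
                p = num // den
                if isqrt(p) ** 2 != p:
                    return (a, b, p)   # a counterexample, if one existed
    return None

print(all_quotients_are_squares())     # prints None: no counterexample found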