Prove that if a²+b² is a multiple of ab+1, for positive integers a and b, then (a²+b²)/(ab+1) is a perfect square.
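Before attempting a proof, the claim can be spot-checked numerically. The sketch below (Python; the helper name and the search bound of 50 are my own choices, and a finite search is of course not a proof) enumerates the pairs where ab+1 divides a²+b² and confirms the quotient is a perfect square each time:

```python
import math

def quotient_is_square(limit):
    """Collect (a, b, quotient) for 1 <= a, b <= limit whenever
    ab+1 divides a^2 + b^2, raising if a quotient is not a square."""
    cases = []
    for a in range(1, limit + 1):
        for b in range(1, limit + 1):
            num, den = a * a + b * b, a * b + 1
            if num % den == 0:
                q = num // den
                if math.isqrt(q) ** 2 != q:
                    raise AssertionError(f"counterexample: a={a}, b={b}, q={q}")
                cases.append((a, b, q))
    return cases

# e.g. a=2, b=8: (4 + 64) / 17 = 4 = 2^2
print(len(quotient_is_square(50)), "divisibility cases, all square quotients")
```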
Consider only a, b for which a²+b² is a multiple of ab+1. Suppose a and b are not relatively prime, and factor them as a=mc, b=md, where m>1. Then m²(c²+d²) is divisible by m²cd+1. But m² and m²cd+1 are relatively prime, so m²cd+1 must divide c²+d², and the quotient (a²+b²)/(ab+1) = m²(c²+d²)/(m²cd+1) is therefore a multiple of m² (in particular, it need not equal 1).
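This coprimality step can also be checked numerically. The sketch below (helper name and bound of 60 are mine) tests the direction the divisibility actually forces: since gcd(m², m²cd+1) = 1, m²cd+1 must divide c²+d², so the quotient should be divisible by m² whenever m = gcd(a,b) > 1:

```python
import math

def noncoprime_quotients(limit):
    """For 1 <= a, b <= limit with ab+1 | a^2+b^2 and m = gcd(a, b) > 1,
    return (a, b, m, quotient); the divisibility argument predicts
    m^2 | quotient in every case."""
    cases = []
    for a in range(1, limit + 1):
        for b in range(1, limit + 1):
            if (a * a + b * b) % (a * b + 1) == 0:
                m = math.gcd(a, b)
                if m > 1:
                    cases.append((a, b, m, (a * a + b * b) // (a * b + 1)))
    return cases

# e.g. a=2, b=8: m=2 and the quotient 4 is divisible by m^2 = 4
for a, b, m, q in noncoprime_quotients(60):
    assert q % (m * m) == 0, (a, b, m, q)
print("m^2 divides the quotient in every case checked")
```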
a²+b²-2(ab+1) is also divisible by ab+1, and it equals (a-b)²-2. It must also be greater than -(ab+1), since a²+b² > ab+1 (which holds whenever a>1 or b>1). A multiple of ab+1 lying strictly between -(ab+1) and ab+1 can only be 0, and (a-b)²-2 = 0 is impossible because 2 is not a perfect square. So (a-b)²-2 >= ab+1.
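Both claims in this step (divisibility of (a-b)²-2 by ab+1, and the inequality, for solutions other than a=b=1) can be spot-checked the same way; again the helper name and bound are mine and a finite check is not a proof:

```python
def difference_bound(limit):
    """For 1 <= a, b <= limit with (a, b) != (1, 1) and ab+1 | a^2+b^2,
    return ((a, b), (a-b)^2 - 2, ab+1), asserting both the divisibility
    and the inequality (a-b)^2 - 2 >= ab+1 along the way."""
    cases = []
    for a in range(1, limit + 1):
        for b in range(1, limit + 1):
            if (a, b) != (1, 1) and (a * a + b * b) % (a * b + 1) == 0:
                d = (a - b) ** 2 - 2
                assert d % (a * b + 1) == 0, (a, b)
                assert d >= a * b + 1, (a, b)
                cases.append(((a, b), d, a * b + 1))
    return cases

# e.g. a=2, b=8: (a-b)^2 - 2 = 34 = 2 * (ab+1)
print(len(difference_bound(60)), "solutions checked; both claims hold")
```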
Edited on March 25, 2007, 1:45 am
Posted by Gamer
on 2007-03-24 21:03:44