Prove that if a²+b² is a multiple of ab+1, for positive integers a and b, then (a²+b²)/(ab+1) is a perfect square.
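Before the argument, the statement can be sanity-checked by brute force. This is just a hedged numeric check of the claim, not part of the proof; the function name and the search limit are my own choices:

```python
import math

def perfect_square_quotients(limit):
    """Brute-force check: for every pair (a, b) with 1 <= a, b <= limit
    where ab + 1 divides a^2 + b^2, confirm the quotient is a perfect square."""
    results = []
    for a in range(1, limit + 1):
        for b in range(1, limit + 1):
            num, den = a * a + b * b, a * b + 1
            if num % den == 0:
                q = num // den
                r = math.isqrt(q)
                # If the theorem failed, this assertion would trip.
                assert r * r == q, f"counterexample: a={a}, b={b}, q={q}"
                results.append((a, b, q))
    return results

# Sample valid pairs: (1,1) gives 1, (2,8) gives 4, (3,27) gives 9, ...
print(perfect_square_quotients(50))
```

No counterexample turns up for small a and b, which is consistent with the claim (the quotient is always a², b², or some smaller square down a Vieta-style chain).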
Consider only a, b for which a²+b² is a multiple of ab+1. Assume a and b are not relatively prime, so factor them as a=mc, b=md, where m>1. Then a²+b² = m²(c²+d²) is divisible by ab+1 = m²cd+1; but m² and m²cd+1 are relatively prime for m>1, so (a²+b²)/(ab+1) must be a factor of m² (instead of needing to be 1).
a²+b²−2(ab+1) is also divisible by ab+1, and it equals (a−b)²−2. It must also be at least ab+1, since a²+b² > ab+1 (for a>1 or b>1), and (a−b)²−2 = 0 gives no results because (a−b)² = 2 has no integer solutions. So (a−b)²−2 >= ab+1.
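The inequality asserted above can also be checked numerically for the pairs that actually satisfy the divisibility condition. A minimal sketch, assuming the same setup (valid pairs with a>1 or b>1); the search bound is my own choice:

```python
# Hedged check: for every pair (a, b) with ab + 1 dividing a^2 + b^2
# and a > 1 or b > 1, verify the claimed bound (a - b)^2 - 2 >= ab + 1.
def check_inequality(limit):
    for a in range(1, limit + 1):
        for b in range(1, limit + 1):
            if (a * a + b * b) % (a * b + 1) == 0 and (a > 1 or b > 1):
                assert (a - b) ** 2 - 2 >= a * b + 1, (a, b)
    return True

print(check_inequality(200))
```

For example, (a, b) = (2, 8) gives (a−b)²−2 = 34 ≥ ab+1 = 17, and (3, 27) gives 574 ≥ 82.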
Edited on March 25, 2007, 1:45 am

Posted by Gamer
on 2007-03-24 21:03:44